Search Results

Search found 1627 results on 66 pages for 'scenarios'.

  • Error Using 32 vs. 64 bit SharePoint 2007 DLLs with PowerShell

    - by Brian Jackett
    Next time you fire up PowerShell to work with the SharePoint API, make sure you launch the proper bit version of PowerShell. Last week I had an interesting error that led to this blog post. Travel back in time a little bit with me to see where this 32 vs. 64 bit debate started.

    History

    Ever since the first pre-beta bits of Office 2010 landed in my lap I have been questioning whether it's better to run 32 or 64 bit applications on a 64 bit host operating system. In relation to Office 2010 I heard a number of arguments for 32 bit, including this link from the Office 2010 Engineering team. Given my typical usage scenarios, 32 bit seemed the way to go since I wasn't a "super RAM hungry" Excel user or the like.

    The Problem

    Since I had chosen 32 bit Office 2010, I tried to stick with 32 bit versions of the other programs that I run, assuming the same benefits and rules applied to other applications. This is where I was wrong. Last week I was attempting to use the 32 bit PowerShell ISE (Integrated Scripting Environment) on a 64 bit WSS 3.0 server. When trying to reference the 64 bit SharePoint DLLs I got errors about not being able to find the web application. I have run into these errors when I have hosts file issues or improper permissions to the farm / site collection, but these were not the case. After taking a quick spin around the interwebs I ran across the forum post comment below and another MSDN forum reply that explained the error. Turns out that sometimes it's not possible to run 32 bit applications against a 64 bit OS / farm / assembly / etc.

    "…the problem could also be because your SharePoint is 64-Bit but your app is running in 32-bit mode"

    I quickly exited the 32 bit PowerShell ISE and ran the same code under the 64 bit PowerShell ISE. All errors were gone and the script ran successfully.

    Conclusion

    The rules of 32 vs. 64 bit interoperability do not always apply evenly across all applications and scenarios. In my case I wasn't able to run 32 bit PowerShell against 64 bit SharePoint DLLs. I'm updating all of my links and shortcuts to use 64 bit PowerShell where appropriate. I'm quite surprised it has taken me this long to run into this error, but sometimes blind luck is all that keeps you from running into errors. Lesson learned, and hopefully this can benefit you as well. Happy SharePointing all!

    -Frog Out

    Links

    http://blogs.technet.com/b/office2010/archive/2010/02/23/understanding-64-bit-office.aspx
    http://social.msdn.microsoft.com/Forums/en-US/sharepointdevelopment/thread/a732cb83-c2ef-4133-b04e-86477b72bbe3/
    http://stackoverflow.com/questions/266255/filenotfoundexception-with-the-spsite-constructor-whats-the-problem
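    The root cause is that a 32 bit process cannot load the assemblies and native components of a 64 bit SharePoint farm, and the failure surfaces as a misleading "cannot find the web application" error rather than a complaint about bitness. As a hedged illustration (not from the original post), a console tool that references the SharePoint API could fail fast with a clearer message by checking its own bitness first; IntPtr.Size is 8 only in a 64 bit process:

      using System;

      class BitnessGuard
      {
          static void Main()
          {
              // IntPtr is 4 bytes in a 32-bit process and 8 bytes in a 64-bit process.
              bool is64BitProcess = IntPtr.Size == 8;

              if (!is64BitProcess)
              {
                  // A 32-bit process cannot use the 64-bit SharePoint assemblies, so fail fast
                  // with a clear message instead of a confusing "web application not found" error.
                  Console.Error.WriteLine(
                      "This tool must run as a 64-bit process to use the 64-bit SharePoint API.");
                  Environment.Exit(1);
              }

              // Safe to reference Microsoft.SharePoint types from here on.
              Console.WriteLine("Running as a 64-bit process.");
          }
      }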

    Read the article

  • Improve Performance of char.IsWhiteSpace for ASCII inputs in .NET 3.5

    - by Tanzim Saqib
    IsNullOrWhiteSpace is a new method introduced on the string class in .NET 4.0. While it is a very useful method for string-based processing, I attempted to implement it in .NET 3.5 using char.IsWhiteSpace(). I found a significant performance penalty using that method, which I later replaced with my own version. The following code takes about 20.6074219 seconds on my machine, whereas my implementation of char.IsWhiteSpace takes about a quarter less time: only 15.8271485 seconds. In many scenarios, e.g. string...(read more)
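    The excerpt is cut off before the author's code, so the following is only a minimal sketch of the general idea rather than his implementation: give ASCII characters a fast path of a few direct comparisons, fall back to char.IsWhiteSpace for everything else, and wrap it all in a .NET 3.5 stand-in for string.IsNullOrWhiteSpace. The class and method names are invented for illustration.

      using System;

      public static class StringUtil
      {
          // For ASCII input the common whitespace characters can be matched with a few
          // comparisons, avoiding the Unicode category lookup inside char.IsWhiteSpace.
          public static bool IsAsciiWhiteSpace(char c)
          {
              return c == ' ' || c == '\t' || c == '\r' || c == '\n' || c == '\v' || c == '\f';
          }

          // Approximation of .NET 4.0's string.IsNullOrWhiteSpace for .NET 3.5.
          public static bool IsNullOrWhiteSpace(string value)
          {
              if (value == null) return true;
              for (int i = 0; i < value.Length; i++)
              {
                  char c = value[i];
                  if (c <= 127)
                  {
                      // Fast path for ASCII characters.
                      if (!IsAsciiWhiteSpace(c)) return false;
                  }
                  else if (!char.IsWhiteSpace(c))
                  {
                      // Fall back to the framework check for non-ASCII whitespace.
                      return false;
                  }
              }
              return true;
          }
      }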

    Read the article

  • An XEvent a Day (28 of 31) – Tracking Page Compression Operations

    - by Jonathan Kehayias
    The Database Compression feature in SQL Server 2008 Enterprise Edition can provide some significant reductions in storage requirements for SQL Server databases and, in the right implementations and scenarios, performance improvements as well.  There isn't really a whole lot of information about the operations of database compression documented as being available in the DMVs or SQL Trace.  Paul Randal pointed out on Twitter today that sys.dm_db_index_operational_stats() provides...(read more)

    Read the article

  • The Sensemaking Spectrum for Business Analytics: Translating from Data to Business Through Analysis

    - by Joe Lamantia
    One of the most compelling outcomes of our strategic research efforts over the past several years is a growing vocabulary that articulates our cumulative understanding of the deep structure of the domains of discovery and business analytics. Modes are one example of the deep structure we've found. After looking at discovery activities across a very wide range of industries, question types, business needs, and problem solving approaches, we've identified distinct and recurring kinds of sensemaking activity, independent of context. We label these activities Modes: Explore, compare, and comprehend are three of the nine recognizable modes. Modes describe *how* people go about realizing insights. (Read more about the programmatic research and formal academic grounding and discussion of the modes here: https://www.researchgate.net/publication/235971352_A_Taxonomy_of_Enterprise_Search_and_Discovery)

    By analogy to languages, modes are the 'verbs' of discovery activity. When applied to the practical questions of product strategy and development, the modes of discovery allow one to identify what kinds of analytical activity a product, platform, or solution needs to support across a spread of usage scenarios, and then make concrete and well-informed decisions about every aspect of the solution, from high-level capabilities to which specific types of information visualizations better enable these scenarios for the types of data users will analyze.

    The modes are a powerful generative tool for product making, but if you've spent time with young children, or had a really bad hangover (or both at the same time...), you understand the difficulty of communicating using only verbs. So I'm happy to share that we've found traction on another facet of the deep structure of discovery and business analytics. Continuing the language analogy, we've identified some of the 'nouns' in the language of discovery: specifically, the consistently recurring aspects of a business that people are looking for insight into. We call these discovery Subjects, since they identify *what* people focus on during discovery efforts, rather than *how* they go about discovery as with the Modes.

    Defining the collection of Subjects people repeatedly focus on allows us to understand and articulate sensemaking needs and activity in a more specific, consistent, and complete fashion. In combination with the Modes, we can use Subjects to concretely identify and define scenarios that describe people's analytical needs and goals. For example, a scenario such as 'Explore [a Mode] the attrition rates [a Measure, one type of Subject] of our largest customers [Entities, another type of Subject]' clearly captures the nature of the activity — exploration of trends vs. deep analysis of underlying factors — and the central focus — attrition rates for customers above a certain set of size criteria — from which follow many of the specifics needed to address this scenario in terms of data, analytical tools, and methods.

    We can also use Subjects to translate effectively between the different perspectives that shape discovery efforts, reducing ambiguity and increasing impact on both sides of the perspective divide. For example, from the language of business, which often motivates analytical work by asking questions in business terms, to the perspective of analysis.
    The question posed to a Data Scientist or analyst may be something like "Why are sales of our new kinds of potato chips to our largest customers fluctuating unexpectedly this year?" or "Where can we innovate, by expanding our product portfolio to meet unmet needs?". Analysts translate questions and beliefs like these into one or more empirical discovery efforts that more formally and granularly indicate the plan, methods, tools, and desired outcomes of analysis. From the perspective of analysis this second question might become, "Which customer needs of type 'A', identified and measured in terms of 'B', that are not directly or indirectly addressed by any of our current products, offer 'X' potential for 'Y' positive return on the investment 'Z' required to launch a new offering, in time frame 'W'? And how do these compare to each other?". Translation also happens from the perspective of analysis to the perspective of data, in terms of availability, quality, completeness, format, volume, etc.

    By implication, we are proposing that most working organizations — small and large, for profit and non-profit, domestic and international, and in the majority of industries — can be described for analytical purposes using this collection of Subjects. This is a bold claim, but simplified articulation of complexity is one of the primary goals of sensemaking frameworks such as this one. (And, yes, this is in fact a framework for making sense of sensemaking as a category of activity - but we're not considering the recursive aspects of this exercise at the moment.)

    Compellingly, we can place the collection of Subjects on a single continuum — we call it the Sensemaking Spectrum — that simply and coherently illustrates some of the most important relationships between the different types of Subjects, and also illuminates several of the fundamental dynamics shaping business analytics as a domain. As a corollary, the Sensemaking Spectrum also suggests innovation opportunities for products and services related to business analytics.

    The first illustration below shows Subjects arrayed along the Sensemaking Spectrum; the second illustration presents examples of each kind of Subject. Subjects appear in colors ranging from blue to reddish-orange, reflecting their place along the Spectrum, which indicates whether a Subject addresses more the viewpoint of systems and data (Data centric and blue) or people (User centric and orange). This axis is shown explicitly above the Spectrum. Annotations suggest how Subjects align with the three significant perspectives of Data, Analysis, and Business that shape business analytics activity. This rendering makes explicit the translation and bridging function of Analysts as a role, and analysis as an activity.

    Subjects are best understood as fuzzy categories [http://georgelakoff.files.wordpress.com/2011/01/hedges-a-study-in-meaning-criteria-and-the-logic-of-fuzzy-concepts-journal-of-philosophical-logic-2-lakoff-19731.pdf], rather than tightly defined buckets. For each Subject, we suggest some of the most common examples: Entities may be physical things such as named products, or locations (a building, or a city); they could be Concepts, such as satisfaction; or they could be Relationships between entities, such as the variety of possible connections that define linkage in social networks.
    Likewise, Events may indicate a time and place in the dictionary sense; or they may be Transactions involving named entities; or take the form of Signals, such as 'some Measure had some value at some time' - what many enterprises understand as alerts.

    The central story of the Spectrum is that though consumers of analytical insights (represented here by the Business perspective) need to work in terms of Subjects that are directly meaningful to their perspective — such as Themes, Plans, and Goals — the working realities of data (condition, structure, availability, completeness, cost) and the changing nature of most discovery efforts make direct engagement with source data in this fashion impossible. Accordingly, business analytics as a domain is structured around the fundamental assumption that sensemaking depends on analytical transformation of data. Analytical activity incrementally synthesizes more complex and larger scope Subjects from data in its starting condition, accumulating insight (and value) by moving through a progression of stages in which increasingly meaningful Subjects are iteratively synthesized from the data, and recombined with other Subjects. The end goal of 'laddering' successive transformations is to enable sensemaking from the business perspective, rather than the analytical perspective.

    Synthesis through laddering is typically accomplished by specialized Analysts using dedicated tools and methods. Beginning with some motivating question such as seeking opportunities to increase the efficiency (a Theme) of fulfillment processes to reach some level of profitability by the end of the year (a Plan), Analysts will iteratively wrangle and transform source data Records, Values and Attributes into recognizable Entities, such as Products, that can be combined with Measures or other data into the Events (shipment of orders) that indicate the workings of the business.

    More complex Subjects (to the right of the Spectrum) are composed of or make reference to less complex Subjects: a business Process such as Fulfillment will include Activities such as confirming, packing, and then shipping orders. These Activities occur within or are conducted by organizational units such as teams of staff or partner firms (Networks), composed of Entities which are structured via Relationships, such as supplier and buyer. The fulfillment process will involve other types of Entities, such as the products or services the business provides. The success of the fulfillment process overall may be judged according to a sophisticated operating efficiency Model, which includes tiered Measures of business activity and health for the transactions and activities included. All of this may be interpreted through an understanding of the operational domain of the business's supply chain (a Domain).

    We'll discuss the Spectrum in more depth in succeeding posts.

    Read the article

  • In which cases will build artifacts be different in different environments?

    - by Sundeep
    We are working on automating deployment using Jenkins. We have different environments - DEV, UAT, PROD. In SVN, we are tagging each release and placing the same binaries in DEV, UAT, and PROD. The artifacts already contain config files for each environment, but I do not understand why we are storing the binaries in an environment folder again. Are there any scenarios where deployment might be different for different environments?

    Read the article

  • Apress Deal of the Day - 13/Feb/2010 - Pro Hyper-V

    - by TATWORTH
    Today's Apress $10 Deal of the Day at http://www.apress.com/info/dailydeal is Pro Hyper-V. In Pro Hyper-V, author Harley Stagner takes a comprehensive approach to acquiring, deploying, using, and troubleshooting Microsoft's answer to virtualization on the Windows Server platform. Learn from a true virtualization guru all you need to know about deploying virtual machines, managing your library of VMs in your enterprise, recovering gracefully from failure scenarios, and migrating existing physical machines to virtual hardware.

    Read the article

  • Data Structures usage and motivational aspects

    - by Aubergine
    Throughout my long student life I always wondered why there are so many data structures, yet there seems to be so little usage of most of them. That opinion didn't really change when I got a job. We have brilliant books on what they are and their complexities, but I never encounter resources that would actually give a good hint of practical usage. I perfectly understand that I have to look at the problem, analyse the required operations, and look for a data structure that does them efficiently. However, in practice I never do that, not because of human laziness syndrome, but because when it comes to work I acknowledge time priority over self-development. Over time I thought that when I became a better developer I would automatically use more of them - that didn't happen at all, or maybe I just didn't. Then I found that my colleagues are usually in the same boat as me - knowing more or less some three data structures, being totally happy about it, refusing to discuss the matter further with me, and coming back to conversations about 'cool new languages', 'libraries that do the job for you', and the joy of working under scrumban, etc. I am stuck with ArrayLists, Arrays and SortedMap, which, no matter what I do, always suffice, or else I tweak them to be capable of fulfilling my task. Yes, it might be inefficient, but do we really have to care if Intel increases performance over the years whether or not we improve our skills? Does a new Xeon or IBM machine really care what we use? What if I like to build things, but I am not particularly excited whether it is n log(n) or just n? Over twenty years processing power has increased enormously; does that give us the freedom not to be critical about which one to use? On top of that, newer, more optimized languages appear which support multiple cores more efficiently.

    To be more specific: I would like to find motivational material on complex real areas/cases of possible effective usage of data structures. I would be really grateful if you would provide relevant resources. There is a similar question, but in the end the links again mostly describe them or give dumb examples (vehicles, students or holy grail quest - yes, very relevant), and people keep referring to "the scenario decides the data structure to use". I want to know these complex scenarios so I can identify similarities to my own scenario and then use them - the complex scenarios where it really matters, and not necessarily of a quantitative nature. It seems that data structures' only concern is efficiency and nothing else? There seems to be no particular convenience for the developer in using one over another. (Only when I found scientific resources on why exactly simple carbohydrates are evil did I stop eating sugar and candies completely, replacing them with less harmful fruits - I hope you can see the analogy.)
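    The question explicitly asks for more than toy examples, but for readers who want a quick, concrete demonstration that the choice of structure can outweigh hardware gains, here is a small hypothetical C# sketch (names and numbers invented): repeated membership tests against a List scan every element on each call, while a HashSet answers each lookup in roughly constant time, so the gap grows with the data instead of shrinking with faster CPUs.

      using System;
      using System.Collections.Generic;
      using System.Diagnostics;
      using System.Linq;

      class LookupComparison
      {
          static void Main()
          {
              // 100,000 known ids and 10,000 lookups against them.
              List<int> ids = Enumerable.Range(0, 100000).ToList();
              HashSet<int> idSet = new HashSet<int>(ids);
              int[] queries = Enumerable.Range(50000, 10000).ToArray();

              Stopwatch sw = Stopwatch.StartNew();
              int hitsList = queries.Count(q => ids.Contains(q));   // O(n) scan per lookup
              sw.Stop();
              Console.WriteLine("List<int>:    {0} hits in {1} ms", hitsList, sw.ElapsedMilliseconds);

              sw = Stopwatch.StartNew();
              int hitsSet = queries.Count(q => idSet.Contains(q));  // O(1) average per lookup
              sw.Stop();
              Console.WriteLine("HashSet<int>: {0} hits in {1} ms", hitsSet, sw.ElapsedMilliseconds);
          }
      }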

    Read the article

  • The Many-to-Many Revolution 2.0 #ssas #mdx #dax #m2m

    - by Marco Russo (SQLBI)
    In September 2006 I announced in this blog the release of the first version of The Many-to-Many Revolution, a whitepaper that describes how to leverage the many-to-many dimension relationships feature that has been available since Analysis Services 2005. The paper contains many generic patterns that can be applied in many common data analysis scenarios. More than 5 years later, and after more than 20,000 unique people downloaded the 1.0 paper, I am proud to announce that we released The Many-to-Many...(read more)

    Read the article

  • If immutable objects are good, why do people keep creating mutable objects?

    - by Vinoth Kumar
    If immutable objects are good, simple, and offer benefits in concurrent programming, why do programmers keep creating mutable objects? I have four years of experience in Java programming and, as I see it, the first thing people do after creating a class is generate getters and setters in the IDE (thus making it mutable). Is there a lack of awareness, or can we get away with using mutable objects in most scenarios?
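    For readers who have only seen the auto-generated getter/setter pattern, a minimal sketch of the alternative may help. It is written in C# for consistency with the other examples on this page, but the same idea applies in Java with final fields and no setters; the type names are invented for illustration.

      // The default pattern: state can change at any time after construction.
      public class MutablePoint
      {
          public double X { get; set; }
          public double Y { get; set; }
      }

      // The immutable alternative: all state is fixed in the constructor.
      public sealed class ImmutablePoint
      {
          private readonly double x;
          private readonly double y;

          public ImmutablePoint(double x, double y)
          {
              this.x = x;
              this.y = y;
          }

          public double X { get { return x; } }
          public double Y { get { return y; } }

          // "Mutation" returns a new instance, so existing references (including ones
          // shared across threads) never observe a change.
          public ImmutablePoint WithX(double newX)
          {
              return new ImmutablePoint(newX, y);
          }
      }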

    Read the article

  • Introducing Windows Azure Mobile Services

    - by Clint Edmonson
    Today I'm excited to share that the Windows Azure Mobile Services public preview is now available. This preview provides a turnkey backend cloud solution designed to accelerate connected client app development. These services streamline the development process by enabling you to leverage the cloud for common mobile application scenarios such as structured storage, user authentication and push notifications. If you're building a Windows 8 app and want a fast and easy path to creating backend cloud services, this preview provides the capabilities you need. It lets you take advantage of the cloud to build and deploy modern apps for Windows 8 devices in anticipation of general availability on October 26th. Subsequent preview releases will extend support to iOS, Android, and Windows Phone.

    Features

    The preview makes it fast and easy to create cloud services for Windows 8 applications within minutes. Here are the key benefits:
    - Rapid development: configure a straightforward and secure backend in less than five minutes.
    - Create modern mobile apps: common Windows Azure plus Windows 8 scenarios that the Windows Azure Mobile Services preview will support include:
      - Automated Service API generation providing CRUD functionality and dynamic schematization on top of Structured Storage
      - Structured Storage with powerful query support so a Windows 8 app can seamlessly connect to a Windows Azure SQL database
      - Integrated Authentication so developers can configure user authentication via Windows Live
      - Push Notifications to bring your Windows 8 apps to life with up-to-date and relevant information
    - Access structured data: connect to a Windows Azure SQL database for simple data management and dynamically created tables. Easy to set and manage permissions.

    Pricing

    One of the key things that we've consistently heard from developers about using Windows Azure with mobile applications is the need for a low cost and simple offer. The simplest way to describe the pricing for Windows Azure Mobile Services at preview is that it is the same as Windows Azure Websites during preview.

    What's FREE?
    - Run up to 10 Mobile Services for free in a multitenant environment
    - Free with a valid Windows Azure Free Trial: 1GB SQL Database, unlimited ingress, 165MB/day egress

    What do I pay for?
    - Scaling up to dedicated VMs
    - Once the Windows Azure Free Trial expires: SQL Database and egress

    Getting Started

    To start using Mobile Services, you will need to sign up for a Windows Azure free trial, if you have not done so already. If you already have a Windows Azure account, you will need to request to enroll in this preview feature. Once you've enrolled, this getting started tutorial will walk you through building your first Windows 8 application using the preview's services. The developer center contains more resources to teach you how to:
    - Validate and authorize access to data using easy scripts that execute securely, on the server
    - Easily authenticate your users via Windows Live
    - Send toast notifications and update live tiles in just a few lines of code

    Our pricing calculator has also been updated to calculate costs for these new mobile services.

    Questions? Ask in the Windows Azure Forums. Feedback? Send it to [email protected].
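    For a feel of how little client code the preview requires, here is a rough sketch based on the preview-era getting-started tutorial: a data class plus a MobileServiceClient that inserts a row into a table generated by the service. The service URL, application key, and TodoItem type are placeholders, and the managed client API is shown as documented at the time of the preview, so member names may have changed since.

      // Illustrative only: preview-era Windows Azure Mobile Services managed client.
      using System.Threading.Tasks;
      using Microsoft.WindowsAzure.MobileServices;

      public class TodoItem
      {
          public int Id { get; set; }
          public string Text { get; set; }
          public bool Complete { get; set; }
      }

      public static class MobileServicesSketch
      {
          // Placeholder URL and key; a real app uses the values shown in the Windows Azure portal.
          private static readonly MobileServiceClient Client =
              new MobileServiceClient("https://your-service.azure-mobile.net/", "YOUR-APPLICATION-KEY");

          public static async Task InsertSampleItemAsync()
          {
              var table = Client.GetTable<TodoItem>();

              // The service exposes a generated CRUD endpoint for the table; with dynamic
              // schematization enabled, new properties become new columns in the SQL table.
              await table.InsertAsync(new TodoItem { Text = "Try Mobile Services", Complete = false });
          }
      }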

    Read the article

  • How do I tell the cases when it's worth using LINQ?

    - by Lijo
    Many things in LINQ can be accomplished without the library, but for some scenarios LINQ is most appropriate. Examples:
    - SELECT - http://stackoverflow.com/questions/11883262/wrapping-list-items-inside-div-in-a-repeater
    - SelectMany, Contains - http://stackoverflow.com/questions/11778979/better-code-pattern-for-checking-existence-of-value
    - Enumerable.Range - http://stackoverflow.com/questions/11780128/scalable-c-sharp-code-for-creating-array-from-config-file
    - WHERE - http://stackoverflow.com/questions/13171850/trim-string-if-a-string-ends-with-a-specific-word
    What factors should be taken into account when deciding between LINQ and regular .NET language elements?
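    As a hypothetical illustration of the trade-off (not taken from the linked questions), the same filter-and-project operation written both ways shows what LINQ buys you: the declarative version states the intent in one expression, while the loop spells out the mechanics.

      using System;
      using System.Collections.Generic;
      using System.Linq;

      class LinqVersusLoop
      {
          static void Main()
          {
              var words = new List<string> { "alpha", "beta", "gamma", "delta" };

              // Imperative version: explicit loop and temporary list.
              var shortWordsLoop = new List<string>();
              foreach (var w in words)
              {
                  if (w.Length <= 5)
                  {
                      shortWordsLoop.Add(w.ToUpper());
                  }
              }

              // LINQ version: filtering and projection read as one declarative statement.
              var shortWordsLinq = words
                  .Where(w => w.Length <= 5)
                  .Select(w => w.ToUpper())
                  .ToList();

              Console.WriteLine(string.Join(", ", shortWordsLinq.ToArray()));
          }
      }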

    Read the article

  • Sales Playbook for Oracle CRM and InQuira partners

    - by swalker
    Learn more about the positioning and features of Oracle CRM and InQuira. With the Oracle CRM & InQuira sales Playbook, you have everything you need to sell, identify and qualify opportunities, and apply specific sales scenarios. Also find out how to focus your resources and take advantage of the OPN Specialized springboards to broaden your offering.

    Read the article

  • Do you know about the Visual Studio 2010 Database Projects Guidance?

    - by Martin Hinshelwood
    Early on in the Team System (now Visual Studio ALM) cycle a new product surfaced within Team System that was affectionately called "Data Dude", but had the more formal name of "Visual Studio 2005 Team Edition for Database Professionals". The purpose of this product was to try and make the database a "first class citizen" in the development world. Those that started using Visual Studio 2005 Team Edition for Database Professionals (Data Dude) loved it, but everyone else did not get it. The capabilities were a little patchy, but the one thing it did bring to the party was the ability to put your database schema under source control. This was revolutionary, as previously your DBA sat as far away from the team as possible, and usually in a dark cupboard; now they could partake of all the goodness of Version Control, Work Item Tracking and automated builds. The problem was that the understanding required to manage these projects was very different to that needed previously. Then the Visual Studio ALM Rangers got a hold of it…and produced some of the best guidance available.

    Figure: Download the guidance from http://vsdatabaseguide.codeplex.com/

    This guidance discusses scenarios and approaches of using the Database Projects in Visual Studio 2010 to help you use the tools more effectively and maximize their value to your organization. This guidance is focused on these five areas:
    - Solution and Project Management
    - Source Code Control and Configuration Management
    - Integrating External Changes with the Project System
    - Build and Deployment Automation with Visual Studio Database Projects
    - Database Testing and Deployment Verification
    Each of these areas has common guidance, usage scenarios, hands on labs, and lessons learned from real world engagements and the community discussions.

    The guidance is broken down into three packages:
    - Guidance documentation
    - Hands-on-lab (HOL) documentation
    - HOL package
    (Note: the documentation is available in XPS-only format packages or complete XPS, PDF, DOCX format packages.)

    If you need assistance and no one else can help, then you may need to call the Visual Studio ALM Rangers. The Visual Studio ALM Rangers have the mission to provide out of band solutions for missing features or guidance. They are supported by the Microsoft Product Group, Microsoft Consulting Services, Microsoft Most Valued Professionals (MVPs) and technical specialists from technology communities around the globe, giving you a real-world view from the field, where the technology has been tested and used. For more information on the Rangers please visit http://msdn.microsoft.com/en-us/vstudio/ee358786.aspx and for a list of other Rangers projects please see http://msdn.microsoft.com/en-us/vstudio/ee358787.aspx.

    Read the article

  • Lowering the use of the memory controller in OpenCL-based applications

    - by user827992
    In my first experiments I noticed that OpenCL is a good technology, but it is often hampered by the x86 architecture: finding a mid-range VGA driven by a low-end chipset is not that unusual in real-world scenarios, and sometimes this can happen with a high-end VGA too. Are there some caching techniques, or something that can bypass this inconvenience in some way? The amount of dedicated memory on today's VGAs is usually high, so it might be possible to use this memory to create some kind of buffer of instructions.

    Read the article

  • What is a good one-stop-shop for understanding software licensing information?

    - by Macy Abbey
    I've learned a fair amount about the various different software licensing models and what those models mean for my own software project. However, I'd like to make sure I understand as many of them as possible for making decisions on how to license my own software and in what scenarios I can safely use software under a given licensing model. Do you have a good recommendation for a book/site etc. that has this information in one location?

    Read the article

  • Thinking in DAX: Counting Products in the Current Status with PowerPivot

    - by AlbertoFerrari
    One of my readers came to me with an interesting formula to compute in PowerPivot. Even if I don't normally post about very specific scenarios, I think this time it is interesting to write a blog post, since the formula can be easily created if you think about it in DAX, while it is very hard if you are still approaching it with an MDX or SQL mindset. Thinking in DAX is something that comes after a lot of formula authoring, something that all BI professionals should strive for, as Vertipaq in the new...(read more)

    Read the article

  • O'Reilly 50% off selected Training Kit Ebooks to July 5, 2012 at 23:59 PT

    - by TATWORTH
    At http://shop.oreilly.com/category/deals/msp-training-kit-owo.do?code=WKMSPTK, there is 50% off a selection of Microsoft Press Training Kit ebooks: "Make the most of your study time with Microsoft Press Training Kit ebooks. Work at your own pace through a series of lessons and reviews that fully cover exam objectives. Then, reinforce and apply your knowledge to real-world case scenarios and practice exercises to maximize your performance on the exams. For one week only, you can save 50% on these ebooks."

    Read the article

  • What if the Earth were Hollow? [Video]

    - by Asian Angel
    What would things be like if you dug a tunnel completely through the Earth for travel purposes or if our planet were hollow? Minute Physics takes a look at how things would be if either of these scenarios actually existed. What if the Earth were Hollow? [via Geeks are Sexy]

    Read the article

  • Transaction Boundaries and Rollbacks in Oracle SOA Suite

    - by Antonella Giovannetti
    A new eCourse/video is available in the Oracle Learning Library, "Transaction Boundaries and Rollbacks in Oracle SOA Suite". The course covers:
    - Definition of transaction, XA, rollback and transaction boundary
    - BPEL transaction boundaries from a fault propagation point of view
    - Parameters bpel.config.transaction and bpel.config.oneWayDeliveryPolicy for the configuration of both synchronous and asynchronous BPEL processes
    - Transaction behavior in Mediator
    - Rollback scenarios based on type of faults
    - Rollback using bpelx:rollback within a <throw> activity
    The video is accessible here.

    Read the article

  • Apress Deal of the Day - 29/Jun/2013 - Pro SQL Database for Windows Azure

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/06/29/apress-deal-of-the-day-29jun2013---pro-sql-database.aspx
    Today's $10 Deal of the Day from Apress at http://www.apress.com/9781430243953 is Pro SQL Database for Windows Azure: "Pro SQL Database for Windows Azure, 2nd Edition introduces you to Microsoft's cloud-based delivery of its enterprise-caliber, SQL Server database management system—showing you how to program and administer it in a variety of cloud computing scenarios."

    Read the article

  • PowerPivot, Stocks Exchange and the Moving Average

    - by AlbertoFerrari
    In this post I want to analyze a data model that performs some basic computations over stocks and uses one of the most commonly needed formulas there (and in many other business scenarios): a moving average over a specified period in time. Moving averages can be modeled very easily in PowerPivot, but some care is needed and, in this post, I try to provide a solid background on how to compute moving averages. In general, a moving average over period T1..T2 is the average of all the values...(read more)
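    The excerpt stops before the DAX measure itself, so as a language-neutral illustration of the computation being modeled (the average of all values whose date falls in a window ending at the current date), here is a small C# sketch; it is not the PowerPivot formula from the post, and the names and sample figures are invented.

      using System;
      using System.Collections.Generic;
      using System.Linq;

      static class MovingAverage
      {
          // Average of all values whose date falls in the window [current - (days - 1), current].
          public static double WindowAverage(
              IDictionary<DateTime, double> closesByDate, DateTime current, int days)
          {
              DateTime start = current.AddDays(-(days - 1));
              var window = closesByDate
                  .Where(kv => kv.Key >= start && kv.Key <= current)
                  .Select(kv => kv.Value)
                  .ToList();
              return window.Count == 0 ? 0.0 : window.Average();
          }

          static void Main()
          {
              var closes = new Dictionary<DateTime, double>
              {
                  { new DateTime(2012, 1, 2), 10.0 },
                  { new DateTime(2012, 1, 3), 11.0 },
                  { new DateTime(2012, 1, 4), 12.0 },
                  { new DateTime(2012, 1, 5), 13.0 },
              };
              // 3-day window ending on Jan 5: (11 + 12 + 13) / 3 = 12
              Console.WriteLine(WindowAverage(closes, new DateTime(2012, 1, 5), 3));
          }
      }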

    Read the article

  • Pillar Axiom 600 Playbook guide for partners

    - by swalker
    Learn more about the positioning and features of the Pillar Axiom 600. With the Pillar Axiom 600 Playbook, you have everything you need to sell, identify and qualify opportunities, apply specific sales scenarios, and stand out from the competition. Also find out how to focus your resources and take advantage of the OPN Specialized springboards to broaden your offering.

    Read the article

  • Windows Azure: Microsoft offers a 3-month free trial of its Cloud platform for .NET, Web and Java developers

    Windows Azure: Microsoft offers a 3-month free trial of its Cloud platform for developers. In October 2011, Microsoft identified with us five types of Azure usage scenarios: social networks (social gaming, marketing campaigns, or analysis tools such as Superviz.in from Tequila Rapido); connected devices and the services associated with them (push notifications, data storage in the cloud, synchronization of different devices via the cloud, etc.); e-commerce sites and high-traffic sites; domains that need significant computing power (risk calculation in finance, 3D modeling, rendering, scientific simulation, etc.); and mobile applications (smar...

    Read the article

  • Oops, I left my kernel zone configuration behind!

    - by mgerdts
    Most people use boot environments to move in one direction. A system starts with an initial installation, and from time to time new boot environments are created - typically as a result of pkg update - and then the new BE is booted. This post is of little interest to those people, as no hackery is needed. This post is about some mild hackery.

    During development, I commonly test different scenarios across multiple boot environments. Many times those tests aren't related to the act of configuring or installing zones, and so it's kinda handy to avoid the effort involved in zone configuration and installation. A somewhat common order of operations is like the following:

      # beadm create -e golden -a test1
      # reboot

    Once the system is running in the test1 BE, I install a kernel zone.

      # zonecfg -z a178 create -t SYSsolaris-kz
      # zoneadm -z a178 install

    Time passes, and I do all kinds of stuff to the test1 boot environment and want to test other scenarios in a clean boot environment. So then I create a new one from my golden BE and reboot into it.

      # beadm create -e golden -a test2
      # reboot

    Since the test2 BE was created from the golden BE, it doesn't have the configuration for the kernel zone that I configured and installed. Getting that zone over to the test2 BE is pretty easy. My test1 BE is really known as s11fixes-2.

      root@vzl-212:~# beadm mount s11fixes-2 /mnt
      root@vzl-212:~# zonecfg -R /mnt -z a178 export | zonecfg -z a178 -f -
      root@vzl-212:~# beadm unmount s11fixes-2
      root@vzl-212:~# zoneadm -z a178 attach
      root@vzl-212:~# zoneadm -z a178 boot

    On the face of it, it would seem as though it would have been easier to just use zonecfg -z a178 create -t SYSsolaris-kz within the test2 BE to get the new configuration over. That would almost work, but it would have left behind the encryption key required for access to host data and any suspend image. See solaris-kz(5) for more info on host data. I very commonly have more complex configurations that contain many storage URIs and non-default resource controls. Retyping them would be rather tedious.

    Read the article
