Search Results

Search found 4140 results on 166 pages for 'alias analysis'.


  • The Sensemaking Spectrum for Business Analytics: Translating from Data to Business Through Analysis

    - by Joe Lamantia
    One of the most compelling outcomes of our strategic research efforts over the past several years is a growing vocabulary that articulates our cumulative understanding of the deep structure of the domains of discovery and business analytics. Modes are one example of the deep structure we've found. After looking at discovery activities across a very wide range of industries, question types, business needs, and problem-solving approaches, we've identified distinct and recurring kinds of sensemaking activity, independent of context. We label these activities Modes: Explore, Compare, and Comprehend are three of the nine recognizable modes. Modes describe *how* people go about realizing insights. (Read more about the programmatic research and formal academic grounding and discussion of the modes here: https://www.researchgate.net/publication/235971352_A_Taxonomy_of_Enterprise_Search_and_Discovery)

    By analogy to languages, modes are the 'verbs' of discovery activity. When applied to the practical questions of product strategy and development, the modes of discovery allow one to identify what kinds of analytical activity a product, platform, or solution needs to support across a spread of usage scenarios, and then make concrete and well-informed decisions about every aspect of the solution, from high-level capabilities to which specific types of information visualizations best enable these scenarios for the types of data users will analyze.

    The modes are a powerful generative tool for product making, but if you've spent time with young children, or had a really bad hangover (or both at the same time...), you understand the difficulty of communicating using only verbs. So I'm happy to share that we've found traction on another facet of the deep structure of discovery and business analytics. Continuing the language analogy, we've identified some of the 'nouns' in the language of discovery: specifically, the consistently recurring aspects of a business that people are looking for insight into. We call these discovery Subjects, since they identify *what* people focus on during discovery efforts, rather than *how* they go about discovery, as with the Modes.

    Defining the collection of Subjects people repeatedly focus on allows us to understand and articulate sensemaking needs and activity in a more specific, consistent, and complete fashion. In combination with the Modes, we can use Subjects to concretely identify and define scenarios that describe people's analytical needs and goals. For example, a scenario such as 'Explore [a Mode] the attrition rates [a Measure, one type of Subject] of our largest customers [Entities, another type of Subject]' clearly captures the nature of the activity — exploration of trends vs. deep analysis of underlying factors — and the central focus — attrition rates for customers above a certain set of size criteria — from which follow many of the specifics needed to address this scenario in terms of data, analytical tools, and methods.

    We can also use Subjects to translate effectively between the different perspectives that shape discovery efforts, reducing ambiguity and increasing impact on both sides of the perspective divide. For example, from the language of business, which often motivates analytical work by asking questions in business terms, to the perspective of analysis. The question posed to a Data Scientist or analyst may be something like "Why are sales of our new kinds of potato chips to our largest customers fluctuating unexpectedly this year?" or "Where can we innovate, by expanding our product portfolio to meet unmet needs?". Analysts translate questions and beliefs like these into one or more empirical discovery efforts that more formally and granularly indicate the plan, methods, tools, and desired outcomes of analysis. From the perspective of analysis, this second question might become, "Which customer needs of type 'A', identified and measured in terms of 'B', that are not directly or indirectly addressed by any of our current products, offer 'X' potential for 'Y' positive return on the investment 'Z' required to launch a new offering, in time frame 'W'? And how do these compare to each other?". Translation also happens from the perspective of analysis to the perspective of data: in terms of availability, quality, completeness, format, volume, etc.

    By implication, we are proposing that most working organizations — small and large, for-profit and non-profit, domestic and international, and in the majority of industries — can be described for analytical purposes using this collection of Subjects. This is a bold claim, but simplified articulation of complexity is one of the primary goals of sensemaking frameworks such as this one. (And, yes, this is in fact a framework for making sense of sensemaking as a category of activity, but we're not considering the recursive aspects of this exercise at the moment.)

    Compellingly, we can place the collection of Subjects on a single continuum — we call it the Sensemaking Spectrum — that simply and coherently illustrates some of the most important relationships between the different types of Subjects, and also illuminates several of the fundamental dynamics shaping business analytics as a domain. As a corollary, the Sensemaking Spectrum also suggests innovation opportunities for products and services related to business analytics.

    The first illustration below shows Subjects arrayed along the Sensemaking Spectrum; the second illustration presents examples of each kind of Subject. Subjects appear in colors ranging from blue to reddish-orange, reflecting their place along the Spectrum, which indicates whether a Subject addresses more the viewpoint of systems and data (data-centric, and blue) or people (user-centric, and orange). This axis is shown explicitly above the Spectrum. Annotations suggest how Subjects align with the three significant perspectives of Data, Analysis, and Business that shape business analytics activity. This rendering makes explicit the translation and bridging function of Analysts as a role, and analysis as an activity.

    Subjects are best understood as fuzzy categories [http://georgelakoff.files.wordpress.com/2011/01/hedges-a-study-in-meaning-criteria-and-the-logic-of-fuzzy-concepts-journal-of-philosophical-logic-2-lakoff-19731.pdf], rather than tightly defined buckets. For each Subject, we suggest some of the most common examples: Entities may be physical things such as named products, or locations (a building, or a city); they could be Concepts, such as satisfaction; or they could be Relationships between entities, such as the variety of possible connections that define linkage in social networks. Likewise, Events may indicate a time and place in the dictionary sense; or they may be Transactions involving named entities; or take the form of Signals, such as 'some Measure had some value at some time', which many enterprises understand as alerts.

    The central story of the Spectrum is that though consumers of analytical insights (represented here by the Business perspective) need to work in terms of Subjects that are directly meaningful to their perspective — such as Themes, Plans, and Goals — the working realities of data (condition, structure, availability, completeness, cost) and the changing nature of most discovery efforts make direct engagement with source data in this fashion impossible. Accordingly, business analytics as a domain is structured around the fundamental assumption that sensemaking depends on analytical transformation of data. Analytical activity incrementally synthesizes more complex and larger-scope Subjects from data in its starting condition, accumulating insight (and value) by moving through a progression of stages in which increasingly meaningful Subjects are iteratively synthesized from the data and recombined with other Subjects. The end goal of 'laddering' successive transformations is to enable sensemaking from the business perspective, rather than the analytical perspective.

    Synthesis through laddering is typically accomplished by specialized Analysts using dedicated tools and methods. Beginning with some motivating question, such as seeking opportunities to increase the efficiency (a Theme) of fulfillment processes to reach some level of profitability by the end of the year (a Plan), Analysts will iteratively wrangle and transform source data Records, Values, and Attributes into recognizable Entities, such as Products, that can be combined with Measures or other data into the Events (shipment of orders) that indicate the workings of the business.

    More complex Subjects (to the right of the Spectrum) are composed of, or make reference to, less complex Subjects: a business Process such as Fulfillment will include Activities such as confirming, packing, and then shipping orders. These Activities occur within, or are conducted by, organizational units such as teams of staff or partner firms (Networks), composed of Entities which are structured via Relationships, such as supplier and buyer. The fulfillment process will involve other types of Entities, such as the products or services the business provides. The success of the fulfillment process overall may be judged according to a sophisticated operating-efficiency Model, which includes tiered Measures of business activity and health for the transactions and activities included. All of this may be interpreted through an understanding of the operational domain of the business's supply chain (a Domain).

    We'll discuss the Spectrum in more depth in succeeding posts.

    Read the article

  • How should I go about implementing a points-to analysis in Maude?

    - by reprogrammer
    I'm going to implement a points-to analysis algorithm, based mainly on the algorithm by Whaley and Lam. Whaley and Lam use a BDD-based implementation of Datalog to represent and compute the points-to analysis relations. The following lists some of the relations that are used in a typical points-to analysis. Note that D(w, z) :- A(w, x), B(x, y), C(y, z) means D(w, z) is true if A(w, x), B(x, y), and C(y, z) are all true. BDDs are the data structure used to represent these relations.

      Relations:
        input  vP0    (variable : V, heap : H)
        input  store  (base : V, field : F, source : V)
        input  load   (base : V, field : F, dest : V)
        input  assign (dest : V, source : V)
        output vP     (variable : V, heap : H)
        output hP     (base : H, field : F, target : H)

      Rules:
        vP(v, h)      :- vP0(v, h)
        vP(v1, h)     :- assign(v1, v2), vP(v2, h)
        hP(h1, f, h2) :- store(v1, f, v2), vP(v1, h1), vP(v2, h2)
        vP(v2, h2)    :- load(v1, f, v2), vP(v1, h1), hP(h1, f, h2)

    I need to understand whether Maude is a good environment for implementing points-to analysis. I noticed that Maude uses a BDD library called BuDDy, but it looks like Maude uses BDDs for a different purpose, namely unification. So I thought I might be able to use Maude instead of a Datalog engine to compute the relations of my points-to analysis. I assume Maude propagates independent information concurrently, and this concurrency could potentially make my points-to analysis faster than sequential processing of rules. But I don't know the best way to represent my relations in Maude. Should I implement BDDs in Maude myself, or does Maude's internal BDD-based unification have the same effect?
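
    The rule set above is small enough to prototype without a Datalog engine at all. Below is a minimal Python sketch of naive bottom-up (fixed-point) evaluation of the four rules, with invented input facts, just to make the rule semantics concrete; a real implementation would use semi-naive evaluation or BDDs as Whaley and Lam do:

      # Naive fixed-point evaluation of the Whaley/Lam points-to rules.
      # Input facts are invented for illustration:
      #   p = new h1; q = new h2; r = p; p.f = q; s = r.f
      vP0    = {("p", "h1"), ("q", "h2")}   # vP0(variable, heap)
      assign = {("r", "p")}                 # assign(dest, source)
      store  = {("p", "f", "q")}            # store(base, field, source)
      load   = {("r", "f", "s")}            # load(base, field, dest)

      vP = set(vP0)   # vP(v, h):      v may point to h   (rule 1)
      hP = set()      # hP(h1, f, h2): field f of h1 may point to h2

      changed = True
      while changed:
          old = (len(vP), len(hP))
          # vP(v1, h) :- assign(v1, v2), vP(v2, h)
          vP |= {(v1, h) for (v1, v2) in assign
                         for (w, h) in vP if w == v2}
          # hP(h1, f, h2) :- store(v1, f, v2), vP(v1, h1), vP(v2, h2)
          hP |= {(h1, f, h2) for (v1, f, v2) in store
                             for (w1, h1) in vP if w1 == v1
                             for (w2, h2) in vP if w2 == v2}
          # vP(v2, h2) :- load(v1, f, v2), vP(v1, h1), hP(h1, f, h2)
          vP |= {(v2, h2) for (v1, f, v2) in load
                          for (w, h1) in vP if w == v1
                          for (g1, g, h2) in hP if g1 == h1 and g == f}
          changed = (len(vP), len(hP)) != old

      print(sorted(vP))  # expect ('s', 'h2'): s reaches h2 via r.f
      print(sorted(hP))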

    Read the article

  • SubSonic Alias/Where Clause

    - by JohnBob
    Hey, I want to convert the following SQL query to a SubSonic query.

      SELECT [dbo].[tbl_Agency].[ParentCompanyID]
      FROM [dbo].[tbl_Agency]
      WHERE REPLACE(PhoneNumber, ' ', '') LIKE REPLACE('%9481 1111%', ' ', '')

    I thought I would do it like below, but I just can't get it to produce valid SQL.

      //SubSonic
      string agencyPhoneNumber = "9481 1111";
      SubSonic.SqlQuery subQueryagencyPhoneNumber = new SubSonic.Select(Agency.ParentCompanyIDColumn.ColumnName);
      subQueryagencyPhoneNumber.From(Agency.Schema.TableName);
      //WHERE
      subQueryagencyPhoneNumber.Where("REPLACE(" + Agency.PhoneNumberColumn.ColumnName + ", ' ', '')").Like("%" + agencyPhoneNumber + "%");

    Does anyone out there know how to fix this - I'm using SubSonic 2.2. I feel like I'm taking crazy pills here - this should be straightforward, right? Cheers, JohnBob
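
    For what it's worth, the space-stripped LIKE itself is easy to sanity-check directly against the database before fighting the ORM over it. A minimal Python sketch, assuming the pyodbc package, an ODBC SQL Server driver, and a reachable server (the connection string is hypothetical):

      import pyodbc  # assumption: pyodbc is installed and an ODBC SQL Server driver exists

      conn = pyodbc.connect(  # hypothetical connection string
          "DRIVER={SQL Server};SERVER=localhost;DATABASE=MyDb;Trusted_Connection=yes"
      )
      phone = "9481 1111"
      # Strip spaces from both the stored number and the search pattern, then LIKE-match.
      sql = ("SELECT ParentCompanyID FROM dbo.tbl_Agency "
             "WHERE REPLACE(PhoneNumber, ' ', '') LIKE REPLACE(?, ' ', '')")
      for row in conn.cursor().execute(sql, "%" + phone + "%"):
          print(row.ParentCompanyID)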

    Read the article

  • sql query with alias name

    - by Ranjana
    I have a table with these columns:

      orgid | ispaid | validity   | noofthingstoTake
      ------+--------+------------+-----------------
      1     | yes    | 2010-06-05 | 20
      2     | yes    | 2010-06-09 | 7

    I have used this query (to join two more tables):

      SELECT DISTINCT B.RequirementID, A.OrganizationID
      FROM Organization A, RequirementsDetailsforOrganization B, validityorgdet F
      WHERE A.OrganizationID = B.OrganizationID
        AND F.orgid = A.OrganizationID
        AND F.ispaid = 1
        AND F.validity >= GETDATE()
        AND F.noofthingstoTake > ??

    but I don't know how to check the number of things taken here: it should not exceed 20. I'm passing this query from my code-behind page to SQL. How do I write the query so that it checks that the number of things taken does not exceed noofthingstoTake? Please help me out.
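
    One common shape for this kind of cap check is to compare a running counter against the limit in the same WHERE clause. A sketch follows, with the counter column noofthingstaken as a loudly hypothetical assumption, since the question never says where the taken count is actually stored:

      # Sketch only: "noofthingstaken" is a hypothetical counter column; the
      # question does not name the column that tracks how many were taken.
      import pyodbc  # assumption: pyodbc and an ODBC driver for SQL Server

      conn = pyodbc.connect("DSN=mydb")  # hypothetical DSN
      sql = """
      SELECT DISTINCT B.RequirementID, A.OrganizationID
      FROM Organization A
      JOIN RequirementsDetailsforOrganization B ON B.OrganizationID = A.OrganizationID
      JOIN validityorgdet F ON F.orgid = A.OrganizationID
      WHERE F.ispaid = 1
        AND F.validity >= GETDATE()
        AND F.noofthingstaken < F.noofthingstoTake  -- cap check: stop at the limit
      """
      for row in conn.cursor().execute(sql):
          print(row.RequirementID, row.OrganizationID)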

    Read the article

  • Which tool can list writing access to a specific variable in C?

    - by Lichtblitz
    Unfortunately I'm not even sure what this sort of static analysis is called. It's not really control-flow analysis, because I'm not looking for function calls, and I don't really need data-flow analysis, because I don't care about the actual values. I just need a tool that lists the locations (file, function) where writing access to a specific variable takes place. I don't even care if that list contains lines that are unreachable. I could imagine that writing a simple parser could suffice for this task, but I'm certain that there must be a tool out there that does this simple analysis. As a poor student I would appreciate free, or better yet open-source, tools, and if someone could tell me what this type of static analysis is actually called, I would be equally grateful! EDIT: I forgot to mention there's no pointer arithmetic in the code base.
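
    This is roughly a def-use (write-site) query, and absent a dedicated tool, a name-based AST scan gets surprisingly far when, as here, there is no pointer arithmetic. A minimal sketch using the pycparser package; it assumes the source can be run through the C preprocessor, and it matches by name only, so a shadowed variable in an inner scope is reported as well:

      import sys
      from pycparser import parse_file, c_ast

      class WriteFinder(c_ast.NodeVisitor):
          """Report assignments and ++/-- updates to a variable, matched by name."""
          def __init__(self, name):
              self.name = name

          def report(self, coord):
              print(f"{coord.file}:{coord.line}: write to '{self.name}'")

          def visit_Assignment(self, node):   # x = ..., x += ..., etc.
              if isinstance(node.lvalue, c_ast.ID) and node.lvalue.name == self.name:
                  self.report(node.coord)
              self.generic_visit(node)

          def visit_UnaryOp(self, node):      # ++x, x++, --x, x--
              if node.op in ("++", "--", "p++", "p--") and \
                 isinstance(node.expr, c_ast.ID) and node.expr.name == self.name:
                  self.report(node.coord)
              self.generic_visit(node)

      # Usage: python find_writes.py some_file.c counter
      ast = parse_file(sys.argv[1], use_cpp=True)  # needs cpp on the PATH
      WriteFinder(sys.argv[2]).visit(ast)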

    Read the article

  • Static code analysis for VB6 and classic ASP

    - by Ryan
    I'm looking for a static code analysis tool that will determine whether I have orphaned functions in my VB6 code. The problem I'm running into is that we make calls to the VB6 code from classic ASP. Is there a tool that will look at both the classic ASP and the VB6 and determine whether there are any orphaned functions?
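
    Absent a tool that understands both languages, a rough cross-language scan is scriptable. A minimal Python sketch of that heuristic: it collects Function/Sub definitions from the VB6 sources and flags any whose name never appears outside its own definition anywhere in the .bas/.cls/.frm/.asp tree. Being purely name-based, it will miss late-bound calls, CallByName, and framework-invoked handlers such as Form_Load, so treat its output as candidates only:

      import re, pathlib
      from collections import defaultdict

      DEF_RE = re.compile(
          r"^\s*(?:Public\s+|Private\s+|Friend\s+)?(?:Function|Sub)\s+(\w+)",
          re.IGNORECASE)

      defs = {}                    # function name -> where it is defined
      uses = defaultdict(int)      # word -> total occurrences across all sources
      files = [p for ext in ("*.bas", "*.cls", "*.frm", "*.asp")
                 for p in pathlib.Path(".").rglob(ext)]

      for path in files:
          text = path.read_text(errors="ignore")
          for lineno, line in enumerate(text.splitlines(), 1):
              m = DEF_RE.match(line)
              if m:
                  defs.setdefault(m.group(1).lower(), f"{path}:{lineno}")
          for word in re.findall(r"\w+", text):
              uses[word.lower()] += 1

      for name, where in sorted(defs.items()):
          if uses[name] <= 1:      # only the definition line mentions the name
              print(f"possibly orphaned: {name} ({where})")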

    Read the article

  • Static code analysis tools for VB6

    - by Maksym Markov
    Right now we are maintaining an old project written in VB6, and we are planning to implement a continuous integration server for it. We would like to implement some code analysis as well, to track that maintainability is at least not getting worse. Basically there is only one requirement: the tool should be command-line, so we can call it from the continuous integration server, and it should work with VB6 projects. I will really appreciate any recommendations regarding tools to try. Thank you, Maksym

    Read the article

  • Code Analysis - Treat as Error

    - by Brian Schmitt
    Looking to enable the "Enable Code Analysis on Build" feature in Visual Studio. Obviously the rules are a best practice, and I am working with an existing code base that currently fails many of them. I am looking for input as to which rules are the most egregious and should be treated as an error.

    Read the article

  • Javascript source code analysis ( specifically duplication checking )

    - by David
    Partial duplicate of this. Notes: I already use JSLint extensively via a tool I wrote that scans my current project directory at intervals for recently updated or created .js files. It has drastically improved productivity for me, and I doubt there is anything as good as JSLint for the price (it's free). That said, is there any analysis tool out there that can find repetitive or near-duplicate code blocks, the goal being to make it easier to find opportunities to consolidate large files or small-to-medium-sized projects?
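
    In the meantime, exact duplication (after stripping comments and whitespace) is detectable with a simple window-hashing script; true near-duplicate detection needs token-based clone detection, but this cruder Python sketch is often enough to spot consolidation targets:

      import hashlib, pathlib, re
      from collections import defaultdict

      WINDOW = 6  # flag any 6-line run that appears more than once

      def normalized_lines(text):
          out = []
          for n, line in enumerate(text.splitlines(), 1):
              # Crude normalization: drop // comments (this also clips '//'
              # inside string literals, e.g. URLs) and surrounding whitespace.
              line = re.sub(r"//.*", "", line).strip()
              if line:
                  out.append((n, line))
          return out

      index = defaultdict(list)
      for path in pathlib.Path(".").rglob("*.js"):
          lines = normalized_lines(path.read_text(errors="ignore"))
          for i in range(len(lines) - WINDOW + 1):
              chunk = "\n".join(l for _, l in lines[i:i + WINDOW])
              digest = hashlib.sha1(chunk.encode()).hexdigest()
              index[digest].append(f"{path}:{lines[i][0]}")

      for digest, sites in index.items():
          if len(sites) > 1:
              print("duplicated block at:", ", ".join(sites))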

    Read the article

  • Glassfish log files analysis

    - by Cem
    Can I get some recommendations for good log analysis software for Glassfish log files? Since log formats do not vary dramatically from one application server to another, I would guess that there is a common solution for all servers. Thanks
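
    If nothing off-the-shelf fits, Glassfish's default record layout is regular enough to script against. A small Python sketch, assuming the standard uniform log format in server.log (records of the form [#|timestamp|level|...|message|#], possibly spanning multiple lines); field positions follow that default layout:

      from collections import Counter

      # Tally records by severity level in a Glassfish server.log.
      levels = Counter()
      with open("server.log", errors="ignore") as f:
          record = []
          for line in f:
              record.append(line)
              if line.rstrip().endswith("|#]"):   # end of a (possibly multi-line) record
                  fields = "".join(record).lstrip("[#|").split("|")
                  if len(fields) >= 2:
                      levels[fields[1]] += 1       # field 1 is the level (INFO, WARNING, ...)
                  record = []

      print(levels.most_common())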

    Read the article

  • .Net Analysis tools [closed]

    - by TWith2Sugars
    Possible Duplicate: What static analysis tools are available for C#? At work we tend to use two tools for analysing our projects: FxCop to analyse our managed code, and StyleCop to keep a consistent code layout. I found these tools pretty much by accident, and it has led me to wonder what other tools are available that I might have missed?

    Read the article

  • Oracle Solaris Crash Analysis Tool 5.3 now available

    - by user12609056
    Oracle Solaris Crash Analysis Tool 5.3

    The Oracle Solaris Crash Analysis Tool Team is happy to announce the availability of release 5.3. This release addresses bugs discovered since the release of 5.2, plus enhancements to support Oracle Solaris 11 and updates to Oracle Solaris versions 7 through 10. The packages are available on My Oracle Support: simply search for Patch 13365310 to find the downloadable packages.

    Release Notes

    General: blast support. The blast GUI has been removed and is no longer supported.

    Oracle Solaris 2.6 support. As of Oracle Solaris Crash Analysis Tool 5.3, support for Oracle Solaris 2.6 has been dropped. If you have systems running Solaris 2.6, you will need to use Oracle Solaris Crash Analysis Tool 5.2 or earlier to read their crash dumps.

    New Commands

    sanity command. Though one can re-run the sanity checks that are run at tool start-up using the coreinfo command, many users were unaware that they could. Though these checks can still be run using that command, a new command, namely sanity, can now be used to re-run the checks at any time.

    Interface Changes

    scat_explore -r and -t options. The -r option has been added to scat_explore so that a base directory can be specified, and the -t option was added to enable color tagging of the output. The scat_explore sub-command now accepts new options. Usage is:

      scat --scat_explore [-atv] [-r base_dir] [-d dest] [unix.N] [vmcore.N]

    Where:

      -v          Verbose mode: the command will print messages highlighting what it's doing.
      -a          Auto mode: the command does not prompt for input from the user as it runs.
      -d dest     Instructs scat_explore to save its output in the directory dest instead of the present working directory.
      -r base_dir Instructs scat_explore to save its output under the directory base_dir instead of the present working directory. If it is not specified using the -d option, scat_explore names its output file "scat_explore_system_name_hostid_lbolt_value_corefile_name".
      -t          Enable color tags. When enabled, scat_explore tags important text with colors that match the level of importance. These colors correspond to the colors normally printed when running Oracle Solaris Crash Analysis Tool in interactive mode.
      N           The number of the crash dump. Specifying unix.N vmcore.N is optional and not required.

    Tags:

      FATAL    An extremely important message which should be investigated.
      WARNING  A warning that may or may not have anything to do with the crash.
      ERROR    An error, usually printed with a suggested command.
      ALERT    Used to indicate something the tool discovered.
      INFO     A purely informational message.
      INFO2    A follow-up to an INFO-tagged message.
      REDZONE  Usually used when printing memory info, showing that something is in the kernel's REDZONE.

    Example:

      $ scat --scat_explore -a -v -r /tmp vmcore.0
      #Output directory: /tmp/scat_explore_oomph_833a2959_0x28800_vmcore.0
      #Tar filename: scat_explore_oomph_833a2959_0x28800_vmcore.0.tar
      #Extracting crash data...
      #Gathering standard crash data collections...
      #Panic string indicates a possible hang...
      #Gathering Hang Related data...
      #Creating tar file...
      #Compressing tar file...
      #Successful extraction
      SCAT_EXPLORE_DATA_DIR=/tmp/scat_explore_oomph_833a2959_0x28800_vmcore.0

    Sending scat_explore results. The .tar.gz file that results from a scat_explore run may be sent using Oracle Secure File Transfer. The Oracle Secure File Transfer User Guide describes how to use it to send a file. The send_scat_explore script now has a -t option for specifying a "to" address for sending the results. This option is mandatory.

    Known Issues

    There are a couple of known issues that we are addressing in release 5.4, which you should expect to see soon:

      - Display of timestamps in threads and clock information is incorrect in some cases.
      - There are alignment issues with some of the tables produced by the tool.

    Read the article

  • Implementing User-Defined Hierarchies in SQL Server Analysis Services

    To be able to drill into multidimensional cube data at several levels, you must implement all of the hierarchies on the database dimensions. Then you'll create the attribute relationships necessary to optimize performance. Analysis Services hierarchies offer plenty of possibilities for displaying the data that your business requires. Rob Sheldon continues his series on SQL Server Analysis Services 2008.

    Read the article

  • How to Do Competition Analysis

    One of the most important aspects of SEO is the work you put in before you even touch the website or build a single back link. This analysis work involves keyword research and competition analysis. Choose the wrong keywords and you could be wasting all your efforts in the onsite and offsite optimization. Choose keywords which have too much competition and you'll be taking on an uphill battle.

    Read the article

  • A new Excel 2010 book for Data Analysis

    - by Marco Russo (SQLBI)
    Microsoft Press just announced the printing of Microsoft Excel 2010: Data Analysis and Business Modeling, the third edition of the book written by Wayne L. Winston, covering many data analysis and modeling techniques with a very clear problem-solution approach, including a good statistical explanation whenever it is necessary. I suggest this book as a good complement to our Microsoft PowerPivot for Excel 2010: Give Your Data Meaning!

    Read the article

  • REAL PRACTICES: Performance Scaling Microsoft SQL Server 2008 Analysis Services at Microsoft adCenter

    This white paper explains how Microsoft® adCenter implemented a Microsoft SQL Server® 2008 Analysis Services Scalable Shared Database on EMC® Symmetrix VMAX™ storage. Leveraging TimeFinder® clones and Enterprise Flash Drives with the read-only feature of SQL Server 2008 Analysis Services allowed adCenter to dramatically scale out OLAP while maintaining SLAs and decreasing system outages.

    Read the article

  • Deciphering a Search Engine Optimization Analysis

    A search engine optimization analysis is a tool that web developers can use to track how well their sites are showing up on the most popular search engines. There are several types of analysis software and services available that will give a good reading of your website's real optimization level. Ideally, a business website will be ranked near the top of search engine results, which will drive more traffic to the site.

    Read the article

  • VS2010 Custom Code Analysis Rule

    - by devlife
    I'm trying to write a custom FxCop rule for MSTest projects in VS2010. I'd like to debug it, but I keep getting an exception: when FxCop tries to load the DLL for the MSTest project, it fails, stating that it can't find a referenced assembly:

      Microsoft.FxCop.Common.AssemblyLoadException: Could not load C:\Users\Administrator\Documents\Visual Studio 2010\Projects\20100106-CodeAnalysisRulesBlogDemo\BlogDemo\TestProject1\bin\Debug\TestProject1.dll.
      Microsoft.FxCop.Sdk.InvalidMetadataException: The following error was encountered while reading module 'TestProject1': Assembly reference cannot be resolved: Microsoft.VisualStudio.QualityTools.UnitTestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a.

    Does anyone have any idea how to resolve this? If I just run the code analysis it works fine, but as soon as I try to debug it fails. Thanks

    Read the article

  • Static analysis of multiple if statements (conditions)

    - by koppernickus
    I have code similar to:

      if conditionA(x, y, z) then doA()
      else if conditionB(x, y, z) then doB()
      ...
      else if conditionZ(x, y, z) then doZ()
      else throw ShouldNeverHappenException

    I would like to validate two things (using static analysis):

      1. All conditions conditionA, conditionB, ..., conditionZ are mutually exclusive (i.e. it is not possible for two or more conditions to be true at the same time).
      2. All possible cases are covered, i.e. the "else throw" statement will never be reached.

    Could you recommend a tool and/or a way I could (easily) do this? I would appreciate more detailed information than "use Prolog" or "use Mathematica"... ;-)
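
    If the conditions can be written as SMT formulas over x, y, z, both checks reduce to satisfiability queries, which an SMT solver automates. A minimal sketch using Z3's Python bindings (the z3-solver package), with three invented conditions standing in for conditionA..conditionZ:

      from itertools import combinations
      from z3 import Int, And, Or, Not, Solver, unsat

      x = Int("x")  # stand-in for the real x, y, z
      conditions = [x < 0, x == 0, x > 0]  # invented in place of conditionA..Z

      def valid(claim):
          # A claim holds for all inputs iff its negation is unsatisfiable.
          s = Solver()
          s.add(Not(claim))
          return s.check() == unsat

      # 1. Mutual exclusivity: no two conditions can hold simultaneously.
      for a, b in combinations(conditions, 2):
          assert valid(Not(And(a, b))), f"overlap between {a} and {b}"

      # 2. Coverage: some condition always holds, so the 'else throw' is dead.
      assert valid(Or(*conditions)), "some input reaches the else branch"

      print("conditions are mutually exclusive and exhaustive")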

    Read the article
