Search Results

Search found 2628 results on 106 pages for 'lexical analysis'.


  • SQLAuthority News – Download Whitepaper – Enabling and Securing Data Entry with Analysis Services Writeback

    - by pinaldave
    SQL Server Analysis Services has many commonly requested features, and many of them already exist in the system. Secure data entry is a very important capability, and SSAS supports it through its writeback feature. Analysis Services is a tool for aggregating information and providing business users with the ability to analyze and support decision making in their business. By using the built-in writeback feature in Analysis Services, business users can also modify their data points to perform what-if analysis or supplement any existing data. The techniques described in this article derive from the author’s professional experience in the design and development of complex financial analysis applications used by various business groups in a large multinational company. Download Whitepaper: Enabling and Securing Data Entry with Analysis Services Writeback. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, T SQL, Technology

    Read the article

  • Why do some consider static analysis a form of testing and some do not?

    - by user970696
    While preparing for the ISTQB certification, I found that they actually call static analysis a form of static testing, while some engineering books distinguish between static analysis and testing, testing being the dynamic activity. I tend to think that static analysis is not testing in the true sense, since it does not test; it checks/verifies. But I would certainly love to hear the opinion of the true experts here. Thank you.
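
    A minimal illustration of the distinction, using a hypothetical Java method: a static analyzer can flag the defect below just by inspecting the source, whereas a dynamic test would have to execute the method with a crafted input to expose it.

        public class Divider {
            // A static analyzer can warn that 'divisor' may be zero on this path
            // without ever executing the code; a dynamic (unit) test would need
            // to actually call divide(10, 0) to observe the ArithmeticException.
            public static int divide(int dividend, int divisor) {
                return dividend / divisor; // possible division by zero
            }

            public static void main(String[] args) {
                System.out.println(divide(10, 2)); // dynamic check: runs the code
            }
        }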

    Read the article

  • central Apache log analysis of many hosts

    - by Jason Antman
    We have 30+ Apache httpd servers, and are looking to perform analysis on the logs both for historical trending and near "real time" monitoring/alerting. I'm mainly interested in things like error rates (4xx/5xx), response time, and overall request rate, but it would also be very useful to pull out more compute-intensive statistics like unique client IPs and user agents per unit of time. I'm leaning towards building this as a centralized collector/server/storage, and am also considering the possibility of storing non-Apache logs (i.e. general syslog, firewall logs, etc.) in the same system. Obviously a large part of this will probably have to be custom (at least the connection between pieces and the parsing/analysis we do), but I haven't been able to find much information on people who have done stuff like this, at least at shops smaller than Google/Facebook/etc. who can throw their log data into a hundred-node compute cluster and run Map/Reduce on it. The main things I'm looking for are:
    - All open source
    - Some way of collecting logs from the Apache machines that isn't too resource-intensive, and transports them relatively quickly over the network
    - Some way of storing them (NoSQL? key-value store?) on the backend, for a given amount of time (and then rolling them up into historical averages)
    - In the middle of this, a way of graphing in near-real-time (probably also with some statistical analysis on it) and hopefully alerting off of those graphs
    Any suggestions/pointers/ideas, either for "products"/projects or descriptions of how other people do this, would be greatly helpful. Unfortunately, we're not exactly a new-age-y devops shop: lots of old stuff, homogeneous infrastructure, and strained boxes.
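
    As a rough sketch of the custom parsing/analysis piece mentioned above, here is a hedged Java example that tallies 4xx/5xx error rates from Apache access log lines; it assumes the stock combined LogFormat, and the local file path is hypothetical (in the setup described, lines would arrive from the collector instead).

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        public class ApacheErrorRates {
            // Captures the status-code field of an Apache "combined" log line:
            // %h %l %u %t "%r" %>s %b "%{Referer}i" "%{User-agent}i"
            private static final Pattern LINE = Pattern.compile(
                    "^\\S+ \\S+ \\S+ \\[[^\\]]+\\] \"[^\"]*\" (\\d{3}) \\S+.*");

            public static void main(String[] args) throws IOException {
                long total = 0, clientErrors = 0, serverErrors = 0;
                for (String line : Files.readAllLines(Paths.get("access.log"))) {
                    Matcher m = LINE.matcher(line);
                    if (!m.matches()) continue;
                    total++;
                    int status = Integer.parseInt(m.group(1));
                    if (status >= 500) serverErrors++;
                    else if (status >= 400) clientErrors++;
                }
                if (total > 0) {
                    System.out.printf("requests=%d 4xx=%.2f%% 5xx=%.2f%%%n",
                            total, 100.0 * clientErrors / total,
                            100.0 * serverErrors / total);
                }
            }
        }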

    Read the article

  • EPM Architecture: Reporting and Analysis

    - by Marc Schumacher
    Reporting and Analysis is the basis for all Oracle EPM reporting components. Through the Java-based Reporting and Analysis web application deployed on WebLogic, it enables users to browse through reports for all kinds of Oracle EPM reporting components. Typical users access the web application by browser through Oracle HTTP Server (OHS). The Reporting and Analysis web application talks to the Reporting and Analysis Agent using the CORBA protocol on various ports. All communication to the repository databases (EPM System Registry and the Reporting and Analysis database) from the web and application layers is done using JDBC. As an additional data store, the Reporting and Analysis Agent uses the file system to store individual reports. While the reporting artifacts are stored on the file system, the folder structure and report-based security information is stored in the relational database. The file system can be either local or remote (e.g. network share, network file system). If an external user directory is used, the Reporting and Analysis services also communicate with this directory. The next post will cover WebAnalysis.

    Read the article

  • R and SPSS difference

    - by sfactor
    I will be analysing a vast amount of network-traffic-related data shortly, and I will pre-process the data in order to analyse it. I have found that R and SPSS are among the most popular tools for statistical analysis. I will also be generating quite a lot of graphs and charts. So I was wondering: what is the basic difference between these two packages? I am not asking which one is better; I just want to know the differences in workflow between the two, besides the fact that SPSS has a GUI. I will mostly be working with scripts in either case anyway, so I wanted to know about the other differences.

    Read the article

  • layout analysis of text-based PDF without OCR

    - by fastrack
    Before recognizing a PDF, OCR software does document layout analysis to determine which parts are text, tables, or images, as shown in the picture below. ![papercrop](http://cache.gawkerassets.com/assets/images/17/2011/07/papercrop.jpg) I want to use some parts of the text while leaving out the others, so having software that marks those zones comes in handy. Papercrop does a decent job, but it has a bug where it does not show some of the text in the PDF file. OCR software can also do layout analysis, marking out "zones" which I can add or delete, but you have to OCR to do that. Since my PDFs are already text-based, I don't want to waste so much time OCRing. So my question is: is there any software that automatically marks out those zones and lets me manually manipulate them, without having to OCR? Thanks! Waiting for your help.

    Read the article

  • Building a home cluster - hardware and cost analysis

    - by ldigas
    Does anyone know of links / books / anything you can think of that describe the process of building a little home cluster (when I say home, it doesn't necessarily mean for keeping at home - it just means relatively cheap and small) for experimental purposes, with a special emphasis on what hardware would be adequate today and some kind of cost analysis? And if someone here has done it, I'd appreciate any experience you can share.

    Read the article

  • Excel Free Text Survey Question Analysis

    - by joec
    I have to analyse a survey. The survey consists of some yes/no questions, some numeric questions, and some questions like the following (free text where respondents have entered multiple answers):
    Do you have any social networking accounts (Facebook, Twitter, Myspace, etc.)? Y / N
    If yes, which ones? _____________________________
    Respondent's answer: Facebook and Twitter
    How do I put these types of answers into Excel to gain some sort of useful analysis? Thanks. PS. I know Excel is not great for surveys, but I can't spend $1000 on SPSS or similar.
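
    One common approach, sketched here only as an illustration (the option list and input string are hypothetical, and this is a transformation you could equally do in Excel itself): split each free-text multi-select answer into binary indicator columns, one per known option, which can then be summed or pivoted.

        import java.util.List;
        import java.util.Locale;

        public class SurveyIndicatorColumns {
            // Each known answer option becomes one indicator column (1 = mentioned).
            private static final List<String> OPTIONS =
                    List.of("Facebook", "Twitter", "Myspace");

            // Emit a CSV row of 0/1 flags for one respondent's free-text answer.
            static String toIndicatorRow(String freeText) {
                String lower = freeText.toLowerCase(Locale.ROOT);
                StringBuilder row = new StringBuilder();
                for (String option : OPTIONS) {
                    if (row.length() > 0) row.append(',');
                    row.append(lower.contains(option.toLowerCase(Locale.ROOT)) ? 1 : 0);
                }
                return row.toString();
            }

            public static void main(String[] args) {
                System.out.println(String.join(",", OPTIONS)); // header row
                System.out.println(toIndicatorRow("Facebook and Twitter")); // 1,1,0
            }
        }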

    Read the article

  • Static analysis tool customization for any language

    - by Sam
    Hi, we are using a tool in our project. This tool has its own language, which is similar to Java. I am looking for a static analysis tool that can be applied to this new language. Are there any static analysis tools that can be customized for arbitrary languages? Or is there any document or reference on how to develop a static analysis tool for our own language? Thanks.

    Read the article

  • Static analysis framework for eclipse?

    - by autobiographer
    I just wanted to use Eclipse TPTP, a framework for static code analysis, but support for code analysis ended with TPTP 4.5.0.
    1. It seems that this version cannot be integrated into the current Eclipse Galileo. Am I right?
    2. Which language-independent framework for Eclipse would you use as an alternative to TPTP static analysis that works with Eclipse Galileo?

    Read the article

  • Analysis and Design for Functional Programming

    - by edalorzo
    How do you deal with the analysis and design phases when you plan to develop a system using a functional programming language like Haskell? My background is in imperative/object-oriented programming languages, and therefore I am used to use-case analysis and to using UML to document the design of a program. But UML is inherently tied to the object-oriented way of doing software, and I am intrigued by what would be the best way to develop documentation and define software designs for a system that is going to be developed using functional programming. Would you still use use-case analysis, or perhaps structured analysis and design instead? How do software architects define the high-level design of the system so that developers follow it? What do you show to your clients or to new developers when you are supposed to present a design of the solution? How do you document a picture of the whole thing without first having to write it all? Is there anything comparable to UML in the functional world?

    Read the article

  • Syntactical analysis with Flex/Bison part 2

    - by Imran
    Hello, I need help with Lex/Yacc programming. I wrote a compiler that performs syntactical analysis on inputs of many statements. Now I have a specific problem. For a given input, the compiler produces the right output: which statement is used, constant operator, or a jump instruction to which label. Now I have to handle if statements: the first command (the one before the else) must be output, and when the condition of the if holds, execution must jump to the end, because the command after the else isn't needed; after this jump, the second command must be output. Maybe an example shows what I mean:
    Input       adr.  Output
    if(x==0)    10    if(x==0)
    Wait 5      20    WAIT 5
    else        30    JMP 50
    Wait 1      40    WAIT 1
    end         50    END
    I have an idea: maybe I can do it with a special if statement like "IF exp jmp_stmt_end stmt_seq END". When the if statement appears in the input, the compiler has to recognize the end of the statement and, like the jmp_stmt in my compiler (you can download the files from http://bitbucket.org/matrix/changed-tiny), jump only to the end. I hope you understand my problem. Thanks.
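
    The usual technique for this is backpatching: emit the then-branch, reserve a slot for the forward JMP when the else is seen, and patch in the target address once the end of the statement is reached. Below is a minimal sketch in Java rather than Lex/Yacc, not tied to the asker's grammar; the instruction names and the address step of 10 just mirror the example above.

        import java.util.ArrayList;
        import java.util.List;

        public class IfElseBackpatch {
            private final List<String> code = new ArrayList<>();

            private int nextAddr() { return (code.size() + 1) * 10; } // 10, 20, 30, ...

            private int emit(String instr) {
                code.add(instr);
                return code.size() - 1; // index of the emitted instruction
            }

            // Emits: condition, then-branch, JMP <end>, else-branch, END
            void genIfElse(String cond, String thenStmt, String elseStmt) {
                emit("if(" + cond + ")");
                emit(thenStmt);
                int jmpSlot = emit("JMP ?");         // target unknown yet: reserve slot
                emit(elseStmt);
                int endAddr = nextAddr();
                emit("END");
                code.set(jmpSlot, "JMP " + endAddr); // backpatch the forward jump
            }

            public static void main(String[] args) {
                IfElseBackpatch g = new IfElseBackpatch();
                g.genIfElse("x==0", "WAIT 5", "WAIT 1");
                for (int i = 0; i < g.code.size(); i++)
                    System.out.println((i + 1) * 10 + " " + g.code.get(i));
            }
        }

    Running this prints exactly the five lines of the example table, including "30 JMP 50".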

    Read the article

  • Theory: "Lexical Encoding"

    - by _ande_turner_
    I am using the term "Lexical Encoding" for my lack of a better one. A Word is arguably the fundamental unit of communication as opposed to a Letter. Unicode tries to assign a numeric value to each Letter of all known Alphabets. What is a Letter to one language is a Glyph to another. Unicode 5.1 currently assigns more than 100,000 values to these Glyphs. Out of the approximately 180,000 Words being used in Modern English, it is said that with a vocabulary of about 2,000 Words you should be able to converse in general terms. A "Lexical Encoding" would encode each Word, not each Letter, and encapsulate them within a Sentence.
    // A simplified example of a "Lexical Encoding"
    String sentence = "How are you today?";
    int[] encoded = { 93, 22, 14, 330, QUERY };
    In this example each Token in the String was encoded as an Integer. The Encoding Scheme here simply assigned an int value based on a generalised statistical ranking of word usage, and assigned a constant to the question mark. Ultimately, a Word has both a Spelling & Meaning, though. Any "Lexical Encoding" would preserve the meaning and intent of the Sentence as a whole, and not be language-specific. An English sentence would be encoded into "...language-neutral atomic elements of meaning..." which could then be reconstituted into any language with a structured Syntactic Form and Grammatical Structure. What are other examples of "Lexical Encoding" techniques? If you are interested in where the word-usage statistics come from: http://www.wordcount.org

    Read the article

  • Minimal set of critical database operations

    - by Juan Carlos Coto
    In designing the data layer code for an application, I'm trying to determine if there is a minimal set of database operations (both single and combined) that are essential for proper application function (i.e. the database is left in an expected state after every data access call). Is there a way to determine the minimal set of database operations (functions, transactions, etc.) that are critical for an application to function correctly? How do I find it? Thanks very much!
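
    One concrete way to read "the database is left in an expected state after every data access call" is to make each critical operation transactional, so it either completes fully or leaves no trace. A hedged JDBC sketch of that idea; the accounts table, column names, and connection URL are hypothetical.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;

        public class TransferDao {
            // One critical operation: move funds atomically between two accounts.
            // Either both UPDATEs commit, or the rollback leaves the state untouched.
            public void transfer(String url, long fromId, long toId, long cents)
                    throws SQLException {
                try (Connection conn = DriverManager.getConnection(url)) {
                    conn.setAutoCommit(false); // group both statements into one unit
                    try (PreparedStatement debit = conn.prepareStatement(
                                 "UPDATE accounts SET balance = balance - ? WHERE id = ?");
                         PreparedStatement credit = conn.prepareStatement(
                                 "UPDATE accounts SET balance = balance + ? WHERE id = ?")) {
                        debit.setLong(1, cents);
                        debit.setLong(2, fromId);
                        credit.setLong(1, cents);
                        credit.setLong(2, toId);
                        debit.executeUpdate();
                        credit.executeUpdate();
                        conn.commit();   // expected state: both applied
                    } catch (SQLException e) {
                        conn.rollback(); // expected state: neither applied
                        throw e;
                    }
                }
            }
        }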

    Read the article

  • SQLAuthority News – Download Whitepaper – SQL Server 2008 R2 Analysis Services Operations Guide

    - by pinaldave
    SQL Server Analysis Services (SSAS) has always been an interesting subject for research. Analysis Services cubes are a very powerful tool in the hands of the business intelligence (BI) developer. They provide an easy way to expose even large data models directly to business users. Microsoft has published a very informative white paper, the Analysis Services Operations Guide, authored by Thomas Kejser, John Sirmon, and Denny Lee. In this guide you will find information on how to test and run Microsoft SQL Server Analysis Services in SQL Server 2005, SQL Server 2008, and SQL Server 2008 R2 in a production environment. The focus of this guide is how you can test, monitor, diagnose, and remove production issues on even the largest scaled cubes. The paper also provides guidance on how to configure the server for the best possible performance. It is the goal of this guide to make your operations processes as painless as possible, and to have you run with the best possible performance without any additional development effort to your deployed cubes. In this guide, you will learn how to get the best out of your existing data model by making changes transparent to the data model and by making configuration changes that improve the user experience of the cube. Download SQL Server 2008 R2 Analysis Services Operations Guide. Note: Abstract taken from the white paper. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, SQLAuthority News, T SQL, Technology

    Read the article

  • Requirements Analysis in Game Development?

    - by Joey Green
    I'm a software engineering student with a focus on game development, and I am wondering how big a part requirements analysis plays in game development. I'm asking because there is a class being offered that I could take; it is all about requirements analysis. Here is a description: "An in-depth study of current research and practice in requirements elicitation, requirements analysis, requirements specification, requirements verification and validation, and requirements management." Would this type of knowledge be useful for an independent game developer?

    Read the article

  • SSDT gotcha – Moving a file erases code analysis suppressions

    - by jamiet
    I discovered a little wrinkle in SSDT today that is worth knowing about if you are managing your database schemas using SSDT. In short, if a file is moved to a different folder in the project then any code analysis suppressions that reference that file will disappear from the suppression file. This makes sense if you think about it, because the paths stored in the suppression file are no longer valid, but you probably won't be aware of it until it happens to you. If you don't know what code analysis is or you don't know what the suppression file is then you can probably stop reading now, otherwise read on for a simple short demo. Let's create a new project and add a stored procedure to it called sp_dummy. Naming stored procedures with an sp_ prefix is generally frowned upon, and hence SSDT static code analysis will look for occurrences of this and flag them. So, the next thing we need to do is turn on static code analysis in the project properties. A subsequent build causes a code analysis warning, as we might expect. Let's suppose we actually don't mind stored procedures with sp_ prefixes; we can just right-click on the message to suppress it and get rid of it. That causes a suppression file to get created in our project. Notice that the suppression file contains a relative path to the file that has had the suppression placed upon it. Now, if we simply move the file within our project to a new folder, notice that the suppression that we just created gets removed from the suppression file. As I alluded to above, this behaviour is intuitive because the path originally stored in the suppression file is no longer relevant, but you're probably not going to be aware of it until it happens to you and messages that you thought you had suppressed start appearing again. Definitely one to be aware of. @Jamiet

    Read the article

  • Keyword Analysis Tools

    One of the most essential free webmaster tools is a great analytics program. Free website analysis for websites and blogs is vital for success and involves plenty of capabilities, like traffic analysis. At a minimum, free website analysis must show which pages are viewed the most.

    Read the article

  • Windows Azure Platform TCO/ROI Analysis Tool

    - by kaleidoscope
    Microsoft has released a tool to help you figure out how much money you can save by switching to Windows Azure from your on-premises solution. The tool will provide you with a customized estimate of the potential cost savings you (or your company or organization) may achieve by building on the Windows Azure Platform. Upon completion of the TCO and ROI Calculator profile analysis, you will be presented with a detailed report that shows estimated line-item costs for an accurate TCO and a 1-to-3-year ROI analysis for you or your company or organization. You should not interpret the analysis report you receive as part of this process to be a commitment on the part of Microsoft, and Microsoft makes no guarantees regarding the accuracy of any information presented in the report. You should not view the results of this report as a substitute for engaging a third-party expert to independently evaluate your or your company's specific computing needs. The analysis report you will receive is for informational purposes only. For more information check this link. Geeta, G

    Read the article

  • What follows after lexical analysis?

    - by madflame991
    I'm working on a toy compiler (for a simple language like PL/0) and I have my lexer up and running. At this point I should start working on building the parse tree, but before I start I was wondering: how much information can one gather from just the string of tokens? Here's what I've gathered so far:
    - Syntax highlighting can already be done with only the list of tokens: numbers, operators, and keywords get coloured accordingly.
    - Autoformatting (indenting) should also be possible. How? Specify for each token type how many white spaces or newline characters should follow it. Also, modify an alignment variable as you print tokens (when the code printer reads "{", increment the alignment variable by 1, and decrement it by 1 for "}"; whenever it starts printing on a new line, the code printer indents according to this alignment variable). See the sketch at the end of this post.
    - In languages without nested subroutines, one can get a complete list of subroutines and their signatures. How? Just read what follows the "procedure" or "function" keyword until you hit the first ")" (this should work fine in a Pascal-like language with no nested subroutines).
    - In languages like Pascal you can even determine local variables and their types, as they are declared in a special place (OK, you can't handle initialization as well, but you can parse sequences like "var a, b, c: integer").
    - Detection of recursive functions may also be possible, or even a graph representation of which subroutine calls which: if one can identify the body of a function, then one can also search it for mentions of other functions' names.
    - Gathering statistics about the code, like the number of lines, instructions, and subroutines.
    EDIT: I clarified why I think some processes are possible. As I read comments and responses, I realise that the answer depends very much on the language that I'm parsing.
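
    A minimal sketch of the token-only autoformatter described in the indenting bullet above, assuming a trivially pre-tokenized input; real token types and per-token spacing rules would come from the actual lexer.

        import java.util.List;

        public class TokenIndenter {
            // Re-prints a token stream with brace-driven indentation, exactly as
            // described above: "{" bumps the alignment variable, "}" drops it.
            static String format(List<String> tokens) {
                StringBuilder out = new StringBuilder();
                int align = 0;
                for (String tok : tokens) {
                    switch (tok) {
                        case "{" -> { out.append("{\n"); align++; indent(out, align); }
                        case "}" -> {
                            trimTrailingSpaces(out);
                            align--;
                            indent(out, align);
                            out.append("}\n");
                            indent(out, align);
                        }
                        case ";" -> { trimTrailingSpaces(out); out.append(";\n"); indent(out, align); }
                        default -> out.append(tok).append(' ');
                    }
                }
                return out.toString();
            }

            private static void indent(StringBuilder out, int level) {
                out.append("    ".repeat(Math.max(0, level)));
            }

            private static void trimTrailingSpaces(StringBuilder out) {
                while (out.length() > 0 && out.charAt(out.length() - 1) == ' ')
                    out.setLength(out.length() - 1);
            }

            public static void main(String[] args) {
                // Prints: while ( x > 0 ) {\n    x = x - 1;\n}
                System.out.print(format(List.of(
                        "while", "(", "x", ">", "0", ")", "{",
                        "x", "=", "x", "-", "1", ";", "}")));
            }
        }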

    Read the article

  • SEO Secrets - On-Page Competition Analysis

    SEO competition analysis, in order to be complete, must always be undertaken in two equally important steps. Step One is the Off-Page Competition Analysis. Step Two is the On-Page Competition Analysis. This article will cover the second step, in the context of assessing your competitors' websites. On-Page Elements: as evident from the term, these are the website qualities that can be found within the web pages of your competitors' sites.

    Read the article

  • FxCop / Code Analysis with VS2010 Ultimate

    - by Cuartico
    I've been getting some information about this, but I still can't find a proper answer. I was recently asked in my company to "run an FxCop analysis on that code and tell me the results". OK, I have VS2010 Ultimate, which has Code Analysis, but before making any comment I browsed the internet, because I want to implement the best choice. So, let's say I'm going to use the same rules on both analyzers:
    - Should I recommend using one over the other?
    - Should I say, "Hey, that's kind of old, let's use Code Analysis!"?
    - Should I get the same results on different computers? (From what I understand, FxCop gives you some "points", and from what I've read it sometimes gives you different points on different computers; I don't know whether the same holds for Code Analysis.)
    Thanks, any help would be appreciated.

    Read the article
