Search Results

Search found 2936 results on 118 pages for 'logfile analysis'.

  • Write DAX queries in Report Builder #ssrs #dax #ssas #tabular

    - by Marco Russo (SQLBI)
    If you use Report Builder with Reporting Services, you can use DAX queries even though the query editor for the Analysis Services provider does not support DAX syntax. In fact, the DMX editor that you can use in the Visual Studio editor for Reporting Services (see a previous post on that) is not available in Report Builder. However, as Sagar Salvi commented in this Microsoft Connect entry, you can use DAX query text in the query of a Dataset by using the OLE DB provider instead of the Analysis Services one. I think it’s a good idea to show the steps required. First, create a data source using the OLE DB connection type, and provide in the connection string the provider (Provider), the server name (Data Source) and the database name (Initial Catalog), such as: Provider=MSOLAP;Data Source=SERVERNAME\TABULAR;Initial Catalog=AdventureWorks Tabular Model SQL 2012 Then, create a Dataset using the data source previously defined, select the Text query type, and write the DAX code in the Query pane. You can also use the Query Designer window, which doesn’t provide any particular help in writing the DAX query, but at least it can show a preview of the query results. I hope DAX will get better editors in the future… in the meantime, remember you can use DAX Studio to write and test your DAX queries, and DAX Formatter to improve their readability! If you want to learn the DAX Query Language, I suggest you watch my video Data Analysis Expressions as a Query Language on Project Botticelli!
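    For illustration, a query like the following could be pasted into the dataset's Text query (a minimal sketch; the table and column names assume the Adventure Works tabular sample and may differ in your model):

        EVALUATE
        SUMMARIZE (
            'Internet Sales',
            'Date'[Calendar Year],
            "Sales Amount", SUM ( 'Internet Sales'[Sales Amount] )
        )
        ORDER BY 'Date'[Calendar Year]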

    Read the article

  • Best tools for "ssh tail -f" style log file monitoring and analysis

    - by dougnukem
    I'm looking for a tool to monitor custom PHP error logs, Apache logs and possibly Java logs on remote development servers. I'm not looking for a full production log system like Splunk, but something that's a little more flexible than an ssh terminal doing a "tail -f". Perhaps something that will:
      * Monitor multiple log files to my local machine for searching/analysis later
      * Allow "alerts" when certain strings appear in the log
      * Provide some kind of tabbed/dashboard view of the multiple logs being monitored (in total fewer than 10 logs)

    Read the article

  • VS2010 / Code Analysis: Turn off a rule for a project without custom ruleset....

    - by TomTom
    ...any change? The scenario is this: for our company we develop a standard for how code should look. This will be the full MS rule set as it stands now. For some specific projects we may want to turn off specific rules, simply because for a specific project this is a "known exception". Example? CA1026 - while perfectly OK in most cases, there are 1-2 specific libraries we don't want to change. We also want to avoid having a custom rule set. OTOH, putting a suppress attribute on every occurrence gets pretty convoluted pretty fast. Is there any way to turn off a code analysis warning for a complete assembly without a custom rule set? We'd rather have that in a specific file (GlobalSuppressions.cs) than in a rule set, for maintenance reasons and to be more explicit ;)
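    For reference, module-level suppressions in a GlobalSuppressions.cs file take roughly the following shape (a sketch only: the namespace target is hypothetical, and whether classic FxCop honours a namespace-wide scope for a member-level rule like CA1026 is an assumption to verify against your FxCop version):

        // GlobalSuppressions.cs - module-level suppressions live outside any type.
        using System.Diagnostics.CodeAnalysis;

        [assembly: SuppressMessage(
            "Microsoft.Design",
            "CA1026:DefaultParametersShouldNotBeUsed",
            Scope = "namespace",
            Target = "MyCompany.LegacyLibrary",          // hypothetical namespace
            Justification = "Known exception for this library")]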

    Read the article

  • Why do I get Code Analysis CA1062 on an out parameter in this code?

    - by brickner
    I have a very simple piece of code (simplified from the original, so I know it's not very clever) that, when I compile it in Visual Studio 2010 with Code Analysis, gives me warning CA1062: Validate arguments of public methods. public class Foo { protected static void Bar(out int[] x) { x = new int[1]; for (int i = 0; i != 1; ++i) x[i] = 1; } } The warning I get: CA1062 : Microsoft.Design : In externally visible method 'Foo.Bar(out int[])', validate local variable '(*x)', which was reassigned from parameter 'x', before using it. I don't understand why I get this warning or how I can resolve it without suppressing it. Can new return null? Is this a Visual Studio 2010 bug?
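    One workaround that is often suggested for this pattern (a sketch only, under the assumption that the analyzer objects to the out parameter being used after reassignment rather than to the allocation itself) is to build the result in a local and assign the out parameter exactly once:

        public class Foo
        {
            protected static void Bar(out int[] x)
            {
                // Build into a local first; the out parameter is assigned once at the
                // end, so the analyzer never sees it dereferenced after reassignment.
                int[] result = new int[1];
                for (int i = 0; i != 1; ++i)
                    result[i] = 1;
                x = result;
            }
        }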

    Read the article

  • What kind of IP is this in my Google App Engine log file?

    - by Christian Harms
    I get many normal log lines in my Google App Engine application. But today I got this instead of the usual 4-part dotted number: 2a01:e35:2f20:f770:6c54:3ee8:67fb:df8 What format is this? IPv6 addresses are six numbers, MAC addresses too... Normal logfile line: 187.14.44.208 - - [19/Mar/2010:14:31:35 -0700] "GET /geo_data.js HTTP/1.1" 200 776 "http://www.xxx.com.br/spl19/index.php?refid=gv_av_ri" "Mozilla/5.0 (Windows; U; Windows NT 5.1; pt-BR; rv:1.9.2) Gecko/20100115 Firefox/3.6 (.NET CLR 3.5.30729),gzip(gfe)" The special logfile line: 2a01:e35:2f20:f770:6c54:3ee8:67fb:df8 - - [18/Mar/2010:17:00:37 -0700] "GET /geo_data.js HTTP/1.1" 500 450 "http://www.xxx.com.br/spl19/index.php?refid=cm_av_ri" "Mozilla/5.0 (Windows; U; Windows NT 6.1; pt-PT; rv:1.9.2) Gecko/20100115 Firefox/3.6,gzip(gfe)"

    Read the article

  • When to stop following the advice of static code analysis?

    - by bananeweizen
    I have been using static code analysis on a project with more than 100,000 lines of Java code for quite a while now. I started with FindBugs, which gave me around 1500 issues at the beginning. I fixed the most severe ones over time and started using additional tools like PMD, Lint4J, JNorm and now Enerjy. With the more severe issues fixed, there is a huge number of low-severity issues left. How do you handle these low-priority issues? Do you try to fix all of them, or only those in newly written code? Do you regularly disable certain rules? (I found that I do with nearly all of the available tools.) And if you ignore or disable rules, do you document them? What do your managers say about "leaving a few thousand low-priority issues unfixed"? Do you use (multiple) tool-specific comments in the code, or is there a better way?

    Read the article

  • How can I load an MP3 or similar music file for display and analysis in wxWidgets?

    - by Jon Cage
    I'm developing a GUI in wxPython which allows a user to generate sequences of colours for some toys I'm building. Part of the program needs to load an MP3 (and potentially other formats further down the line) and display it to the user. That should be sufficient to get started but later I'd like to add features like identifying beats and some crude frequency analysis. Is there any simple way of loading / understanding an MP3's contents to display a plot of its amplitudes to the screen using wxWidgets? I later intend to port to C++/wxWidgets for speed and to avoid having to distribute wxPython.

    Read the article

  • Rapid spectral analysis of audio file using Python 2.6?

    - by Ephemeralis
    What I want to do is have a subroutine that analyses every 200 milliseconds of a sound file it is given and spits out the frequency intensity value (from 0 to 1 as a float) of a specific frequency range into an array which I later save. This value then goes on to be used as the opacity value for a graphic which is supposed to 'strobe' to the audio file. The problem is, I have never ventured into audio analysis before and have no clue where to start. I have looked at pymedia and scipy/numpy, thinking I would be able to use an FFT to achieve this, but I am not really sure how I would manipulate this data to end up with the desired result. The documentation on the SpectrAnalyzer class of pymedia is virtually non-existent, and the examples on the website do not actually work with the latest release of the library - which isn't exactly making my life easier. How would I go about starting this project? I am at a complete loss as to what libraries I should even be using.
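    As a starting point, the windowed-FFT part can be done with numpy alone. Below is a rough sketch that assumes the audio has already been decoded to a 16-bit mono WAV file (an MP3 would first need decoding, e.g. with pymedia or an external tool) and uses a deliberately crude band-energy-over-total-energy ratio as the 0-1 intensity:

        import wave
        import numpy as np

        def band_intensities(path, window_ms=200, lo_hz=60, hi_hz=250):
            # Read raw 16-bit mono samples from a WAV file.
            wav = wave.open(path, 'rb')
            rate = wav.getframerate()
            samples = np.frombuffer(wav.readframes(wav.getnframes()),
                                    dtype=np.int16).astype(np.float64)
            wav.close()

            window = int(rate * window_ms / 1000.0)
            intensities = []
            for start in range(0, len(samples) - window, window):
                chunk = samples[start:start + window]
                spectrum = np.abs(np.fft.rfft(chunk))
                # Frequency of each rfft bin: k * rate / n.
                freqs = np.arange(len(spectrum)) * (float(rate) / len(chunk))
                band = spectrum[(freqs >= lo_hz) & (freqs <= hi_hz)].sum()
                total = spectrum.sum()
                intensities.append(float(band / total) if total else 0.0)
            return intensities  # one 0-1 value per 200 ms window

    Each returned value could then be saved and reused as the opacity for the strobing graphic; the normalisation is only one of several possible choices.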

    Read the article

  • Java static source analysis/parsing (possibly with antlr), what is a good tool to do this?

    - by Berlin Brown
    I need to perform static source analysis on Java code. Ideally, I want the system to work out of the box without much modification from me. For example, I have used Antlr in the past, but I spent a lot of time building grammar files and still didn't get what I wanted. I want to be able to parse a Java file and have it return the character positions of, say: the start and end of a Java block comment; the start and end of a Java class file; the start and end of a Java method declaration, signature, and implementation. It looks like Antlr will do that, but I have yet to finish a grammar that actually gives me the positions of the code I need. Does anyone have a complete Antlr grammar and Java code that gives the character positions of these parts of the Java source?
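    In case it helps frame the problem, the same offsets can be obtained without writing a grammar at all by using a ready-made Java parser. The sketch below uses the Eclipse JDT core DOM (an alternative to Antlr, suggested here rather than taken from the question); every AST node exposes getStartPosition() and getLength():

        import org.eclipse.jdt.core.dom.AST;
        import org.eclipse.jdt.core.dom.ASTParser;
        import org.eclipse.jdt.core.dom.ASTVisitor;
        import org.eclipse.jdt.core.dom.CompilationUnit;
        import org.eclipse.jdt.core.dom.MethodDeclaration;

        public class PositionDumper {
            public static void dump(String source) {
                ASTParser parser = ASTParser.newParser(AST.JLS3);
                parser.setKind(ASTParser.K_COMPILATION_UNIT);
                parser.setSource(source.toCharArray());
                CompilationUnit unit = (CompilationUnit) parser.createAST(null);

                unit.accept(new ASTVisitor() {
                    @Override
                    public boolean visit(MethodDeclaration node) {
                        // Character offsets into the original source string.
                        int start = node.getStartPosition();
                        int end = start + node.getLength();
                        System.out.println("method " + node.getName() + ": " + start + ".." + end);
                        return true;
                    }
                });
                // Block and line comments are not visited by default; they are
                // available with their positions via unit.getCommentList().
            }
        }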

    Read the article

  • Cannot connect to a SQL Server 2005 Analysis Services cube after installing SQL Server 2008 SP1.

    - by Luc
    I've been developing an application that talks directly to an SSAS 2005 OLAP cube. Note that I also have SQL Server 2008 installed, so the other day I did a Windows Update and decided to include SQL Server 2008 SP1 in my update. After doing that, my SSAS 2005 cube is no longer accessible from my application. I'm able to browse the data just fine within SQL Server 2005 BI Studio Manager, but I'm not able to connect to the cube from my application. Here is my connection string that used to work: Data Source=localhost;Provider=msolap;Initial Catalog=Adventure Works DW Here is the error message I get: Either the user, [Server]/[User], does not have access to the Adventure Works DW database, or the database does not exist. Here is the beginning of my stack trace if it would help: Microsoft.AnalysisServices.AdomdClient.AdomdErrorResponseException was unhandled by user code HelpLink="" Message="Either the user, Luc-PC\\Luc, does not have access to the Adventure Works DW database, or the database does not exist." Source="Microsoft SQL Server 2005 Analysis Services" ErrorCode=-1055391743 StackTrace: at Microsoft.AnalysisServices.AdomdClient.AdomdConnection.XmlaClientProvider.Microsoft.AnalysisServices.AdomdClient.IDiscoverProvider.Discover(String requestType, IDictionary restrictions, DataTable table) at Microsoft.AnalysisServices.AdomdClient.ObjectMetadataCache.Discover(AdomdConnection connection, String requestType, ListDictionary restrictions, DataTable destinationTable, Boolean doCreate) at Microsoft.AnalysisServices.AdomdClient.ObjectMetadataCache.PopulateSelf() at Microsoft.AnalysisServices.AdomdClient.ObjectMetadataCache.Microsoft.AnalysisServices.AdomdClient.IObjectCache.Populate() at Microsoft.AnalysisServices.AdomdClient.CacheBasedNotFilteredCollection.PopulateCollection() at Microsoft.AnalysisServices.AdomdClient.CacheBasedNotFilteredCollection.get_Count() at Microsoft.AnalysisServices.AdomdClient.CubesEnumerator.MoveNext() at Microsoft.AnalysisServices.AdomdClient.CubeCollection.Enumerator.MoveNext() at blah blah... I've looked for a solution for the last 4+ hours and haven't had any success. Thanks in advance for any help. Luc

    Read the article

  • Looking for an open source real-time network analysis program

    - by JrSysAdmin
    Can somebody recommend an open source real-time network analysis program? What I'm looking for is a program that displays a graph of bandwidth usage by IP within our internal network and that can quickly be viewed any time we need it (typically when we want to quickly find out who is using large amounts of bandwidth and slowing down the network). Ideally we simply want to hook up a monitor on the wall of our server room to a system whose NIC will be in promiscuous mode, logging all network activity in a visual form that can easily be seen and that runs 24/7. I prefer open source as I have no budget for this project and prefer open source projects in general. I'd also prefer this to be available for CentOS, but any Linux distro or Windows OS would be acceptable. Thanks!

    Read the article

  • Eclipse CDT code analysis thinks size_t is ambiguous

    - by Chris
    It does, after all, get defined in both stddef.h and c++config.h: c++config.h: namespace std { typedef __SIZE_TYPE__ size_t; typedef __PTRDIFF_TYPE__ ptrdiff_t; #ifdef __GXX_EXPERIMENTAL_CXX0X__ typedef decltype(nullptr) nullptr_t; #endif } stddef.h: #define __SIZE_TYPE__ long unsigned int So when a file contains using namespace std, the Eclipse CDT code analysis gets confused and says the symbol is ambiguous. I don't know how gcc works around this, but does anybody have any suggestions on what to do for the Eclipse code analysis?

    Read the article

  • C++ Professional Code Analysis Tools

    - by Voulnet
    Hello there, I would like to ask about the available (free or not) static and dynamic code analysis tools that can be used for C++ applications, ESPECIALLY COM and ActiveX. I am currently using Visual Studio's /analyze compiler option, which is good and all, but I still feel there is a lot more analysis to be done. I'm talking about a C++ application where memory management and code security are of utmost importance.

    Read the article

  • Principal component analysis in C#

    - by vj4u
    Hi, I'm presently working with data in text files. I need to use an algorithm called principal component analysis, so I have counted the words in each text file that occur more than once - for example, 'relation' occurred more than once, 'help' occurred 6 times, 'between' occurred 3 times, 'analysis' occurred 4 times, 'component' occurred 5 times and 'present' occurred 6 times. From the counts of these distinct words I need to form an m x n matrix in C#. Help me, it's a bit urgent for me.
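    As a sketch of the matrix-building step only (the class and method names are made up for illustration; the PCA itself would then run on the resulting matrix), one way to turn per-document word lists into an m x n term-by-document count matrix in C# is:

        using System.Collections.Generic;
        using System.Linq;

        static class TermMatrix
        {
            // documents: one array of (already tokenised) words per text file.
            // Returns an m x n matrix: m distinct terms (rows) x n documents (columns).
            public static int[,] Build(IList<string[]> documents, out string[] terms)
            {
                terms = documents.SelectMany(d => d).Distinct().OrderBy(t => t).ToArray();
                var rowOf = new Dictionary<string, int>();
                for (int i = 0; i < terms.Length; i++)
                    rowOf[terms[i]] = i;

                var counts = new int[terms.Length, documents.Count];
                for (int col = 0; col < documents.Count; col++)
                    foreach (string word in documents[col])
                        counts[rowOf[word], col]++;
                return counts;
            }
        }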

    Read the article

  • SSAS Maestro Training in July 2012 #ssasmaestro #ssas

    - by Marco Russo (SQLBI)
    A few hours ago Chris Webb blogged about SSAS Maestro and I’d like to propagate the news, adding also some background info. SSAS Maestro is the premier certification on Analysis Services that selects the best experts in Analysis Services around the world. In 2011 Microsoft organized two rounds of training/exams for SSAS Maestros and up to now only 11 people from the first wave have been announced – around 10% of attendees of the course! In the next few days the new Maestros from the second round should be announced and this long process is caused by many factors that I’m going to explain. First, the course is just a step in the process. Before the course you receive a list of topics to study, including the slides of the course. During the course, students receive a lot of information that might not have been included in the slides and the best part of the course is class interaction. Students are expected to bring their experience to the table and comparing case studies, experiences and having long debates is an important part of the learning process. And it is also a part of the evaluation: good questions might be also more important than good answers! Finally, after the course, students have their homework and this may require one or two months to be completed. After that, a long (very long) evaluation process begins, taking into account homework, labs, participation… And for this reason the final evaluation may arrive months later after the course. We are going to improve and shorten this process with the next courses. The first wave of SSAS Maestro had been made by invitation only and now the program is opening, requiring a fee to participate in order to cover the cost of preparation, training and exam. The number of attendees will be limited and candidates will have to send their CV in order to be admitted to the course. Only experienced Analysis Services developers will be able to participate to this challenging program. So why you should do that? Well, only 10% of students passed the exam until now. So if you need 100% guarantee to pass the exam, you need to study a lot, before, during and after the course. But the course by itself is a precious opportunity to share experience, create networking and learn mission-critical enterprise-level best practices that it’s hard to find written on books. Oh, well, many existing white papers are a required reading *before* the course! The course is now 5 days long, and every day can be *very* long. We’ll have lectures and discussions in the morning and labs in the afternoon/evening. Plus some more lectures in one or two afternoons. A heavy part of the course is about performance optimization, capacity planning, monitoring. This edition will introduce also Tabular models, and don’t expect something you might find in the SSAS Tabular Workshop – only performance, scalability monitoring and optimization will be covered, knowing Analysis Services is a requirement just to be accepted! I and Chris Webb will be the teachers for this edition. The course is expensive. Applying for SSAS Maestro will cost around 7000€ plus taxes (reduced to 5000€ for students of a previous SSAS Maestro edition). And you will be locked in a training room for the large part of the week. So why you should do that? Well, as I said, this is a challenging course. You will not find the time to check your email – the content is just too much interesting to think you can be distracted by something else. Another good reason is that this course will take place in Italy. 
Well, the course will take place in the brand new Microsoft Innovation Campus, but in general we’ll be able to provide you hints to get great food and, if you are willing to attach one week-end to your trip, there are plenty of places to visit (and I’m not talking about the classic Rome-Florence-Venice) – you might really need to relax after such a week! Finally, the marking process after the course will be faster – we’d like to complete the evaluation within three months after the course, considering that 1-2 months might be required to complete the homework. If at this point you are not scared: registration will open in mid-April, but you can already write to [email protected] sending your CV/resume and a short description of your level of SSAS knowledge and experience. The selection process will start early and you may want to put your admission form on top of the FIFO queue!

    Read the article

  • Need explanation of a theorem from the book [closed]

    - by Pradeep
    I need some explanation of amortized analysis with respect to the analysis of algorithms, and in particular of one of the theorems attached. Explanation needed: 1. How did the author arrive at M_ij being O(i_j - i_(j-1))? 2. I need an explanation of this quote from the book: "because at most i_j - i_(j-1) - 1 elements have been added into the table since the clear operation M_(i_(j-1)) or since the beginning of the series." 3. Also, what does the summation equation mean? I need a more thorough explanation and the essence of the theorem. (The attached scan of the page from the book has been removed.)

    Read the article

  • Crash dump analysis

    - by Ryan Ries
    I hope this isn't a stupid question, and if it is, then I want to at least get it over with so I don't feel so dumb in the future. Here we are, loading up a Windows crash dump with Windbg. Here are the first few lines of the debugger output: 0: kd> .dumpdebug ----- 64 bit Kernel Summary Dump Analysis DUMP_HEADER64: MajorVersion 0000000f MinorVersion 00001db1 ... The MinorVersion I mostly understand. It's hexadecimal and it translates to 7601 in decimal. Windows admins would already be able to tell from that that this must be either a Win7 x64 machine or a 2k8 R2 machine with SP1. But isn't 7601 the build number? It's supposed to be Major.Minor.Build/Revision... right? Also I don't understand the MajorVersion. It should be 6. This version of Windows is 6. But isn't 0000000f in hexadecimal 15 in decimal? The full version string of this version of Windows, when you launch the Command Prompt for instance, is 6.1.7601. If 7601 is the MinorVersion, then what is 1 and what is 6? And why does the crash dump say 0F?

    Read the article

  • Importing Analysis Services 2008 KPI's in a PerformancePoint scorecard

    - by Colin
    I am trying to import a KPI from Analysis Services into a PerformancePoint Scorecard, and when I do, The Dashboard Designer throws an error: An unknown error has occurred. If the problem persists contact an administrator. There may be additional information in the server application event log. When I examine the event log, I find the following exception: System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.AnalysisServices, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. File name: 'Microsoft.AnalysisServices, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' at Microsoft.PerformancePoint.Scorecards.Server.ImportExportHelper.GetImportableAsKpis(IBpm pmService, DataSource asDataSource) at Microsoft.PerformancePoint.Scorecards.Server.PmServer.GetImportableAsKpis(DataSource dataSource) I have found this thread which recommends reinstalling Microsoft ADOMD.NET but the installer for that won't run because the server already has a newer version of the product (The server is running SQL Server Analysis Services 2008 which includes Microsoft.AnalysisServices.AdomdClient.dll version 9.0.3042.0) Anyone have any ideas (short of finding the DLL myself and manually installing it to the GAC)?

    Read the article

  • Java library for HTML analysis

    - by Raj
    Hi, (I've seen similar questions, but I think none of them cater to my specific needs, hence...) I would like to know if there is a Java library for analysis of real-world (read: incomplete, ill-formed) HTML. By analysis, I mean things like: figuring out the most prominent color in an HTML chunk; changing that color to some other color (hence, it has to support modification of the HTML as well); pruning out unwanted tags; fixing up the HTML to result in a well-formed HTML snippet. Parts of the last two are done by libraries such as Jericho and jTidy. 'Plugins' on top of these would be great. Thanks in advance!

    Read the article

  • Exclude complete namespace from FxCop code analysis?

    - by hangy
    Is it possible to exclude a complete namespace from all FxCop analysis while still analyzing the rest of the assembly using the SuppressMessageAttribute? In my current case, I have a bunch of classes generated by LINQ to SQL which cause a lot of FxCop issues, and obviously, I will not modify all of those to match FxCop standards, as a lot of those modifications would be gone if I re-generated the classes. I know that FxCop has a project option to suppress analysis on generated code, but it does not seem to recognize the entity and context classes created by LINQ 2 SQL as generated code.

    Read the article

  • SQL Server - Schema/Code Analysis Rules - What would your rules include?

    - by Randy Minder
    We're using Visual Studio Database Edition (DBPro) to manage our schema. This is a great tool that, among the many things it can do, can analyse our schema and T-SQL code based on rules (much like what FxCop does with C# code), and flag certain things as warnings and errors. Some example rules might be that every table must have a primary key, no underscore's in column names, every stored procedure must have comments etc. The number of rules built into DBPro is fairly small, and a bit odd. Fortunately DBPro has an API that allows the developer to create their own. I'm curious as to the types of rules you and your DB team would create (both schema rules and T-SQL rules). Looking at some of your rules might help us decide what we should consider. Thanks - Randy

    Read the article

  • Thoughts on Static Code Analysis Warning CA1806 for TryParse calls

    - by Tim
    I was wondering what people's thoughts were on the CA1806 (DoNotIgnoreMethodResults) Static Code Analysis warning when using FxCop. I have several cases where I use Int32.TryParse to pull in internal configuration information that was saved in a file. I end up with a lot of code that looks like: Int32.TryParse(someString, NumberStyles.Integer, CultureInfo.InvariantCulture, out intResult); MSDN says the default result of intResult is zero if something fails, which is exactly what I want. Unfortunately, this code will trigger CA1806 when performing static code analysis. It seems like a lot of redundant/useless code to fix the errors with something like the following: bool success = Int32.TryParse(someString, NumberStyles.Integer, CultureInfo.InvariantCulture, out intResult); if (!success) { intResult= 0; } Should I suppress this message or bite the bullet and add all this redundant error checking? Or maybe someone has a better idea for handling a case like this? Thanks!
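    If the goal is simply to avoid sprinkling dummy checks everywhere, one option (a sketch with a hypothetical helper name, not an official pattern) is to funnel the parsing through a small wrapper so the TryParse result is consumed exactly once:

        using System.Globalization;

        static class SafeParse
        {
            // Returns the parsed value, or the supplied fallback when parsing fails.
            public static int Int32OrDefault(string text, int fallback)
            {
                int value;
                return int.TryParse(text, NumberStyles.Integer,
                                    CultureInfo.InvariantCulture, out value)
                    ? value
                    : fallback;
            }
        }

        // Call sites then stay short: intResult = SafeParse.Int32OrDefault(someString, 0);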

    Read the article

  • Add title to meta analysis forest plot

    - by Timothy Alston
    I am meta-analysing some studies and drawing a forest plot for my results. However, I can't seem to get the forest plot to display the title. An example of my code is: require(meta) parameter1<-metaprop(sm="PLOGIT", event=c(4,16,3,2,10,1,0,2), n=c(90,402,89,29,153,86,21,48), level = 0.95, studlab=c("study 1", "study 2", "study 3", "study 4", "study 5", "study 6", "study 7", "study 8"), title="meta analysis 1") forest(parameter1) When it produces the forest plot, the title "meta analysis 1" is missing. How can I add this in? Thanks in advance, Timothy
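    One thing worth trying (an assumption rather than a documented feature of the meta package: forest.meta draws with grid graphics, so an overlay may work) is to add the title yourself with grid.text after the plot has been drawn:

        require(grid)
        forest(parameter1)
        # Place the title near the top of the current grid page.
        grid.text("meta analysis 1", x = 0.5, y = 0.97,
                  gp = gpar(fontsize = 14, fontface = "bold"))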

    Read the article
