Search Results

Search found 3717 results on 149 pages for 'escape analysis'.

Page 8/149 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • Using ASCMD to run command line scripts for SQL Server Analysis Services

    Sometimes it would be helpful to run scripts from a command line for Analysis Services. This would be useful for things like creating backups, processing data or running other tasks. Is there a command line tool like sqlcmd for multidimensional databases and Data Mining?
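    There is: the ascmd utility, which ships with the SQL Server Analysis Services samples, plays roughly the role sqlcmd plays for the relational engine, executing XMLA scripts against an SSAS instance. A minimal sketch, assuming a default local instance; the database name and backup.xmla file are placeholders, and the flags should be verified against the readme that ships with the ascmd sample:

        rem Run an XMLA script (e.g. a backup command) against a local SSAS instance
        ascmd -S localhost -d "Adventure Works DW" -i backup.xmla -o backup-result.xml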

    Read the article

  • Keyword Analysis - What You Need For Your SEO Strategies

    One of the first things you can do toward successful internet marketing is to perform a thorough keyword analysis. If you want your website to help your business grow, then the first thing you should do is implement a comprehensive internet marketing strategy.

    Read the article

  • xVelocity engines compared: VertiPaq vs ColumnStore #ssas #vertipaq #xvelocity #sql #tabular

    - by Marco Russo (SQLBI)
    Over the last few months, Alberto and I worked on several projects using Analysis Services Tabular, where we had to face real-world issues such as complex queries, large data volumes, frequent data updates and so on. Sometimes we faced the challenge of comparing Tabular performance with SQL Server. It seemed nonsensical at first: even though the same core xVelocity technology is implemented in both products (SQL Server 2012 uses ColumnStore indexes, whereas Analysis Services 2012 uses VertiPaq), we initially assumed that the better-optimized in-memory engine used by Analysis Services would always beat SQL Server. However, we discovered several important things:
    - Processing time might differ, and having data on SQL Server can make ColumnStore much faster for processing.
    - Partitioning in SQL Server might be much more effective for query performance than in Analysis Services.
    - A single query can easily scale across multiple processors on SQL Server, whereas in Analysis Services the formula engine is single-threaded and can be a bottleneck for certain queries.
    - Under a large workload with many concurrent users, the storage engine cache in Analysis Services can be a big advantage over SQL Server, especially for scalability.
    As you can see, these considerations are not always obvious, and you might be tempted to make other assumptions based on this information. Well, don't do that. Before anything else, read the whitepaper VertiPaq vs ColumnStore Comparison written by Alberto Ferrari. Then measure your workload. Finally, draw your conclusions. But don't make too many assumptions. You might be wrong, as we were at the beginning of this journey.

    Read the article

  • How do I escape spaces in command line in Windows without using quotation marks?

    - by David
    For example, what is the alternative to this command without quotation marks: CD "c:\Documents and Settings"? The full reason I don't want to use quotation marks is that this command DOES work: SVN add mypathname\*.*, but this command DOES NOT work: SVN add "mypathname\*.*". The problem is that when I change mypathname to a path with spaces in it, I need to quote the whole thing, for example: SVN add "c:\Documents and Settings\username\svn\*.*". But when I try this I get the following error message: svn: warning: 'c:\Documents and Settings\username\svn\*.*' not found
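    One commonly suggested workaround is to use the short (8.3) name of each directory, which never contains spaces; dir /x lists them. A sketch, assuming 8.3 name generation is enabled on the volume (DOCUME~1 is the typical short name for "Documents and Settings", but check the dir /x output on your system):

        rem List short names for directories on C:\
        dir /x c:\
        rem Use the short name instead of the quoted long name
        CD c:\DOCUME~1
        SVN add c:\DOCUME~1\username\svn\*.*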

    Read the article

  • Cannot escape character '@' in fstab file

    - by ubuntico
    I want to attach a remote FTP directory as a local directory on my system. I use a curlftpfs line in the fstab file, but to log in I have to pass my user name and password. The user name contains a special character (@), which needs to be escaped via its octal ASCII code. The octal ASCII code for '@' is 100. But when I enter the following into the fstab file: curlftpfs#myself\100myself.com:ftpPassword@ftp://ftp.mysite.com /mnt/somedir I get an error saying Error connecting to ftp: Couldn't resolve host 'myself.com:ftpPassword@ftp://ftp.mysite.com' fstab does not recognize the escaped symbol (\100) and thinks the FTP site starts immediately after the @ symbol, just as if I hadn't escaped it. Can someone help? Why can't I escape @, when it can be done for the space character, for example?
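    As far as I know, the octal escapes recognized in fstab cover only whitespace and backslash (\040, \011, \012, \134), so \100 is never translated; the first field is handed to curlftpfs largely as-is, and the @ has to be escaped in URL terms instead. Two sketches, both assumptions to verify against the curlftpfs man page: URL-encode the @ as %40, or move the credentials out of the URL into a mount option, where @ needs no escaping:

        # Sketch 1: URL-encode '@' in the user name as %40
        curlftpfs#myself%40myself.com:ftpPassword@ftp.mysite.com /mnt/somedir fuse rw,allow_other 0 0
        # Sketch 2: pass credentials via the user= mount option instead of the URL
        curlftpfs#ftp.mysite.com /mnt/somedir fuse user=myself@myself.com:ftpPassword,rw,allow_other 0 0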

    Read the article

  • snort analysis of wireshark capture

    - by Ben Voigt
    I'm trying to identify trouble users on our network. ntop identifies high traffic and high connection users, but malware doesn't always need high bandwidth to really mess things up. So I am trying to do offline analysis with snort (don't want to burden the router with inline analysis of 20 Mbps traffic). Apparently snort provides a -r option for this purpose, but I can't get the analysis to run. The analysis system is gentoo, amd64, in case that makes any difference. I've already used oinkmaster to download the latest IDS signatures. But when I try to run snort, I keep getting the following error:

        % snort -V
           ,,_     -*> Snort! <*-
          o"  )~   Version 2.9.0.3 IPv6 GRE (Build 98) x86_64-linux
           ''''    By Martin Roesch & The Snort Team: http://www.snort.org/snort/snort-team
                   Copyright (C) 1998-2010 Sourcefire, Inc., et al.
                   Using libpcap version 1.1.1
                   Using PCRE version: 8.11 2010-12-10
                   Using ZLIB version: 1.2.5

        %> snort -v -r jan21-for-snort.cap -c /etc/snort/snort.conf -l ~/snortlog/
        (snip)
        273 out of 1024 flowbits in use.
        [ Port Based Pattern Matching Memory ]
        +- [ Aho-Corasick Summary ] -------------------------------------
        | Storage Format    : Full-Q
        | Finite Automaton  : DFA
        | Alphabet Size     : 256 Chars
        | Sizeof State      : Variable (1,2,4 bytes)
        | Instances         : 314
        |     1 byte states : 304
        |     2 byte states : 10
        |     4 byte states : 0
        | Characters        : 69371
        | States            : 58631
        | Transitions       : 3471623
        | State Density     : 23.1%
        | Patterns          : 3020
        | Match States      : 2934
        | Memory (MB)       : 29.66
        |     Patterns      : 0.36
        |     Match Lists   : 0.77
        |     DFA
        |     1 byte states : 1.37
        |     2 byte states : 26.59
        |     4 byte states : 0.00
        +----------------------------------------------------------------
        [ Number of patterns truncated to 20 bytes: 563 ]
        ERROR: Can't find pcap DAQ!
        Fatal Error, Quitting..

    net-libs/daq is installed, but I don't even want to capture traffic, I just want to process the capture file. What configuration options should I be setting/unsetting in order to do offline analysis instead of real-time capture?
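    "Can't find pcap DAQ" usually means snort cannot locate the directory holding the DAQ modules, not that anything is wrong with offline mode; -r already implies reading from the file. A sketch of the likely fix, assuming the Gentoo daq package installs its modules under /usr/lib64/daq (confirm the path with equery files net-libs/daq):

        # Point snort at the DAQ modules and read the capture file offline
        snort --daq-dir /usr/lib64/daq --daq pcap --daq-mode read-file \
              -r jan21-for-snort.cap -c /etc/snort/snort.conf -l ~/snortlog/

    The same settings can be made permanent in snort.conf with the config daq_dir, config daq, and config daq_mode directives.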

    Read the article

  • Attend free workshop on 3/16: Architecture Analysis Patterns

    On Tuesday, 3/16/2010, Headspring is offering another free monthly workshop. This month, I am leading the workshop, and the topic is: Architecture Analysis Patterns: How to reason about the structure of an application. Layering is a fundamental concept of software architecture: it helps to separate dependencies and to decouple concerns, yet most of the industry does layering in name only. It's lip service. In 23 slides and accompanying commentary, we will explore the fundamental concept of separating...

    Read the article

  • SQL Server Analysis Services 2005 crash when disk is full?

    - by squillman
    One of our SQL boxes ran itself out of disk space last night. This particular server has both the database engine and Analysis Services on it. The database engine was not happy about having no disk space on the volume where all the data files are, but Analysis Services just plain died. At least, the only thing I have to blame is the full volume. Has anyone experienced an SSAS crash that they've been able to tie directly to a full disk? I've got nothing else in the SQL or event logs to blame...

    Read the article

  • Free SEO Analysis using IIS SEO Toolkit

    In my spare time I've been thinking about new ideas for the SEO Toolkit, and it occurred to me that rather than continuing to figure out more reports and better diagnostics against some random fake sites, it could be interesting to openly ask anyone who wants a free SEO analysis report of their site to test-drive some of it against real sites. So what is in it for you? I will analyze your site to look for common SEO errors, I will create a digest of actions to do and other...

    Read the article

  • best way to go about cost-benefit analysis on hardware

    - by Michael
    I'm looking to build a low-end computational server (my jargon in this field is limited, so if someone can state that better, please do). I'm basically running computational fluid dynamics programs, large matrix computations and bioinformatics code. What would be the best way to approach a cost/benefit analysis of what to put in the system? Perhaps even more generally: how does one approach cost/benefit analysis on hardware theoretically, doing the analysis before building the machine?
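    One minimal, hedged sketch of the usual approach: pick benchmarks that resemble your workloads (LINPACK-style figures for the matrix work, a representative CFD case, and so on), measure or estimate each candidate build's throughput, then rank builds by price per unit of throughput. The builds.csv file, its columns, and its numbers below are all hypothetical:

        # builds.csv (hypothetical): name,price_usd,gflops
        # Rank candidate builds by dollars per GFLOPS, best value first
        awk -F, 'NR>1 { printf "%s\t%.2f $/GFLOPS\n", $1, $2/$3 }' builds.csv | sort -k2 -n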

    Read the article

  • VMMap - awesome memory analysis tool

    VMMap is a process virtual and physical memory analysis utility. It shows a breakdown of a process's committed virtual memory types as well as the amount of physical memory (working set) assigned by the operating system to those types. Besides graphical representations of memory usage, VMMap also shows summary information and a detailed process memory map. Powerful filtering and refresh capabilities allow you to identify the sources of process memory usage and the memory cost of application features. Besides flexible views for analyzing live processes, VMMap supports the export of data in multiple forms, including a native format that preserves all the information so that you can load it back in later. It also includes command-line options that enable scripting scenarios. VMMap is the ideal tool for developers wanting to understand and optimize their application's memory resource usage.

    Read the article

  • Do I need to retain Sharepoint usage analysis log files

    - by dunxd
    Our SharePoint installation currently has 30 GB of Usage Analysis log files, dating back about six months. I have configured SharePoint to do Usage Analysis processing every night, so I am wondering whether I need to keep these files for so long. SharePoint doesn't seem to clean up these files automatically; I think six months ago I had to clear out logs due to disk space issues. So my question is: do I need to retain these files in order to get decent usage analysis reports, or can I delete them as soon as the usage analysis processing has completed?

    Read the article

  • BigQuery: Simple example of a data collection and analysis pipeline + Your questions

    Join Michael Manoochehri and Ryan Boyd live to talk about Google BigQuery. We'll give an overview of how we're using our cars, phones, App Engine and BigQuery to collect and analyze data. We'll be discussing our trusted tester feature, which allows analyzing data from the App Engine datastore. We'll also review some of the more interesting questions from Stack Overflow and take questions via Google Moderator. From: GoogleDevelopers | Views: 250 | 16 ratings | Time: 26:53

    Read the article

  • Introduction to the SQL Server Analysis Services Neural Network Data Mining Algorithm

    In data mining and machine learning circles, the neural network is one of the most difficult algorithms to explain. Fortunately, SQL Server Analysis Services allows for a simple implementation of the algorithm for data analytics, as Dallas Snider explains.

    Read the article

  • 1000 most visited sites on the web: A Google Analysis

    Google has released an analysis of the 1000 most visited sites on the web. Considering that we own/operate 3 of the top 10 sites and have a significant interest in Facebook, plus a recent report stating that Microsoft employees are the most social-media-savvy, we will go to great lengths to show how well we can operate in our cloud and social media integration and collaboration strategies. William Tay 2000-2010 | Swinging Technologist http://www.softwaremaker.net/blog

    Read the article

  • What is the best way/tool to analyze raw data(network stats) from Simulation?

    - by user90500
    After running a simulation of a network in the QualNet simulator, I end up with IP stats stored in a database, which I then extract to a CSV file. So now I have 750 MB of raw network stats (timestamp, packet id, source IP, source port, protocol, etc.). What are the common ways of analyzing large amounts of data like this, if you want to know things like packet loss, throughput, delay, congestion, etc.?
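    Before reaching for a heavier toolchain, classic Unix tools can answer many of these questions directly from the CSV. A sketch for per-second throughput, assuming hypothetical column positions (timestamp in seconds in column 1, packet size in bytes in column 6):

        # Sum bytes per one-second bucket and print throughput in Mbps
        awk -F, '{ bytes[int($1)] += $6 }
                 END { for (t in bytes) printf "%d\t%.3f Mbps\n", t, bytes[t]*8/1e6 }' stats.csv | sort -n

    Packet loss can be estimated the same way, by diffing the sets of packet ids seen at the sender and at the receiver.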

    Read the article

  • Are there any C++ tools that detect misuse of static_cast, dynamic_cast, and reinterpret_cast?

    - by chrisp451
    The answers to the following question describe the recommended usage of static_cast, dynamic_cast, and reinterpret_cast in C++: http://stackoverflow.com/questions/332030/when-should-static-cast-dynamic-cast-and-reinterpret-cast-be-used Do you know of any tools that can be used to detect misuse of these kinds of casts? Would a static analysis tool like PC-Lint or Coverity Static Analysis do this? The particular case that prompted this question was the inappropriate use of static_cast to downcast a pointer, which the compiler does not warn about. I'd like to detect this case using a tool, rather than assume that developers will never make this mistake.

    Read the article

  • VS2010 Code Analysis, any way to automatically fix certain warnings?

    - by JL
    I must say I really like the new code analysis in VS 2010. There are a lot of areas in my code where I am not using CultureInfo.InvariantCulture, and code analysis is warning me about this. I am pretty sure I want to use CultureInfo.InvariantCulture wherever code analysis has detected it is missing on Convert.ToString operations. Is there any way to get VS to automatically fix warnings of this type?

    Read the article

  • Do you recommend Enabling Code Analysis for C/C++ on Build?

    - by brickner
    I'm using Visual Studio 2010, and in my C++/CLI project there are two Code Analysis settings:
    - Enable Code Analysis on Build
    - Enable Code Analysis for C/C++ on Build
    My question is about the second setting. I've enabled it, and it takes a long time to run and doesn't find much. Do you recommend enabling this feature? Why?

    Read the article

  • Scoring/analysis of Subjective testing for skills assessment

    - by ChrisBint
    I am lucky in the sense that I have been given the opportunity to be a 'Technical Troubleshooter' for our offshore development team. While I am confident and capable of dealing with most issues, I have come across something that I am not. Based on initial discussions with various team members both on and offshore, a requirement for a repeatable, consistent skills assessment has been identified. In my opinion, the best way to achieve this would be a combination of objective and subjective tests: the former normally an initial online skills assessment on various subjects, for example general C#, WCF and MVC; the latter a technical test where the candidate must solve various problems and (hopefully) explain the thought processes behind the solution while doing so. Obviously, the first method is consistent, repeatable and extremely accurate. The second is always going to be subjective, depending on the approach, the solution (or possibly the lack of one) and other factors. The scoring is also going to come down to the experience and skills of the assessor, and this is where my problem lies: the person expected to be the assessor initially (me) has no experience, and the people who will continue this process for others will never remain the same, due to project constraints and internal reasons, which changes the baseline for comparison. I am not aware of any suitable system that can be called consistent and repeatable for subjective tests under the two factors above, let alone without them. So anyway, I have to present a plan that will ultimately generate a skills/gap analysis, and it is unlikely that I will be able to use an objective method (budget constraints being the most likely reason). The only option left is the subjective method, with the issues above. Does anyone have any suggestions for an approach that may tick all the boxes?

    Read the article

  • WolframAlpha Can Now Do In-depth Analysis of Your Facebook Account

    - by Jason Fitzpatrick
    If you're a big fan of WolframAlpha's ability to crunch the numbers on just about anything (and we certainly are), you'll likely be just as delighted as we were to watch it massage the data from your Facebook account. Find out your most liked, discussed, and shared posts, see your Facebook habits, and other neat trends. I unleashed it on my account this morning, not sure what to expect from the results. Within the results tabulation WolframAlpha provided me with all sorts of neat data breakdowns. I now know exactly how many days it is to my next birthday, the composition of my aggregate posting habits (how many posts are status updates, links, or photos), the time of day when I do the most posting (and what the composition of those posts is), and my average post length. I also know my most liked post and my most commented-on post. It will even crunch the numbers on your network of friends (60.6% of my friends are married, for example). By far one of the more interesting analyses it does on the friendship data, however, is organizing all your friends into relationship clusters, so you can see who in your Facebook network is friends with other people in your Facebook network. The service from WolframAlpha is free: simply visit the WolframAlpha search portal and type in "Facebook report" to start the process. You'll be prompted to create a WolframAlpha account if you don't have one and to authorize the WolframAlpha Facebook app to access your data. Your Facebook data is cached to your WolframAlpha account for one hour in order to crunch the numbers and display the results.

    Read the article

< Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >