Search Results

Search found 11362 results on 455 pages for 'big o analysis'.


  • Free SEO Analysis using IIS SEO Toolkit

    In my spare time I've been thinking about new ideas for the SEO Toolkit, and it occurred to me that rather than continuing to dream up more reports and better diagnostics against random fake sites, it could be interesting to ask openly whether anyone wants a free SEO analysis report of their site, and to test drive some of the toolkit against real sites. So what is in it for you? I will analyze your site to look for common SEO errors, and I will create a digest of actions to take and other...

    Read the article

  • Oracle Data Warehouse and Big Data Magazine MAY Edition for Customers + Partners

    - by KLaker
    The latest edition of our monthly data warehouse and big data magazine for Oracle customers and partners is now available. The content for this magazine is taken from the various data warehouse and big data Oracle product management blogs, Oracle press releases, videos posted on Oracle Media Network, and Oracle Facebook pages. Click here to view the May Edition. Please share the link to our magazine, http://flip.it/fKOUS, with your customers and partners. The magazine is optimized for display on tablets and smartphones using the Flipboard app, which is available from the Apple App Store and the Google Play store.

    Read the article

  • best way to go about cost-benefit analysis on hardware

    - by Michael
    I'm looking to build a low-end computational server (my jargon in this field is especially limited, so if someone can state that better, please suggest improved wording). I'm basically running computational fluid dynamics programs, large matrix computations, and bioinformatics code. What would be the best way to approach a cost-benefit analysis of what to put in the system? Perhaps even more generally: how does one approach a cost-benefit analysis of hardware theoretically, doing the analysis before building the machine?
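
    One common way to make the question concrete, sketched below under stated assumptions, is to benchmark each candidate configuration on a kernel that resembles the real workload and rank the options by throughput per unit cost. The component names, GFLOPS figures, and prices below are purely illustrative placeholders, not recommendations.

        // Minimal price/performance ranking sketch (illustrative numbers only).
        #include <algorithm>
        #include <iostream>
        #include <string>
        #include <vector>

        struct Option {
            std::string name;
            double gflops;  // throughput measured on a workload-like benchmark
            double price;   // acquisition cost
        };

        int main() {
            // Hypothetical candidates: replace with measured figures for your
            // own CFD / matrix / bioinformatics kernels.
            std::vector<Option> options = {
                {"CPU A (8-core)",  250.0,  400.0},
                {"CPU B (16-core)", 480.0,  900.0},
                {"GPU C",          2000.0, 1200.0},
            };
            // Rank by GFLOPS per unit cost, highest first.
            std::sort(options.begin(), options.end(),
                      [](const Option& a, const Option& b) {
                          return a.gflops / a.price > b.gflops / b.price;
                      });
            for (const auto& o : options)
                std::cout << o.name << ": " << o.gflops / o.price
                          << " GFLOPS per unit cost\n";
        }

    The same ratio can be extended with operating costs (power, cooling) amortized over the machine's expected lifetime, which often changes the ranking for always-on compute servers.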

    Read the article

  • How Big Data and Social Won the Election

    - by Mike Stiles
    The story of big data's influence on the outcome of the US Presidential election is worth a good look, because a) it's a harbinger of things to come, and b) it's an example of similar successes available to any enterprise seriously resourcing integrated big data, modeling, and data-driven execution on all assets, including social. Obama campaign manager Jim Messina fielded a data and analytics brain trust 5 times larger than in 2008. At that time, there were numerous databases from various sources, few of them talking to each other. This time, the mission was to be metrics-centered and measure everything measurable, in context with all the other data. Big data showed them exactly what they needed to know and told them what to do about it. It showed them women 40-49 on the west coast would donate big money if they got to eat with George Clooney. Women on the east coast would pony up to hang out with Sarah Jessica Parker. Extensive daily modeling showed them what kinds of email appeals, from whom, and to whom, would prove most successful in raising cash, recruiting volunteers, and getting out the vote. Swing state voters were profiled and approached with more customized targeting than at any time in history. Ads were purchased on specific shows watched by the targets, increasing efficiency 14% over traditional media buys. For all the criticism of the candidate's focus on appearing on comedy and entertainment shows, and local radio morning shows, that's where the data sent them to reach the voters most likely to turn out for them. And then there was social. Again, more than in any other election, Facebook was used for virtual, highly efficient door-to-door canvassing. Facebook fans got pictures of friends in swing states and were asked to encourage them to act. Using that approach, 1 in 5 peer-to-peer appeals led to the desired action. Assumptions, gut, intuition, and campaign experience all took a backseat to strategy shifts solidly backed up by data. Zeroing in on demographics likely to back the President and tracking their mood daily literally changed the voter landscape. The Romney team watched Obama voters appear seemingly out of thin air. One Obama campaign aide said, "We ran the election 66,000 times every night." Which brings us to your organization. If you're starting to feel like the battle cry of "but this is the way we've always done it" is putting you in an extremely vulnerable position, you're right. Social has become a key communication tool of the 21st century. Failing to use it, or failing to invest in a deep understanding of who your customers and prospects are so the content you post there will achieve desired actions and results, will leave you waking up one morning wondering, "What happened?" @mikestiles Photo: stock.xchng

    Read the article

  • VMMap - awesome memory analysis tool

    VMMap is a process virtual and physical memory analysis utility. It shows a breakdown of a process's committed virtual memory types as well as the amount of physical memory (working set) assigned by the operating system to those types. Besides graphical representations of memory usage, VMMap also shows summary information and a detailed process memory map. Powerful filtering and refresh capabilities allow you to identify the sources of process memory usage and the memory cost of application features. Besides flexible views for analyzing live processes, VMMap supports the export of data in multiple forms, including a native format that preserves all the information so that you can load it back in later. It also includes command-line options that enable scripting scenarios. VMMap is the ideal tool for developers wanting to understand and optimize their application's memory resource usage.

    Read the article

  • Oracle Social Analytics with the Big Data Appliance

    - by thegreeneman
    Found an awesome demo put together by eDBA, one of the Oracle NoSQL Database partners, on using the Big Data Appliance to do social analytics. In this video, James Anthony shows off the BDA, Hadoop, and the Oracle Big Data Connectors, and how they can be used and integrated with the Oracle Database to do end-to-end sentiment analysis leveraging Twitter data. A really great demo, well worth a view.

    Read the article

  • Do I need to retain SharePoint usage analysis log files?

    - by dunxd
    Our SharePoint installation currently has 30 GB of usage analysis log files - these date back about six months. I have configured SharePoint to do usage analysis processing every night, so I am wondering whether I need to keep these files for so long. SharePoint doesn't seem to clean up these files automatically - I think six months ago I had to clear out logs due to disk space issues. So my question is: do I need to retain these files in order to get decent usage analysis reports, or can I delete them as soon as the usage analysis processing has completed?

    Read the article

  • Google I/O 2012 - Crunching Big Data with BigQuery

    Presented by Jordan Tigani and Ryan Boyd. Google BigQuery is a data analysis tool born from Google's internal technologies. It enables developers to analyze terabyte-scale data sets in seconds using a RESTful API. This session dives into best practices for getting fast answers to business questions, with insight into how queries are processed under the hood and how to construct SQL queries for complex analysis. For all I/O 2012 sessions, go to developers.google.com. (Video, 01:03:04)
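
    Since the session centers on driving BigQuery through its RESTful API, here is a minimal sketch of a synchronous query request using libcurl from C++. The project ID, OAuth token, and sample table are placeholders, and the v2 queries endpoint shape is stated to the best of my knowledge; treat this as an illustration rather than a complete client.

        // Minimal BigQuery REST query sketch (placeholders: PROJECT_ID, token).
        #include <curl/curl.h>
        #include <string>

        int main() {
            curl_global_init(CURL_GLOBAL_DEFAULT);
            CURL* curl = curl_easy_init();
            if (!curl) return 1;

            // v2 synchronous query endpoint; PROJECT_ID is a placeholder.
            const std::string url =
                "https://www.googleapis.com/bigquery/v2/projects/PROJECT_ID/queries";
            // Legacy-SQL query against a public sample dataset.
            const std::string body =
                "{\"query\": \"SELECT word, word_count "
                "FROM [publicdata:samples.shakespeare] LIMIT 10\"}";

            curl_slist* headers = nullptr;
            headers = curl_slist_append(headers, "Content-Type: application/json");
            headers = curl_slist_append(headers, "Authorization: Bearer YOUR_OAUTH_TOKEN");

            curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
            curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());
            curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);

            // libcurl writes the JSON response to stdout by default.
            CURLcode res = curl_easy_perform(curl);

            curl_slist_free_all(headers);
            curl_easy_cleanup(curl);
            curl_global_cleanup();
            return res == CURLE_OK ? 0 : 1;
        }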

    Read the article

  • BigQuery: Simple example of a data collection and analysis pipeline + Your questions

    Join Michael Manoochehri and Ryan Boyd live to talk about Google BigQuery. We'll give an overview of how we're using our cars, phones, App Engine, and BigQuery to collect and analyze data. We'll discuss our trusted tester feature, which allows analyzing data from the App Engine datastore. We'll also review some of the more interesting questions from Stack Overflow and take questions via Google Moderator. (Video, 26:53)

    Read the article

  • SQL Server 2012 available in its final version: AlwaysOn, Big Data, Power View, Microsoft delivers on its promises

    SQL Server 2012 available in its final version. AlwaysOn, Big Data, Power View: Microsoft's information management and analysis platform delivers on its promises. Update of 03/04/2012. As Microsoft had promised, the final version of SQL Server 2012 has been available since April 1st, but it was officially announced only yesterday. Microsoft's information management and analysis platform was designed to be the reference environment for business-critical enterprise applications, to offer a more complete business intelligence solution integrating Big Data, and to allow a better connection with the Cloud. ...

    Read the article

  • Introduction to the SQL Server Analysis Services Neural Network Data Mining Algorithm

    In data mining and machine learning circles, the neural network is one of the most difficult algorithms to explain. Fortunately, SQL Server Analysis Services allows for a simple implementation of the algorithm for data analytics. Dallas Snider explains.

    Read the article

  • 1000 most visited sites on the web: A Google Analysis

    Google has released an analysis of the 1000 most visited sites on the web. Considering that we own/operate 3 of the top 10 sites and have a significant interest in Facebook, plus this recent report stating that Microsoft employees are the most social-media-savvy, we will go to great lengths to show how well we can operate in our cloud and social media integration and collaboration strategies. William Tay 2000-2010 | Swinging Technologist http://www.softwaremaker.net/blog...

    Read the article

  • What is the best way/tool to analyze raw data (network stats) from a simulation?

    - by user90500
    After running a simulation of a network (using the QualNet simulator), I end up with IP stats stored in a database, which I then extract to a CSV file. So now I have 750 MB of raw network stats (timestamp, packet id, source ip, source port, protocol, etc.). What are the common ways of analyzing large amounts of data like this when you want to know things like packet loss, throughput, delay, and congestion?
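
    A common first step, regardless of tooling, is a single streaming pass over the CSV that reduces the per-packet rows to summary statistics, so the 750 MB never has to fit in memory at once. The sketch below assumes a hypothetical netstats.csv with an extra bytes column; the file name and column layout are illustrative, not the actual QualNet export format.

        // Streaming aggregation over a per-packet CSV (assumed layout:
        // timestamp,packet_id,source_ip,source_port,protocol,bytes).
        #include <fstream>
        #include <iostream>
        #include <map>
        #include <sstream>
        #include <string>

        int main() {
            std::ifstream in("netstats.csv");  // hypothetical file name
            std::string line;
            std::getline(in, line);            // skip the header row
            double firstTs = -1.0, lastTs = 0.0;
            long long totalBytes = 0;
            std::map<std::string, long long> perProto;  // packets per protocol
            while (std::getline(in, line)) {
                std::stringstream ss(line);
                std::string ts, id, ip, port, proto, bytes;
                std::getline(ss, ts, ',');    std::getline(ss, id, ',');
                std::getline(ss, ip, ',');    std::getline(ss, port, ',');
                std::getline(ss, proto, ','); std::getline(ss, bytes, ',');
                double t = std::stod(ts);
                if (firstTs < 0) firstTs = t;
                lastTs = t;
                totalBytes += std::stoll(bytes);
                ++perProto[proto];
            }
            double duration = lastTs - firstTs;
            if (duration > 0)
                std::cout << "Throughput: " << totalBytes / duration << " bytes/s\n";
            for (const auto& [proto, count] : perProto)
                std::cout << proto << ": " << count << " packets\n";
        }

    Packet loss needs sent and received records joined on packet_id (packets present in the sender log but missing from the receiver log), and delay is the receive timestamp minus the send timestamp for each matched pair.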

    Read the article

  • Are there any C++ tools that detect misuse of static_cast, dynamic_cast, and reinterpret_cast?

    - by chrisp451
    The answers to the following question describe the recommended usage of static_cast, dynamic_cast, and reinterpret_cast in C++: http://stackoverflow.com/questions/332030/when-should-static-cast-dynamic-cast-and-reinterpret-cast-be-used Do you know of any tools that can be used to detect misuse of these kinds of casts? Would a static analysis tool like PC-lint or Coverity Static Analysis do this? The particular case that prompted this question was the inappropriate use of static_cast to downcast a pointer, which the compiler does not warn about. I'd like to detect this case with a tool rather than assume that developers will never make this mistake.
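
    To make the downcast case concrete, here is a minimal self-contained illustration: the static_cast compiles silently even though the object is not actually a Derived, while dynamic_cast detects the mismatch at runtime by returning nullptr.

        // static_cast downcast compiles without warning even when wrong;
        // dynamic_cast checks the runtime type and yields nullptr on failure.
        #include <iostream>

        struct Base { virtual ~Base() = default; };
        struct Derived : Base { int extra = 42; };
        struct Other : Base {};

        int main() {
            Base* b = new Other();

            // No compiler warning, but b does not point at a Derived;
            // dereferencing d1->extra would be undefined behavior.
            Derived* d1 = static_cast<Derived*>(b);

            // Runtime-checked: yields nullptr because *b is an Other.
            Derived* d2 = dynamic_cast<Derived*>(b);

            std::cout << "static_cast result:  " << d1 << '\n'
                      << "dynamic_cast result: " << d2 << '\n';  // nullptr
            delete b;
        }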

    Read the article

  • VS2010 Code Analysis, any way to automatically fix certain warnings?

    - by JL
    I must say I really like the new code analysis in VS 2010. I have a lot of areas in my code where I am not using CultureInfo.InvariantCulture, and code analysis is warning me about this. I am pretty sure I want to use CultureInfo.InvariantCulture wherever code analysis has detected it is missing on Convert.ToString operations. Is there any way to get VS to automatically fix warnings of this type?

    Read the article

  • Do you recommend Enabling Code Analysis for C/C++ on Build?

    - by brickner
    I'm using Visual Studio 2010, and in my C++/CLI project there are two Code Analysis settings: "Enable Code Analysis on Build" and "Enable Code Analysis for C/C++ on Build". My question is about the second setting. I've enabled it, and it takes a long time to run and doesn't find much. Do you recommend enabling this feature? Why?

    Read the article

  • Scoring/analysis of Subjective testing for skills assessment

    - by ChrisBint
    I am lucky in the sense that I have been given the opportunity to be a 'Technical Troubleshooter' for our offshore development team. While I am confident and capable of dealing with most issues, I have come across something that I am not. Based on initial discussions with various team members both on and offshore, a requirement for a 'repeatable, consistent' skills assessment has been identified. In my opinion, the best way to achieve this would be a combination of objective and subjective tests. The former would normally be an initial online skills assessment on various subjects, for example general C#, WCF, and MVC. The latter would be a technical test where the candidate solves various problems and (hopefully) explains the thought processes involved in the solution while doing so. Obviously, the first method is consistent, repeatable, and extremely accurate. The second is always going to be subjective, based on the approach, the solution (or possibly the lack of one), and other factors. The scoring of this also comes down to the experience and skills of the assessor, and this is where my problem lies: the person expected to be the assessor initially (me) has no experience, and the people who will ultimately continue this process for other candidates will never remain the same due to project constraints and internal reasons, which changes the baseline for comparison. I am not aware of any suitable system that can be classed as consistent and repeatable for subjective tests with the two factors above, let alone without them. So anyway, I have to present a plan that will ultimately generate a skills/gap analysis, and it is unlikely that I will be able to use an objective method (budget constraints being the most likely reason). The only option left is the subjective method, with the issues above. Does anyone have any suggestions for an approach that may tick all the boxes?

    Read the article

  • NRF Big Show 2011 -- Part 3

    - by David Dorf
    I'm back from the NRF show, having been one of the lucky people whose flight was not canceled. The show was very crowded, with a reported 20% increase in attendance, and everyone seemed in high spirits. After two years of sluggish retail sales, things are really picking up, and it was reflected in everyone's mood. The pop-up Disney Store in the Oracle booth was great and attracted lots of interest in their mobile POS. I know many attendees visited the Disney Store in Times Square to see the entire operation. It's an impressive two-story store that keeps kids engaged. The POS demonstration station, where most of our innovations were demoed, was always crowded. Unfortunately, most of the demos used WiFi, and the signals from other booths prevented anything from working reliably. Nevertheless, the demo team did an excellent job walking people through the scenarios and explaining how shopping is being impacted by mobile, analytics, and RFID. Big Show links: Disney uncovers its store magic; Top 10 Things You Missed at the NRF Big Show 2011; Oracle Retail Stores Innovation Station at NRF Big Show 2011 (video). The buzz of the show was again around mobile solutions. Several companies are creating mobile POS using the iPod Touch, including integrations to Oracle POS for the following retailers: Disney Stores with InfoGain, Victoria's Secret with InfoGain, Urban Outfitters with Starmount, and The Gap with Global Bay. Keeping with the mobile theme, the NRF released a revised version of its Mobile Blueprint at the show. It will be posted to the NRF site very soon. The alternate payments section had a major rewrite and now provides a great overview of proximity and remote payment technologies. NRF Mobile Blueprint links: New mobile blueprint provides fresh insights; NRF Mobile Blueprint 2011 (slides). I hope to do some posts on some of the interesting companies I spoke with in the coming weeks.

    Read the article
