Search Results

Search found 5666 results on 227 pages for 'cost analysis'.

Page 13/227

  • PASS Summit Location follow up - result analysis

    - by simonsabin
    I've had a chance to look at the results directly, and it is clear that there is a tough choice. On the one hand, people are saying that they prefer to have PASS put their money into chapters and things like 24 Hours of PASS rather than an event on the East Coast. At the same time, almost 50% more people said they would be more likely to attend an East Coast event than a Seattle event, and 60% more said they would be more likely to attend a US Central region event. What's more, 60% said that the summit should be outside of Seattle every other year, with only 19% saying it should always stay in Seattle. So clearly there is a huge desire for a non-Seattle event. Looking at the other reasons for keeping it in Seattle, the big one is that people want Microsoft speakers. More people think it is somewhat important or very important that the conference is within walking distance of the hotels and restaurants. Essentially, the Q6 questions show an even balance for a normal conference, highlighting that attendees are prepared to travel, not with the family, and want a well laid out conference. What's very annoying is that the questions, as people have commented, were biased towards certain answers. For instance, there was no option about whether people feel it is important to have industry-leading speakers, MVPs, etc. at the conference; only questions about Microsoft speakers. I know it is very difficult to write a survey without biasing the answers one way or another. There was also no choice to show people's preference: would people prefer Microsoft speakers, or the summit to be held on the East Coast/Central US? I also find it amazing that people prefer hundreds of developers rather than the SQLCAT and CSS teams; surely that indicates another issue, a lack of understanding of what these teams do. All in all, it is clear that people showed they want an event outside of Seattle, but don't want PASS to put money into that instead of other community activities. I find it surprising that certain questions appear to have been weighted heavily and prioritised over the huge desire for a PASS Summit outside of Seattle. Let's see where we will be in 2013, or maybe they will rethink 2012; who knows.

    Read the article

  • Lower your Applications Infrastructure Cost with Oracle Database 11g

    - by john.brust
    If you missed our live Oracle Database 11g Release 2 webcast last Friday, the replay is available. So, join us for the free on-demand Webcast in which Mark Townsend, Vice President of Oracle Database Product Management, discusses how running your Oracle applications (Oracle eBusiness Suite, Oracle's PeopleSoft, and Oracle's Siebel) on Oracle Database 11g can improve performance and scalability, eliminate downtime, and reduce IT infrastructure costs. In the Q&A segment, Mark answers questions about compression, virtual machines, Oracle Active Data Guard, online application upgrades, and much more. Note: turn off pop-up blockers if the slides do not advance automatically.

    Read the article

  • Software Usability analysis

    - by Afnan
    I am unable to find the answers to the following questions. Please help me resolve them: (a) Name quantitative and qualitative techniques for analysing the usability of a software product. (b) Compare the costs and benefits of the quantitative techniques. (c) Compare the costs and benefits of the qualitative techniques. (d) If restricted to a single one of these techniques when designing a new online banking system, which would you choose and why?

    Read the article

  • Master Data Management – A Foundation for Big Data Analysis

    - by Manouj Tahiliani
    While Master Data Management has crossed the proverbial chasm and is on its way to becoming mainstream, businesses are being hammered by a new megatrend called Big Data. Big Data is characterized by massive volumes, high frequency, a variety of less structured data sources such as email, sensors, smart meters, social networks, and weblogs, and the need to analyze vast amounts of data to extract value and improve management decisions. Businesses that have embraced MDM to get a single, enriched and unified view of master data (by resolving semantic discrepancies and augmenting the explicit master data from within the enterprise with implicit data from outside the enterprise, such as social profiles) will have a leg up in embracing Big Data solutions. This is especially true for large and medium-sized businesses in industries like Retail, Communications, and Financial Services, which would find it very challenging to get comprehensive analytical coverage and long-term success without resolving the limitations of a heterogeneous topology that leads to disparate, fragmented and incomplete master data. For analytical success from Big Data, in other words ROI from Big Data investments, businesses need to acquire, organize and analyze the deluge of data to make better decisions. Structured and unstructured data will need to coexist, with a tight link maintained between the two to extract maximum insight. MDM is the catalyst that maintains that tight linkage by providing an understanding of the identity and characteristics of the persons, companies, products, suppliers, etc. associated with the Big Data, and thereby helps accelerate ROI. In my next post I will discuss patterns for coexisting Big Data solutions and MDM. Feel free to provide comments and thoughts on the above, as well as on integration or architectural patterns.

    Read the article

  • Is PluralSight worth the cost [closed]

    - by B Woods
    I am looking at a few different supplemental learning courses to help me beef up my skills, but I am unsure which one to use. I am interested in PluralSight because I work for the government and they use all Microsoft technologies for the most part. I like the courses on PluralSight, but the price tag is pretty hefty. I am not sure how good the courses are because I cannot view the ones I want, so I am a tad bit skeptical. I am sure some of you have PluralSight; do you think it's a good investment for your learning, or should I just stick with books and the internet? Any suggestions?

    Read the article

  • Free tools for SQL Server - Automating Execution Plan Analysis

    - by jchang
    Since this topic is being discussed, I will plug my own tools, SQL Exec Stats, and some (a little dated) documentation. The main capability is cross-referencing index usage with specific execution plans. Another feature is generating execution plans for all stored procedures in a database, along with the index usage cross-reference. There are several sources of execution plans or plan handles: a live trace, a previously saved trace, previously saved sqlplan files, dm_exec_cached_plans,...(read more) [A minimal Python sketch of pulling plans from the cache follows below.]

    Read the article
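
    As a rough companion to the entry above, here is a minimal Python sketch (not part of SQL Exec Stats) of one of the plan sources it mentions: pulling cached execution plans from sys.dm_exec_cached_plans and saving them as .sqlplan files for offline analysis. The connection string, row limit, and output directory are illustrative assumptions.

    ```python
    # Minimal sketch: dump cached execution plans to .sqlplan files for offline analysis.
    # Assumes pyodbc and a SQL Server instance reachable via the connection string below.
    import os
    import pyodbc

    CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
                "SERVER=localhost;DATABASE=master;Trusted_Connection=yes")  # assumption

    QUERY = """
    SELECT TOP (50) qp.query_plan
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
    WHERE qp.query_plan IS NOT NULL;
    """

    def dump_cached_plans(out_dir="plans"):
        os.makedirs(out_dir, exist_ok=True)
        with pyodbc.connect(CONN_STR) as conn:
            cursor = conn.cursor()
            for i, (query_plan,) in enumerate(cursor.execute(QUERY)):
                # query_plan is the XML showplan; the .sqlplan extension lets
                # SSMS open it as a graphical plan.
                path = os.path.join(out_dir, f"plan_{i:03d}.sqlplan")
                with open(path, "w", encoding="utf-8") as f:
                    f.write(query_plan)

    if __name__ == "__main__":
        dump_cached_plans()
    ```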

  • Evidence for automatic browsing - Log file analysis

    - by Nilani Algiriyage
    I'm analyzing web server logs in both Apache and IIS log formats. I want to find evidence of automatic browsing, like web robots, spiders, bots, etc. I used the Python robot-detection 0.2.8 package for detecting robots in my log files, but I know there may be other robots (automated programs) that have traversed the web site which robot-detection cannot identify. So I want to ask: Are there any specific clues that can be found in log files that human users do not leave but automated software would? Do they follow a specific navigation pattern? I saw some requests for favicon.ico; does this indicate automatic browsing? I found this article and this question with some valuable points. [A minimal heuristic sketch follows below.]

    Read the article
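
    To complement the question above, here is a minimal, hedged Python sketch of the kind of heuristics usually suggested: flagging clients that fetch robots.txt, carry bot-like user agents, or request many pages without ever loading static assets. The log regex (Apache combined format), keyword list, and thresholds are illustrative assumptions, not a definitive detector.

    ```python
    # Minimal heuristic sketch for spotting likely bots in an Apache combined-format log.
    # The regex, keywords, and thresholds are illustrative assumptions.
    import re
    from collections import defaultdict

    LINE_RE = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
        r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
    )
    BOT_AGENT_KEYWORDS = ("bot", "spider", "crawl", "slurp", "wget", "curl", "python-requests")

    def likely_bots(log_path):
        stats = defaultdict(lambda: {"hits": 0, "robots_txt": 0, "assets": 0, "bot_agent": 0})
        with open(log_path, encoding="utf-8", errors="replace") as f:
            for line in f:
                m = LINE_RE.match(line)
                if not m:
                    continue
                s = stats[m.group("ip")]
                s["hits"] += 1
                path, agent = m.group("path").lower(), m.group("agent").lower()
                if path.endswith("/robots.txt"):
                    s["robots_txt"] += 1
                if path.endswith((".css", ".js", ".ico", ".png", ".jpg", ".gif")):
                    s["assets"] += 1
                if any(k in agent for k in BOT_AGENT_KEYWORDS):
                    s["bot_agent"] += 1
        # Heuristic: bot-like UA, or fetched robots.txt, or many page hits with no static assets.
        return {ip: s for ip, s in stats.items()
                if s["bot_agent"] or s["robots_txt"] or (s["hits"] >= 20 and s["assets"] == 0)}

    if __name__ == "__main__":
        for ip, s in likely_bots("access.log").items():
            print(ip, s)
    ```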

  • Create SQL Server Analysis Services Partitions using AMO

    When you have SSAS cubes with millions of rows of data, it is very helpful to create partitions. If you have a few cubes you could probably do this manually, but if there are many or if you want to automate this process you should look for smarter solutions such as programming the creation of partitions dynamically.

    Read the article

  • Download NDepend Analysis Tool

    - by Editor
    NDepend is a tool that simplifies managing a complex .NET code base. Architects and developers can analyze code structure, specify design rules, plan massive refactoring, do effective code reviews and master evolution by comparing different versions of the code. The result is better communication, improved quality, easier maintenance and faster development. NDepend supports the Code Query Language [...]

    Read the article

  • Web Design in 2010 - 2011: Analysis

    As we're coming to the middle of this year, everyone is trying to analyze the recent trends in web design and web development. However, in this article, we'll see what web designers and developers... [Author: Maryam Naqvi - Web Design and Development - June 09, 2010]

    Read the article

  • Twitter Customer Sentiment Analysis

    - by Liam McLennan
    The breakable toy that I am currently working on is a Twitter customer sentiment analyser. It scrapes Twitter for tweets relating to a particular organisation, applies a machine learning algorithm to determine whether the content of a tweet is positive or negative, and generates reports of the sentiment data over time, correlated to dates, events and news feeds. I'm having lots of fun building this, but I would also like to learn whether there is a market for quantified sentiment data. So that I can start to show people what I have in mind, I have created a mockup of the simplest and most important report. It shows customer sentiment over time, with important events highlighted. As the user moves their mouse to the right (forward in time), the source data area scrolls up to display the tweets from that time. The tweets are colour coded based on sentiment rating. After I started working on this project I discovered that a team of students has already built something similar. It is a lot of fun to enter your employer's name and see what it says. [A minimal scoring sketch follows below.]

    Read the article
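
    The post above describes the pipeline (collect tweets, classify each one, report over time) rather than the author's actual model, so the following is only a minimal keyword-based scoring sketch of the classification step, with made-up word lists and sample tweets; a real system would use a trained classifier.

    ```python
    # Minimal sketch of the "classify each tweet as positive or negative" step.
    # The word lists and sample tweets are illustrative assumptions, not a trained model.
    POSITIVE = {"love", "great", "awesome", "fast", "helpful", "thanks"}
    NEGATIVE = {"hate", "slow", "broken", "terrible", "refund", "worst"}

    def sentiment(tweet: str) -> int:
        """Return +1 (positive), -1 (negative) or 0 (neutral) for one tweet."""
        words = {w.strip(".,!?#@").lower() for w in tweet.split()}
        score = len(words & POSITIVE) - len(words & NEGATIVE)
        return (score > 0) - (score < 0)

    if __name__ == "__main__":
        tweets = [
            "I love how fast AcmeCorp support replied, thanks!",      # hypothetical data
            "AcmeCorp billing is broken again, worst experience.",
        ]
        for t in tweets:
            print(sentiment(t), t)
    ```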

  • Deep-protocol analysis of UNIX networks

    developerWorks: "Whether you are monitoring your network to identify performance issues, debugging an application, or have found an application on your network that you do not recognize, occasionally you need to look deep into the protocols being used on your UNIX network to understand what they are doing."

    Read the article

  • How to get audio spectrum analysis?

    - by Mrwolfy
    I need to find or create a tool that analyzes the audio spectrum of a sound file (like a .wav or .mp3). I need to output the "volume" or power of x number of frequency bands as text. This will be used to produce a visualization, a graphic equalizer like you'd see on a stereo. I am currently looking at Python to do it. My question is: are there some tools out there that would do this (signal processing), like MathWorks products or others? I don't have any experience with them, so any advice would be appreciated. [A minimal Python sketch follows below.]

    Read the article
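
    Since the question mentions Python, here is a minimal sketch using NumPy and SciPy that computes the power in a handful of frequency bands of a .wav file and prints them as text (an .mp3 would need decoding to WAV first). The band edges and input file name are assumptions.

    ```python
    # Minimal sketch: print the power in a few frequency bands of a WAV file as text.
    # Band edges and input file are illustrative assumptions.
    import numpy as np
    from scipy.io import wavfile

    BANDS_HZ = [(20, 250), (250, 1000), (1000, 4000), (4000, 16000)]  # assumption

    def band_powers(path):
        rate, data = wavfile.read(path)
        if data.ndim > 1:                      # mix stereo down to mono
            data = data.mean(axis=1)
        data = data.astype(np.float64)
        spectrum = np.abs(np.fft.rfft(data)) ** 2          # power spectrum
        freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)   # frequency of each bin
        return [(lo, hi, float(spectrum[(freqs >= lo) & (freqs < hi)].sum()))
                for lo, hi in BANDS_HZ]

    if __name__ == "__main__":
        for lo, hi, power in band_powers("input.wav"):
            print(f"{lo:>5}-{hi:<5} Hz  power={power:.3e}")
    ```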

  • The Cost of Cheap Website Content

    There can be no doubt that SEO/SEM experts are now firmly of a mind that website content is truly king when it comes to generating organic traffic. However, some webmasters prefer to pay very little for their site content, regardless of the fact that the quality will be lower.

    Read the article

  • SEO Secrets - Analysis of the Competition

    Probably the most common mistake internet marketers make is not to investigate the strength of the competition. In their eagerness to get their business rolling, they forget to objectively assess their chances of success against their competition. Heed the following warning. If the competition is too strong--because they have gained a strong dominance over your niche--your best option will actually be to find another one which is not "ruled over" by competitors.

    Read the article

  • Why do programming language (open) standards cost money?

    - by fish
    Isn't it counter-productive to ask for 384 Swiss francs for C11 or 352 Swiss francs for C++11, if the aim is to make the standards widely adopted? Please note, I'm not ranting at all, and I'm not against paying; I would like to understand the rationale behind setting the prices as such, especially knowing that ISO is a network of national standards institutes (i.e. funded by governments). And I also doubt that these prices generate enough income to fund an organization like that, so there must be another reason.

    Read the article

  • Open Source Analysis

    - by BluFire
    There is a lot of code in open source projects; looking at all of it is time-consuming and can be confusing to a novice like me. Are there any sections of open-source projects that should be focused on? What should I focus on when I look at the code? I'm asking this in general because if I ask it specifically, the question will only apply to one or two projects rather than an entire group of projects ranging across different types of games and difficulty levels.

    Read the article

  • Maintaining SQL Server default trace historical events for analysis and reporting

    I often see questions online where someone wants to find out who started a trace, when tempdb last had an autogrow event, or when the last full backup for master occurred. These and other events are captured by the default trace, but the default trace only keeps five 20MB rollover files by default. This means that the event you are after may no longer be there, depending on how long ago it was and how busy your server happens to be. Unfortunately, people often need to find this information well after the fact. [A minimal archiving sketch follows below.]

    Read the article
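
    As a hedged illustration of the workaround the excerpt points toward, here is a minimal Python sketch that copies default trace events into a CSV archive via sys.fn_trace_gettable, so they survive the five-file rollover if run on a schedule. The connection string, selected columns, and output path are assumptions, and the path-trimming trick for reading all rollover files is a common community pattern rather than anything from the article.

    ```python
    # Minimal sketch: archive SQL Server default trace events to CSV before they roll over.
    # Connection string, columns, and output path are illustrative assumptions.
    import csv
    import pyodbc

    CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
                "SERVER=localhost;Trusted_Connection=yes")  # assumption

    # Strip the rollover suffix (log_NNN.trc -> log.trc) so fn_trace_gettable can read
    # the whole set of rollover files; CHAR(92) is a backslash.
    PATH_QUERY = """
    SELECT REVERSE(SUBSTRING(REVERSE(path), CHARINDEX(CHAR(92), REVERSE(path)), 260)) + N'log.trc'
    FROM sys.traces WHERE is_default = 1;
    """

    EVENTS_QUERY = """
    SELECT EventClass, StartTime, DatabaseName, LoginName, TextData
    FROM sys.fn_trace_gettable(?, DEFAULT);
    """

    def archive_default_trace(out_path="default_trace_archive.csv"):
        with pyodbc.connect(CONN_STR) as conn, open(out_path, "a", newline="") as f:
            cursor = conn.cursor()
            base_path = cursor.execute(PATH_QUERY).fetchval()
            writer = csv.writer(f)
            for row in cursor.execute(EVENTS_QUERY, base_path):
                writer.writerow(row)

    if __name__ == "__main__":
        archive_default_trace()
    ```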

  • High-Powered Sites for low Cost

    - by HighAltitudeCoder
    Ahh, I am experiencing the intimidation of my very first post, visible to the whole world. Ok, here goes. This first post is nothing exceptional. It is simply a recommendation based (fittingly, I suppose) upon the job search you may be gearing up for. I find myself in this very situation right now, and I will take my own recommendation after posting this entry. Job-seekers: to the left you will notice two links under "Recommended Learning". I have found these links to be invaluable when it comes to re-tooling, re-familiarizing, or otherwise resharpening my skills when looking for that next job. Often you will find job postings with text, usually placed after a laborious list of qualifications indicating the company's desire to hire candidates who know what they are doing, like: "...Looking for a candidate who can hit the ground running...". The interesting thing to me is that I've encountered many individuals who, after speaking and working with them for some time, I've realized are perfectly capable of hitting the ground running, and FAST. But what if they speed off in the wrong direction? The next time you spearhead a major task in your job, ask yourself: am I headed in the wrong direction? There are many ways to check this. In fact, I've found in this new field there are more tempting ways to steer your project in the wrong direction than there are good ones. I don't want to suggest that every one of my posts will fall into the "right direction" category; however, I do think a healthy dose of introspection into the pros and cons will always be beneficial before you set off. That said, allow me to expound on the previously mentioned links. These web sites are invaluable. They demonstrate the capabilities of existing as well as new and upcoming tools available in several IDEs. I've viewed many tutorials on LearnVisualStudio.NET, and only one or two so far on TrainingSpot; however, I've been delighted by their simplicity and straightforward approach to proper usage of the particular tool or concept being discussed. They have not (so far in my experience) demonstrated ways of using the tools that become cumbersome, impractical, or error-prone. Each website has step-by-step videos that can be paused, replayed, and, most importantly, are done in real time. As the author is typing, the viewer gets to experience coding from a first-person perspective, including syntax errors, unexpected behaviors, IDE setup idiosyncrasies, everything. A subtle value I've gained from these videos is that a certain degree of confusion and introspection is normal when working with new tools and exploring new paths. They (as well as your own experience) are not to be feared, but enjoyed. I highly recommend them. Good work, guys!

    Read the article
