Search Results

Search found 3339 results on 134 pages for 'frequency analysis'.


  • SQL Intersection Conference, Las Vegas MGM Grand 10-13 November 2014

    - by Paul White
    I am very pleased to announce that I will be speaking at the SQL Intersection conference in Las Vegas again this year. This time around, I am giving a full-day workshop, "Mastering SQL Server Execution Plan Analysis" as well as a two-part session, "Parallel Query Execution" during the main conference. The workshop is a pre-conference event, held on Sunday 9 November (straight after this year's PASS Summit). Being on Sunday gives you the whole Monday off to recover and before the...(read more)

    Read the article

  • Claiming the provisioned storage

    - by gita
    I created an Ubuntu server VM with 64GB of provisioned storage. I remember that I specified 30GB to be used for the VM. When I do df -h, I get:

        Filesystem                     Size  Used Avail Use% Mounted on
        /dev/mapper/analysis--db-root   28G   25G  904M  97% /
        udev                           2.0G  4.0K  2.0G   1% /dev
        tmpfs                          793M  228K  793M   1% /run
        none                           5.0M     0  5.0M   0% /run/lock
        none                           2.0G     0  2.0G   0% /run/shm
        /dev/sda1                      228M   45M  171M  21% /boot

    The disk is almost full; how can I use the other 30GB from the provisioned storage?
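    Since the root filesystem here sits on an LVM logical volume (the /dev/mapper/analysis--db-root name suggests a volume group called analysis-db with a logical volume called root), one common answer is to grow the underlying physical volume and then extend the logical volume and filesystem into the free space. The sketch below is a hypothetical sequence, not from the original thread; the device names are assumptions based on the df output, and you would want a backup and a dry run before trying anything like it.

        # Hypothetical sketch: extend an LVM root volume into unused provisioned space.
        # Assumes the VG is "analysis-db", the LV is "root", and the extra space has
        # already been exposed to the guest (e.g. the virtual disk/partition was grown).
        import subprocess

        def run(cmd):
            """Run a command and fail loudly, so partial resizes are obvious."""
            print("+", " ".join(cmd))
            subprocess.run(cmd, check=True)

        # 1. Make LVM see the larger physical volume (device name is an assumption).
        run(["pvresize", "/dev/sda5"])

        # 2. Grow the logical volume into all remaining free space in the VG.
        run(["lvextend", "-l", "+100%FREE", "/dev/analysis-db/root"])

        # 3. Grow the ext4 filesystem to match the new LV size (online resize).
        run(["resize2fs", "/dev/analysis-db/root"])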

    Read the article

  • Closing the gap between strategy and execution with Oracle Business Intelligence 11g

    - by manan.goel(at)oracle.com
    Wikipedia defines strategy as a plan of action designed to achieve a particular goal. An example of this is General Electric's acquisitions and divestiture strategy (plan) designed to propel GE to the number 1 or 2 place (goal) in every business segment it operated in. Execution, on the other hand, can be defined as the actions taken to get things done. In GE's case, execution would be the steps followed for mergers/acquisitions or divestitures. The business press has written extensively about the importance of both strategy and execution in achieving desired business objectives. Perhaps the quote from Thomas Edison says it best: "vision without execution is hallucination". Conversely, it can be said that "execution without vision" may well be "wishful thinking". Research overwhelmingly points towards a wide gap between strategy and execution. According to a published study, 49% of surveyed executives perceive a gap between their organizations' ability to develop and communicate sound strategies and their ability to implement those strategies. Further, of these respondents, 64% don't have full confidence that their companies will be able to close the gap. Having established the severity and importance of the problem, let's talk about the reasons for the strategy-execution gap. The common reasons include:

        - Lack of clearly defined goals
        - Lack of consistent measures of success
        - Lack of ownership
        - Lack of alignment
        - Lack of communication
        - Lack of proper execution
        - Lack of monitoring

    There are multiple approaches to solving the problem, including organizational development practices, technology enablement, etc. In most cases a combination of approaches is required to achieve the desired result. For the purposes of this discussion, I'll focus on technology. Imagine an integrated, closed-loop technology platform that automates the entire management cycle: from defining strategy, to assigning ownership, to communicating goals, to achieving alignment, to collaborating, to taking action, to monitoring progress and making mid-course corrections. Besides, for best ROI and lowest TCO such a system should also have characteristics like:

        Complete
        - Full functionality
        - Rich end user access

        Open
        - Any data source
        - Any business application
        - Any technology stack

        Integrated
        - Common metadata
        - Common security
        - Common system management

    From a capabilities perspective, the system should provide the following:

        Define
        - Strategy
        - Objectives
        - Ownership
        - KPIs

        Communicate
        - Pervasive
        - Collaborative
        - Role based
        - Secure

        Execute
        - Integrated
        - Intuitive
        - Secure
        - Ubiquitous

        Monitor
        - Multiple styles and formats
        - Exception based
        - Push & pull

    Having talked about the business problem and outlined the blueprint for a technology solution, let's talk about how Oracle Business Intelligence 11g can help. Oracle Business Intelligence is a comprehensive business intelligence solution for reporting, ad hoc query and analysis, OLAP, dashboards and scorecards. Oracle's best-in-class BI platform is based on an architecturally integrated technology foundation that provides a unified end user experience and features a Common Enterprise Information Model, with common security, query request generation and optimization, and system management.
    The BI platform is:

        - Complete - meaning it delivers all modes and styles of BI, including reporting, ad hoc query and analysis, OLAP, dashboards and scorecards, with a rich end user experience that includes visualization, collaboration, alerts and notifications, search and mobile access.
        - Open - meaning the BI platform integrates with any data source, ETL tool, business application, application server, security infrastructure, or portal technology, as well as any ODBC-compliant third-party analytical tool. The suite accesses data from multiple heterogeneous sources, including popular relational and multidimensional data sources and major ERP and CRM applications from Oracle and SAP.
        - Integrated - meaning the BI platform is based on an architecturally integrated technology foundation built on an open, standards-based service-oriented architecture. The platform features a common enterprise information model, a common security model and a common configuration, deployment and systems management framework.

    To summarize, Oracle Business Intelligence is a comprehensive, integrated BI platform that lets you define strategy, identify objectives, assign ownership, define KPIs, collaborate, take action, monitor, report and make course corrections, all from a single interface and a single system. The platform's integrated metadata model and task-based design ensure that the entire workflow from defining strategy to execution to monitoring is completely integrated, delivering end-to-end visibility, transparency and agility. Click here to learn more about Oracle BI 11g.

    Read the article

  • Marketing texts for freelance programmers [closed]

    - by chiborg
    I'm a freelance developer and would like to set up a website that describes my services. When trying to come up with texts for the website I got a severe case of writer's block. I know that I'd like to describe what I do (websites, CMS, web-based applications), the different stages of projects (analysis, contract, prototype, testing, improvement, delivery, payment, etc.) and who the target audience is (owners of small to medium businesses). But I have this feeling that there are some rules/tips on how to write such texts and I don't know them - any pointers?

    Read the article

  • Gauging Maturity of your BPM Strategy - part 2 / 2

    - by Sanjeev Sharma
    In my earlier post I discussed the essence of maturity assessment and the business imperative for doing one in the context of BPM. In this post I will discuss Oracle's BPM maturity assessment methodology. Oracle's BPM Maturity model comprises the following components:

        - Maturity - represents the stages of evolution of your BPM capability, with 0 being the lowest level and 5 being the highest level
        - Domain - represents the multiple perspectives, both technical and business oriented, against which your BPM capability can be assessed
        - Adoption - represents the scale of the BPM rollout, starting at the project level and going up to the enterprise level

    Note: your BPM capability can be at different levels of maturity for the different domains. Oracle's BPM assessment methodology measures the maturity of your BPM capability at the individual domain level as well as at the aggregate level. The output of Oracle's BPM assessment benefits you in two ways:

        - Gap analysis, by comparing the "As-Is" BPM capability with the desired "To-Be" BPM capability along the various domains (see Figure 1)
        - Systematic adoption, by aligning the evolution of BPM capability with its rollout in multiple phases (see Figure 2)

    Read the article

  • How to fix “Unit Test Runner failed to load test assembly”

    - by ybbest
    I encountered this issue a couple of times during my recent project, and every time I forgot what actually caused it. Therefore, I decided to write a quick blog post to make sure I can identify the issue quickly. Problem: Run unit tests using a test runner and receive a "Unit Test Runner failed to load test assembly" exception. Analysis: Basically, I had changed some code and started the test runner to run the tests. The same dll had already been deployed to the GAC, so the test runner was actually trying to use the old version of the assembly and thus could not load it. Solution: Deploy the current version of the dll to the GAC and re-run your tests; it works like a charm.

    Read the article

  • Columnstore Case Study #2: Columnstore faster than SSAS Cube at DevCon Security

    - by aspiringgeek
    Preamble: This is the second in a series of posts documenting big wins encountered using columnstore indexes in SQL Server 2012 & 2014. Many of these can be found in my big deck along with details such as internals, best practices, caveats, etc. The purpose of sharing the case studies in this context is to provide an easy-to-consume quick-reference alternative. See also Columnstore Case Study #1: MSIT SONAR Aggregations.

    Why Columnstore? As stated previously, if we're looking for a subset of columns from one or a few rows, given the right indexes, SQL Server can do a superlative job of providing an answer. If we're asking a question which by design needs to hit lots of rows - DW, reporting, aggregations, grouping, scans, etc. - SQL Server has never had a good mechanism, until columnstore. Columnstore indexes were introduced in SQL Server 2012. However, they're still largely unknown. Some adoption blockers existed; yet columnstore was nonetheless a game changer for many apps. In SQL Server 2014, potential blockers have been largely removed & they're going to profoundly change the way we interact with our data. The purpose of this series is to share the performance benefits of columnstore & to document that columnstore is a compelling reason to upgrade to SQL Server 2014.

    The Customer: DevCon Security provides home & business security services & has been in business for 135 years. I met DevCon personnel while speaking to the Utah County SQL User Group on 20 February 2012. (Thanks to TJ Belt (b|@tjaybelt) & Ben Miller (b|@DBADuck) for the invitation, which serendipitously coincided with the height of ski season.)

    The App: DevCon Security Reporting: Optimized & Ad Hoc Queries. DevCon users interrogate a SQL Server 2012 Analysis Services cube via SSRS. In addition, the SQL Server 2012 relational back end is the target of ad hoc queries; this DW back end is refreshed nightly during a brief maintenance window via conventional table partition switching.

    SSRS, SSAS, & MDX: Conventional relational structures were unable to provide adequate performance for user interaction with the SSRS reports. An SSAS solution was implemented, requiring personnel to ramp up technically, including learning enough MDX to satisfy requirements.

    Ad Hoc Queries: Even though the fact table is relatively small - only 22 million rows & 33GB - the table was a typical DW table in terms of its width: 137 columns, any of which could be the target of ad hoc interrogation. As is common in DW reporting scenarios such as this, it is often nearly impossible to optimize for such queries using conventional indexing. DevCon DBAs & developers attended PASS 2012 & were introduced to the marvels of columnstore in a session presented by Klaus Aschenbrenner (b|@Aschenbrenner).

    The Details: Classic vs. columnstore before-&-after metrics are impressive.

        Scenario        | Conventional Structures           | Columnstore   | Improvement
        SSRS via SSAS   | 10 - 12 seconds                   | 1 second      | >10x
        Ad Hoc          | 5 - 7 minutes (300 - 420 seconds) | 1 - 2 seconds | >100x

    Here are two charts characterizing this data graphically. The first is a linear representation of Report Duration (in seconds) for Conventional Structures vs. Columnstore Indexes. As is so often the case when we chart such significant deltas, the linear scale doesn't expose some of the dramatically improved values corresponding to the columnstore metrics. Just to make it fair, here's the same data represented logarithmically; yet even here the values corresponding to 1 - 2 seconds aren't visible.
    The Wins. Performance: Even prior to the columnstore implementation, at 10 - 12 seconds, canned report performance against the SSAS cube was tolerable. Yet the 1 second performance afterward is clearly better. As significant as that is, imagine the user experience re: ad hoc interrogation. The difference between several minutes and one or two seconds is a game changer, literally changing the way users interact with their data - no mental context switching, no wondering when the results will appear, no preoccupation with the spinning mind-numbing hurry-up-&-wait indicators. As we've commonly found elsewhere, columnstore indexes here provided performance improvements of one, two, or more orders of magnitude. Simplified Infrastructure: Because in this case a nonclustered columnstore index on a conventional DW table was faster than an Analysis Services cube, the entire SSAS infrastructure was rendered superfluous & was retired. PASS Rocks: Once again, the value of attending PASS is proven out. The trip to Charlotte combined with eager & enquiring minds led directly to this success story. Find out more about the next PASS Summit here, hosted this year in Seattle on November 4 - 7, 2014. DevCon BI Team Lead Nathan Allan provided this unsolicited feedback: "What we found was pretty awesome. It has been a game changer for us in terms of the flexibility we can offer people that would like to get to the data in different ways."

    Summary: For DW, reports, & other BI workloads, columnstore often provides significant performance enhancements relative to conventional indexing. I have documented here, in the second of a series of reports on columnstore implementations, results from DevCon Security: a live customer production app for which performance increased by factors of 10x to 100x for all report queries, and for which ad hoc query times dropped from 5 - 7 minutes to 1 - 2 seconds. As a result of columnstore performance, the customer retired their SSAS infrastructure. I invite you to consider leveraging columnstore in your own environment. Let me know if you have any questions.

    Read the article

  • TFS vs. Star Team comparison

    - by ryanabr
    I have a sales call today in which the person I am talking to is interested in what TFS would give them over Star Team. The first thing I believe I can say is that TFS is cheaper! Especially if you are doing Microsoft development already and your team members have MSDN subscriptions, as the CALs for TFS are covered by the MSDN subscription. The other thing that I noticed about Star Team was all of the references to 'readiness' and 'integration'. While that is great, it means that other tools will be needed to provide the features that are already bundled with TFS, like SharePoint integration, as well as Analysis Services and Reporting Services to provide visibility on the web with reports on project health and team velocity. Below is a quick table that I was able to throw together to answer some high-level questions:

        Feature                  | TFS                                        | Star Team
        Work Items               | X                                          | X
        Work Item custom Queries | X                                          | X
        Customizable Work Items  | X                                          |
        Web Portal View          | X                                          | X
        Reporting                | X                                          | Integration
        Version Control          | X                                          | X
        Build Management         | X                                          | Integration
        Integrated Test Suite    | X                                          | Integration
        Cost                     | Free for first 5 / MSDN Sub covers others  | $7,500 / seat

    Read the article

  • Did Blowing Into Nintendo Cartridges Really Help?

    - by Jason Fitzpatrick
    Anyone old enough to remember playing cartridge-based games like those that came with the Nintendo Entertainment System or its successors certainly remembers how blowing across the cartridge opening always seemed to help a stubborn game load - but did blowing on them really help? Mental Floss shares the results of their fact finding mission, a mission that included researching the connection mechanism in the NES, talking to Frank Viturello (who conducted an informal study on the effects of moisture on cartridge connectors), and otherwise delving into the history of the phenomenon. The most interesting part of the analysis, by far, is their explanation of how blowing on the cartridge didn’t do anything but the ritual of removing the cartridge to blow on it did. Hit up the link below for the full story. Did Blowing into Nintendo Cartridges Really Help? [Mental Floss]

    Read the article

  • Analysing Group & Individual Member Performance -RUP

    - by user23871
    I am writing a report which requires an analysis of the performance of each individual team member. This is for a software development project developed using the Unified Process (UP). I was just wondering if there are any existing group & individual appraisal metrics in use, so I don't have to reinvent the wheel... EDIT This is by no means correct, but something like: Individual Contribution (IC) = time spent (individual) / time spent (total); Performance = ? (should combine individual contribution (IC) with something else to gain a measure of overall performance). Maybe I am talking complete hash, and I know it's generally really difficult to analyse performance with numbers, but are there any mathematicians out there who can lend a hand, or who know a somewhat more accurate method of analysing performance than arbitrary marking (e.g. 8 out of 10)?
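    As a rough illustration of the ratio the question sketches (not an established RUP metric, and the hours below are made-up example data), the naive "individual contribution" figure could be computed like this:

        # Hypothetical sketch of the question's "individual contribution" ratio.
        # Hours per team member are placeholder values, not from the post.
        hours = {"alice": 120.0, "bob": 80.0, "carol": 40.0}

        total = sum(hours.values())

        for member, spent in hours.items():
            ic = spent / total  # IC = time spent (individual) / time spent (total)
            print(f"{member}: IC = {ic:.2f}")

        # Note: time spent alone says nothing about output or quality, which is why
        # the post asks what IC should be combined with to measure "performance".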

    Read the article

  • CircuitLab Offers Easy Circuit Building on the Web and iPad

    - by Jason Fitzpatrick
    If you like to sketch out your circuit designs rapidly, cleanly, and on the web or your iPad, CircuitLab makes it dead simple. The free tool includes an easy drag-and-drop interface, circuit analysis, easy printing, and more. Watch the video above to see the creators of CircuitLab whipping up a simple circuit to showcase the app, then hit up the link below to try it out. CircuitLab [via Hacked Gadgets]

    Read the article

  • Sample size and statistical significance in Google Analytics

    - by colmcq
    I have been asked to compile a report on dropout rates during checkout for a global webstore. I used one month of data as my sample because: Google Analytics slows to a crawl over larger sample sizes and makes much of the analysis agonisingly slow; and I believe one month to be statistically significant and a representative sample. My client has asked me why I didn't use yearly figures and wants proof that one month of data is 'statistically significant'. Am I right in thinking that I need to compare the standard deviation of my monthly sample to the yearly sample and ensure that the deviation is under a certain percentage? Question: how do I prove one month of Google Analytics data is representative of one year's worth of data? Stats: 90k unique views/month, ~1.1m per year.
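    One way to frame the client's question (a sketch, not a full statistical treatment - only the ~90k monthly uniques figure comes from the post, the dropout counts below are placeholders) is to compute the standard error and confidence interval of the monthly dropout rate and compare the monthly estimate against the yearly one:

        # Hypothetical sketch: is one month of checkout-dropout data a precise enough
        # estimate? Treat dropout as a proportion and look at its confidence interval.
        import math

        def proportion_ci(successes, n, z=1.96):
            """95% normal-approximation confidence interval for a proportion."""
            p = successes / n
            se = math.sqrt(p * (1 - p) / n)
            return p, p - z * se, p + z * se

        # ~90k unique views/month is from the post; the dropout counts are made up.
        month_p, month_lo, month_hi = proportion_ci(successes=27_000, n=90_000)
        year_p, year_lo, year_hi = proportion_ci(successes=330_000, n=1_100_000)

        print(f"monthly dropout rate: {month_p:.3f} (95% CI {month_lo:.3f} - {month_hi:.3f})")
        print(f"yearly dropout rate:  {year_p:.3f} (95% CI {year_lo:.3f} - {year_hi:.3f})")

        # If the yearly rate falls inside the monthly interval (and the month isn't a
        # seasonal outlier), the month is a reasonable stand-in for the year.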

    Read the article

  • How to Banish Duplicate Photos with VisiPic

    - by Jason Fitzpatrick
    You meant well, you intended to be a good file custodian, but somewhere along the way things got out of hand and you’ve got duplicate photos galore. Don’t be afraid of deleting them and losing important photos; read on as we show you how to clean up safely. Deleting duplicate files, especially important ones like personal photos, makes a lot of people quite anxious (and rightfully so). Nobody wants to be the one to realize that they deleted all the photos of their child’s first birthday party during a hard drive purge gone wrong. In this tutorial we’re going to show you how to go beyond the limited reach of tools which simply compare file names and file sizes. Instead we’ll be using a program that combines that kind of comparison with actual image analysis to help you weed out not just perfect 1:1 file duplicates but also those piles of resized-for-email images, cropped images, and other modified images that might be cluttering up your hard drive.
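    VisiPic itself is a Windows GUI tool, but the idea it implements - comparing images by visual similarity rather than by file name or size - is easy to sketch. The snippet below is an illustrative stand-in (not VisiPic's algorithm) and assumes the third-party Pillow and imagehash packages are installed:

        # Hypothetical sketch: group visually similar photos with a perceptual hash,
        # so resized or lightly edited copies land in the same bucket.
        # Requires: pip install pillow imagehash
        from pathlib import Path
        from collections import defaultdict

        from PIL import Image
        import imagehash

        def find_similar(folder, max_distance=5):
            """Return groups of images whose perceptual hashes are close together."""
            hashes = {}
            for path in Path(folder).rglob("*.jpg"):
                hashes[path] = imagehash.phash(Image.open(path))

            groups = defaultdict(list)
            paths = list(hashes)
            for i, a in enumerate(paths):
                for b in paths[i + 1:]:
                    # A small Hamming distance between hashes means visually similar images.
                    if hashes[a] - hashes[b] <= max_distance:
                        groups[a].append(b)
            return {k: v for k, v in groups.items() if v}

        if __name__ == "__main__":
            for original, lookalikes in find_similar("photos").items():
                print(original, "looks like", [str(p) for p in lookalikes])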

    Read the article

  • Good library for search text tokenization

    - by Chris Dutrow
    Looking to tokenize some text in the same or a similar way to how a search engine would do it. The reason we are doing this is so that we can run some statistical analysis on the tokens. The language we are using is Python, so we would prefer a library in that language, but could probably set something up to use another language if necessary. Example:

        Original: We have some great burritos!
        More simplified (remove plurals and punctuation): We have some great burrito
        Even more simplified (remove superfluous words): great burrito
        Best (recognize positive and negative meaning): burrito -positive-
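    A common starting point for this kind of normalization in Python is NLTK, which covers the first three steps (tokenization, stopword removal, and stemming to collapse plurals); the sentiment tag in the last step needs something extra, such as NLTK's VADER analyzer. A minimal sketch, assuming nltk is installed and its 'punkt', 'stopwords', and 'vader_lexicon' data have been downloaded:

        # Hypothetical sketch: search-engine-style token normalization with NLTK.
        # Requires: pip install nltk, plus nltk.download("punkt"), ("stopwords"),
        # and ("vader_lexicon") for the data files used below.
        import string

        from nltk.tokenize import word_tokenize
        from nltk.corpus import stopwords
        from nltk.stem import PorterStemmer
        from nltk.sentiment import SentimentIntensityAnalyzer

        STOPWORDS = set(stopwords.words("english"))
        stemmer = PorterStemmer()

        def normalize(text):
            """Lowercase, drop punctuation and stopwords, stem what remains."""
            tokens = word_tokenize(text.lower())
            tokens = [t for t in tokens if t not in string.punctuation]
            tokens = [t for t in tokens if t not in STOPWORDS]
            return [stemmer.stem(t) for t in tokens]

        text = "We have some great burritos!"
        tokens = normalize(text)            # e.g. ['great', 'burrito']
        polarity = SentimentIntensityAnalyzer().polarity_scores(text)["compound"]
        print(tokens, "positive" if polarity > 0 else "negative")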

    Read the article

  • Google I/O 2012 - Big Data: Turning Your Data Problem Into a Competitive Advantage

    Speakers: Ju-kay Kwek, Navneet Joneja. Can businesses get practical value from web-scale data without building proprietary web-scale infrastructure? This session will explore how new Google data services can be used to solve key data storage, transformation and analysis challenges. We will look at concrete case studies demonstrating how real life businesses have successfully used these solutions to turn data into a competitive business asset. For all I/O 2012 sessions, go to developers.google.com. From: GoogleDevelopers (52:39).

    Read the article

  • How can I sell a legacy program rewrite to the business?

    - by Wil
    We have a legacy classic ASP application that's been around since 2001. It badly needs to be rewritten, but it's working fine from an end user perspective. The reason I feel a rewrite is necessary is that when we need to update it (which is admittedly not that often), it takes forever to go through all the spaghetti code and fix problems. Adding new features is also a pain, since it was architected and coded badly. I've run a cost analysis on maintenance for them, but they are willing to spend more on the small maintenance jobs than on a rewrite. Any suggestions on convincing them otherwise?

    Read the article

  • Real Time BI in the Real World

    - by tobin.gilman(at)oracle.com
    One of my favorite BI offerings from Oracle is a solution called Oracle Real Time Decisions. Whenever I mention this product in customer meetings, eyes light up. There are some fascinating examples of customers using it to up-sell, cross-sell, increase customer retention, and reduce risk in real time, with off-the-charts return on investment. I plan to share some of those stories in a future blog. In this post, however, I want to share some far more common real time analytics use case scenarios that are being addressed with widely deployed Oracle BI and data integration technologies. Not all real time BI applications require continuous learning, predictive modeling, and data mining. Many simply require the ability to integrate, aggregate, and access information that is current (typically within a few minutes or a few seconds). The use cases are infinite. A few I've seen:

        - Purchasing agents need to match demand against available inventory
        - Manufacturing planners need to monitor current parts and material against scheduled build plans
        - Airline agents need to match ticket demand against flight schedules
        - Human resources managers need to track the status of global hiring requisitions against current headcount authorizations

    ...you get the idea. One way of doing this is to run reports or federated queries directly against transactional systems. That approach can be viable if you only need to access simple data sets on rare occasions. High volume and complex queries can quickly bog down the performance of mission critical transactional systems. There is an architecturally simple way of solving the problem, and it's being applied by real companies around the world to solve real needs in real time. Cbeyond, an Atlanta, GA based provider of voice, data and mobile business applications, delivers real time information to its call center agents as they are interacting with their customers. The data they need resides in production CRM and other transactional systems, but instead of reporting directly off those systems, data is first moved to an operational data store (ODS). Rather than running data intensive, time consuming, and performance degrading batch ETL routines to populate the ODS, Cbeyond uses Oracle Golden Gate software to incrementally capture and move only the changed records from the log files of the transactional systems every few minutes.
    There is no impact on transactional system performance, and the information needed by call center representatives is up to date. Oracle Business Intelligence software presents the information to service reps in a rich, visual, and highly interactive format. Avea is similar to Cbeyond. They are a telecommunications company that integrates billing and customer information in an ODS that is accessed by their call center agents in real time using Oracle Golden Gate and Oracle Business Intelligence. They've taken it a step further by using the ODS to feed a data warehouse. The operational data store provides the current information needed by call center agents during "in flight" customer interactions. The data warehouse is used for more sophisticated analysis of historical data. For maximum performance, both the ODS and the data warehouse run on the Oracle Exadata Database Machine. These are practical illustrations of companies addressing real time reporting and analysis needs using established business intelligence/data warehousing methodologies and tools common to many IT departments. If real time BI could benefit your organization, you may already be closer than you thought to having the pieces in place to solve the problem. Give us a shout if you are interested in learning more or if you have an interesting use of, or approach to, real-time BI.

    Read the article

  • UppercuT v1.0 and 1.1 – Linux (Mono), Multi-targeting, SemVer, Nitriq and Obfuscation, oh my!

    - by Robz / Fervent Coder
    Recently UppercuT (UC) quietly released version 1 (in August). I’m pretty happy with where we are, although I think it’s a few months later than I originally planned. I’m glad I held it back; it gave me some more time to think about some things a little more, and also the opportunity to receive a patch for running builds with UC on Linux. We also released v1.1 very recently (December).

    UppercuT v1

    Builds On Linux: Perhaps the most significant change to UC going to v1 is that it now supports builds on Linux using Mono! This is thanks mostly to Svein Ackenhausen for the patches and for working with me on getting it all working while not breaking the Windows builds! This means you can use Mono on Windows or Linux. Notice the shell files to execute with Linux that come as part of UC now.

    Multi-Targeting: Perhaps one of the hardest things to do that requires an automated build is multi-targeting. At v1 this is early, and possibly prone to some issues, but available. We believe in making everything stupid simple, so it’s as simple as adding a comma to the microsoft.framework property, i.e. "net-3.5, net-4.0", to suddenly produce both framework builds. When you build, that is what you get (if you meet each framework’s requirements). At this time you have to let UC override the build location (as it does by default) or this will not work.

    Semantic Versioning: By now many of you have been using UppercuT for a while and have watched how we have done versioning. Many of you who use git already know we put the revision hash in the informational/product version as the last octet. At v1, UppercuT has adopted the semantic versioning scheme. What does that mean? This is a short read, but a good one: http://SemVer.org. SemVer (Semantic Versioning) is really using versioning for what it was meant for. You have three octets, Major.Minor.Patch, as in 1.1.0. UC will use three different versioning concepts: one for the assembly version, one for the file version, and one for the product version.

        - All versions - The first three octets of the version are owned by SemVer: Major.Minor.Patch, i.e. 1.1.0
        - Assembly Version - The assembly version follows SemVer most closely; the last digit is always 0: Major.Minor.Patch.0, i.e. 1.1.0.0
        - File Version - The file version occupies the build number as the last digit: Major.Minor.Patch.Build, i.e. 1.1.0.2650
        - Product/Informational Version - The last octet of your product/informational version is the source control revision/hash: Major.Minor.Patch.RevisionOrHash, i.e. (TFS/SVN) 1.1.0.235 or (Git/HG) 1.1.0.a45ace4346adef0

    SemVer is not on by default; the passive versioning scheme is still in effect. Notice that version.use_semanticversioning has been added to the UppercuT.config file (and version.patch in support of the third octet).

    Gems Support: Gems support was added at v1. This will probably be deprecated at some point once there is an announced sunset for Nu v1. Application gems may keep it around, since there is no alternative for that yet (CoApp would be a possible replacement).

    Nitriq Support: Nitriq is a code analysis tool like NDepend. It’s built by Mr. Jon von Gillern. It uses the LINQ query language, so you can use a familiar idiom when analyzing your code base. It’s a pretty awesome tool that has a free version for those looking to do code analysis! To use Nitriq with UC, you are going to need the console edition. To take advantage of Nitriq, you just need to update the location of Nitriq in the config. Then add the Nitriq project files at the root of your source.
    Please refer to the Nitriq documentation on how these are created.

    UppercuT v1.1

    Obfuscation: One thing I started looking into was an easy way to obfuscate my code. I came across EazFuscator, which is both free and awesome. Plus the GUI for it is super simple to use. How do you make obfuscation even easier? Make it a convention and a configurable property in the UC config file! And the code gets obfuscated!

    Closing: Definitely get out and look at the new release. It contains lots of chocolaty (sp?) goodness. And remember, the upgrade path is almost as simple as drag and drop!
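    To make the versioning scheme described above concrete, here is a small illustrative sketch (not UppercuT's implementation; the inputs are placeholder values echoing the post's examples) showing how the three version strings relate to each other:

        # Hypothetical sketch of the SemVer-based scheme described above:
        # assembly, file, and product versions all share Major.Minor.Patch.
        def build_versions(major, minor, patch, build_number, revision_or_hash):
            base = f"{major}.{minor}.{patch}"
            return {
                "assembly": f"{base}.0",                  # last digit always 0
                "file": f"{base}.{build_number}",         # build number as last octet
                "product": f"{base}.{revision_or_hash}",  # revision/hash as last octet
            }

        print(build_versions(1, 1, 0, 2650, "a45ace4346adef0"))
        # {'assembly': '1.1.0.0', 'file': '1.1.0.2650', 'product': '1.1.0.a45ace4346adef0'}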

    Read the article

  • What's the thing that reports bugs in PHP?

    - by Max Hazard
    Currently I am learning PHP. PHP is understood by the browser itself, right from the PHP SDK, right? The SDK includes libraries, right? So the browser is like an interpreter of PHP code. I want to know: whenever I type wrong PHP syntax, what is the thing that reports the error to me? Obviously the browser is reporting the error, but what part of it? I mean, I don't get it. Like when writing a compiler we do lexical analysis and make the compiler report any bug in the source code. I assume here the browser is analogous to a compiler. I don't know exactly, but a compiler contains bug reporting functions or methods, which is the debugger. The debugger is the part of the compiler that reports bugs. Does the browser contain such debuggers? Can there be any browser which doesn't understand PHP?

    Read the article

  • The Enterprise Architect (EA) diary - day 22 (from business processes to implemented applications)

    - by nattYGUR
    After spending time keeping our repository up to date (adding the new ETRM application and related data flows, as well as changing databases to DB clusters), collecting more data for the root cause analysis, and writing a proposal for creating a new software infrastructure team (one that will help us clean the table of a pile of problems that just keeps growing due to BAU control over IT dev team resources), I spent time adapting our EA tool to support a diagram flow from high level business processes to the implementation of new applications that will better support the business process. http://www.theeagroup.net/ea/Default.aspx?tabid=1&newsType=ArticleView&articleId=195

    Read the article

  • How an LED-lit LCD Monitor Works [Video]

    - by Jason Fitzpatrick
    There’s a good chance you’re staring at one right now, the common LCD monitor. How exactly does it work? Find out by watching this informative video. Bill Hammack, the engineer behind the Engineer Guy video series, takes apart an LCD monitor and gives a detailed analysis of what’s going on inside as he rebuilds it - including how the pixels function, what the screen is constructed of, and how the light is diffused. LCD Monitor Teardown [YouTube via Hack A Day]

    Read the article

  • Alkan Improves Aeronautical-Equipment Product Collaboration, Design Processes, and Government Compliance

    - by Gerald Fauteux
    Alkan S.A., a leading aeronautical equipment manufacturer in France specializing in carriage-release and ejection systems for various types of military aircraft, utilizes Oracle’s AutoVue Electro-Mechanical Professional for Agile as part of its Agile Product Lifecycle Management solution. AutoVue Electro-Mechanical Professional for Agile enables multiformat 3-D viewing of engineering designs, leading to deeper analysis of component and product functionality, and allows all teams to easily participate in and contribute to product data early in the development cycle. Alkan S.A.’s equipment is used in more than 65 countries and is certified for more than 60 types of aircraft worldwide. Click here to read the complete story. French version.

    Read the article

  • How come verification does not include actual testing?

    - by user970696
    Having read a lot about this topic, I still did not get it. Verification should prove that you are building the product right, while validation should prove that you are building the right product. But only static techniques are mentioned as verification methods (code reviews, requirements checks...). How can you say something is implemented correctly if you do not test it? It is said that verification checks, e.g., the code for its correctness. Verification - ensure that the product meets specified requirements. Again, if a function is specified to work in some way, only by testing can I say that it does. Could anyone explain this to me please? EDIT: As Wiki says: Verification: preparing the test cases (based on the analysis of the requirements). Validation: running the test cases.

    Read the article

  • Formalizing programmers' errors

    - by Maksee
    Every one of us makes errors that lead to bugs. At one point I wanted to start logging my errors for future analysis, probably recording the project title, the approximate time spent and, most importantly, the type of error. For example, when I copy-pasted a fragment about 'x' and replaced every occurrence of 'x' with 'y' but forgot to replace a tiny piece, that goes under 'copy-paste error'. The usefulness of this approach depends on whether I can formalize my errors at all, and probably on minimizing the number of types to choose from. Otherwise I would start postponing, ignoring and so on, making the system useless. Is there existing research in this area, perhaps a known minimal set of error types? Maybe some of you have already tried to implement something like this and succeeded/failed?
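    For what it's worth, the logging side of this is small enough to keep friction low; a minimal sketch (the error categories below are illustrative guesses, not an established taxonomy) might look like:

        # Hypothetical sketch of a personal error log: one record per mistake, with a
        # small fixed set of categories so logging stays quick enough to actually do.
        from collections import Counter
        from dataclasses import dataclass
        from datetime import date
        from enum import Enum

        class ErrorType(Enum):  # illustrative categories, not a standard taxonomy
            COPY_PASTE = "copy-paste"
            OFF_BY_ONE = "off-by-one"
            WRONG_ASSUMPTION = "wrong assumption about API/data"
            MISSED_EDGE_CASE = "missed edge case"
            TYPO = "typo"

        @dataclass
        class ErrorRecord:
            project: str
            day: date
            minutes_spent: int
            kind: ErrorType
            note: str = ""

        log = [
            ErrorRecord("billing-service", date(2012, 5, 3), 45, ErrorType.COPY_PASTE,
                        "renamed x->y everywhere except one branch"),
        ]

        # Later analysis: which category eats the most time?
        totals = Counter()
        for rec in log:
            totals[rec.kind] += rec.minutes_spent
        print(totals.most_common())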

    Read the article

  • AJI Software is now a Microsoft Gold Application Lifecycle Management (ALM) Partner

    - by Jeff Julian
    Our team at AJI Software has been hard at work over the past year on certifications and projects that have allowed us to reach Gold Partner status in the Microsoft Partner Program. We have focused on providing services that assist not only in custom software development, but also in process analysis and mentoring. I definitely want to thank each one of our team members for all their work. We are currently the only Microsoft Gold ALM Partner within a 500 mile radius of Kansas City. If you or your team is in need of assistance with Team Foundation Server, Agile processes, Scrum mentoring, or just a process/team assessment, please feel free to give us a call. We also have practices focused on SharePoint, mobile development (iOS, Android, Windows Mobile), and custom software development with .NET. Technorati Tags: Gold Partner, ALM, Scrum, TFS, AJI Software

    Read the article
