
  • Load and Web Performance Testing using Visual Studio Ultimate 2010-Part 2

    - by Tarun Arora
    Welcome back. In part 1 of Load and Web Performance Testing using Visual Studio 2010 I talked about why performance testing the application is important, the test tools available in Visual Studio Ultimate 2010 and various test rig topologies. In this blog post I’ll get into the details of web performance & load tests, as well as why it’s important to follow a goal based pattern while performance testing your application.

    Tools => Options => Test Tools
    Have you visited the treasures of the Visual Studio menu bar Tools => Options => Test Tools lately? The options to enable/disable prompts on creating, editing, deleting or running manual/automated tests can be controlled from here. The default test project language and the default test types created on new test project creation can be selected/unselected from here. Ever wondered how you can change the default limit of 25 test results? This, again, can be changed from here. If you record a lot of web tests and wish for the web test recorder to start with “that” URL populated, well, this again can be specified from here. If you haven’t so far, I would urge you to spend 2 minutes in the test tools options.

    Test Menu => Ready Steady Test Action!
    The test tools are under the Test menu in Visual Studio. Apart from being able to create a new Test and Test List you can also load an existing vsmdi file. You can also manage your test controllers from here. A solution can have one or more test settings files, but there can only be one active test settings file at any time; again, this selection can be done from here. You can open the various test windows from under the Windows option from the Test menu. If you open the Test View window you will see that you have the option to group the tests by work items, project, test type, etc. You can set these properties by right clicking a test in the test list and choosing Properties from the context menu.

    So, what is a vsmdi file? vsmdi stands for Visual Studio Test Metadata File. Placed under the Solution Items, this file keeps track of the list of unit tests in your solution. If you open the vsmdi file as an xml file you will see a series of Test Links nested within the Test List tags along with the Run Configuration tag. When you run tests in Visual Studio, the IDE looks at the vsmdi file to see what tests need to be run. You also have the option of using the vsmdi file in your team builds to specify which tests need to run as part of the build. Refer here for a walkthrough from a fellow blogger on how to use the vsmdi file in team builds.

    Web Performance Test – The Truth!
    In Visual Studio 2010 “Web Tests” have been renamed to “Web Performance Tests”. Apart from renaming this test type there have been several improvements to it in Visual Studio 2010. I am very active on the MSDN Visual Studio And Load Testing forum and a frequent question from many users is “Do Web Tests support pages that run JavaScript?” I will start with a little bit of background before answering this question. Web Performance Tests operate at the HTTP layer, but why? To enable you to generate high loads with a relatively low amount of hardware, web performance tests are driven at the protocol layer rather than instantiating a browser. The most common source of confusion is that users do not realize Web Performance Tests work at the HTTP layer. The tool adds to that misconception.
    After all, you record in IE, and when running a web test you can select which browser to use, and then the result viewer shows the results in a browser window. So that means the tests run through the browser, right? NO! The web test engine works at the HTTP layer and does not instantiate a browser. What does that mean? In the diagram below, you can see there are no browsers running when the engine is sending and receiving requests.

    Does that mean I can’t test pages that use JavaScript? The best example of JavaScript generating HTTP traffic is AJAX calls. The most common examples of browser plugins are Silverlight or Flash. The web test recorder will record HTTP traffic from AJAX calls and from most (but not all) browser plugins. This means you will still be able to web performance test pages that use JavaScript or a plugin and play back the results, but the playback engine will not show the JavaScript or plugin results in the ‘browser control’. If you want to test the page behaviour as a result of the JavaScript or plugin, consider using Coded UI Tests. This page looks like it failed, when in fact it succeeded! Looking closely at the response, and subsequent requests, it is clear the operation succeeded. As stated above, the reason the browser control is presenting this message is because JavaScript has been disabled in this control.

    So, to reiterate, the web performance test recorder:
    - Sends and receives data at the HTTP layer.
    - Does NOT run a browser.
    - Does NOT run JavaScript.
    - Does NOT host ActiveX controls or plugins.
    There is a great series of blog posts from Ed Glas; I would highly recommend his blog to anyone performing load/performance testing through Visual Studio.

    Demo – Web Performance Test
    [Demo] - Visual Studio Ultimate 2010: Test Settings and Configuration
    [Demo] - Visual Studio Ultimate 2010: Web Performance Test
    In this short video I try and answer the following questions: Why is performance testing important? How does Visual Studio help you performance test your applications? How do I record a web performance test? How do I make a web performance test data driven, transaction driven, loop driven, convert it to code, add validations? Best practices for recording web performance tests.

    I have a web performance test, what next?
    Creating the web performance test was the first step towards load testing your application. Now that we have the base test we can test the page behaviour when N users access the page. Have you ever had the head of business call you and mention that the marketing team has done a fantastic job and is expecting increased traffic on the web site - can the website survive the weekend with that additional load? This is the perfect opportunity to capacity test your application to see how your website holds up under various levels of load; you can work the results backwards to see how much hardware you may need to scale up your application to survive the weekend. Apart from that, it is always a good idea to have some benchmarks around how the application performs under light load for a short duration and under heavy load for a long duration, and to soak test the application - run a constant load for a week or two to record the effects of constant load over really long durations. This is a great way of identifying how your application handles the default IIS application pool recycle, which by default is configured to happen once every 29 hours. These benchmarks will act as the perfect yardstick to measure performance gains when you start making improvements.
    BUT there are some best practices! => Goal Based Load Testing Approach
    Since the subject is vast and there are a lot of things to measure and analyse, it is very easy to get distracted from the real goal! You can optimize your application once you know where the pain points are. There is no point performing a load test of 5000 users if your intranet application will only have 100 simultaneous users; it is important to keep focused on the real goals of the project. So the idea is to have a user story around your load testing scenarios and to test realistically. It is therefore recommended that you follow the outline below. It is an iterative process: refine your objectives, identify the key scenarios, determine the expected workload and the key metrics you want to report, record the web performance tests, simulate load and analyse the results.

    Is your application already deployed in production? This is great! You can analyse the IIS logs to understand the user behaviour... But what are IIS logs? The IIS logs allow you to record events for each application and web site on the web server. You can create separate logs for each of your applications and web sites. Logging information in IIS goes beyond the scope of the event logging or performance monitoring features provided by Windows. The IIS logs can include information such as who has visited your site, what the visitor viewed, and when the information was last viewed. You can use the IIS logs to identify any attempts to gain unauthorized access to your web server.

    How to configure IIS logs? For those ninjas who already have IIS logs configured (by the way, it’s on by default) and need a way to analyse them, there is the Windows IIS utility Log Parser. Log Parser is a very powerful tool that provides a generic SQL-like language on top of many types of data like IIS logs, Event Viewer entries, XML files, CSV files, the file system and others; it allows you to export the result of the queries to many output formats such as CSV, XML, SQL Server, charts and others; and it works well with IIS 5, 6, 7 and 7.5. Frequently used Log Parser queries.

    Demo – Load Test
    [Demo] - Visual Studio Ultimate 2010: Load Testing
    In this short video I try and answer the following questions:
    - Types of performance testing?
    - Perform goal driven load testing, analyse the test run result and generate a report?

    Recap
    A quick recap of what we have covered so far. Thank you for taking the time out to read this blog post. In part III of this blog series I’ll be getting into the details of Test Result Analysis, Test Result Drill through, Test Report Generation, Test Run Comparison, and the ASP.NET Profiler. If you enjoyed the post, remember to subscribe to http://feeds.feedburner.com/TarunArora. Questions/feedback/suggestions, etc. - please leave a comment. See you in Part III.
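    As a quick, hedged illustration of the kind of "frequently used Log Parser query" mentioned above (the log path is a placeholder and the column names assume standard IIS W3C logs), here is a minimal PowerShell sketch, using the Log Parser COM objects, that pulls the ten most requested URLs from an IIS log - a typical first step when working out a realistic workload model:

        $logQuery = New-Object -ComObject MSUtil.LogQuery
        $inputFormat = New-Object -ComObject MSUtil.LogQuery.IISW3CInputFormat

        # Top 10 most requested URLs - adjust the path to your own IIS log folder
        $query = "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits " +
                 "FROM C:\inetpub\logs\LogFiles\W3SVC1\*.log " +
                 "GROUP BY cs-uri-stem ORDER BY Hits DESC"

        $recordSet = $logQuery.Execute($query, $inputFormat)
        for (; !$recordSet.atEnd(); $recordSet.moveNext()) {
            $record = $recordSet.getRecord()
            Write-Host ($record.GetValue(0) + " : " + $record.GetValue(1))
        }
        $recordSet.Close()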


  • Win a Free License for Windows 7 Ultimate or Silverlight Spy at Our West Palm Beach .Net User Group

    - by Sam Abraham
    Shervin Shakibi, Microsoft Regional Director, ASP.Net MVP and Microsoft Certified Trainer, will be our speaker at our West Palm Beach .Net User Group May meeting. Shervin founded the FlaDotNet Users Group Network, to which our West Palm Beach .Net User Group belongs. Shervin will be talking to us about the new features of Silverlight 4.0. I am personally looking forward to attending this event as I have always found Shervin's talks fun and a great learning experience. At the end of our meeting, we will be having a free raffle. We will be giving away 1 free Windows 7 Ultimate license and 2 free Silverlight Spy licenses, as well as several books and other giveaways. Usually, everybody goes home with a freebie. We will also continue having ample networking time while enjoying free pizza/soda sponsored by Sherlock Technology and SISCO Corporation, a new sponsor of our group. Koen Zwikstra, Silverlight MVP and Founder of First Floor Software, has kindly offered the West Palm Beach .Net User Group several free licenses of Silverlight Spy to raffle during our meetings. We will start by raffling two copies during our May meeting. Silverlight Spy is a very valuable tool in debugging Silverlight applications. It has been mentioned at MIX10 (http://firstfloorsoftware.com/blog/silverlight-spy-at-mix10/) as well as by Microsoft Community Leaders (http://blogs.msdn.com/chkoenig/archive/2008/08/29/silverlight-spy.aspx). I am using Silverlight Spy myself and will probably be using it to demonstrate Silverlight internals during my talks. I think Koen's gift to our group will bring great value to our fortunate members who end up winning the licenses. Thank you Koen for your kind gift, and I am looking forward to meeting you all on May 25th 2010 at 6:30 PM at CompTec (http://www.fladotnet.com/Reg.aspx?EventID=462). Sam Abraham, Site Director - West Palm Beach .Net User Group


  • Bloggers Unite at Annual OpenWorld Blogger Meetup

    - by Bob Rhubart
    OTN is pleased to once again be working with our friends at Pythian to sponsor the 2012 edition of the annual Blogger Meet-up, a grassroots community event that occurs during Oracle OpenWorld.
    What: Oracle Bloggers Meetup 2012
    When: Wed, 3-Oct-2012, 5:30pm
    Where: Main Dining Room, Jillian’s Billiards @ Metreon, 101 Fourth Street, San Francisco, CA 94103 (street view). Please RSVP
    The meet-up was started several years ago by Oracle ACE Director Mark Rittman as a casual get-together for the community of bloggers who share a common interest in Oracle technologies to gather for shop talk and an adult beverage or two. This was the first opportunity for many of these bloggers to meet face to face, and the gathering gained momentum. A few years in, Oracle ACE Director Eddie Awad stepped in as organizer, until passing the torch in 2009 to Pythian CTO and Oracle ACE Director Alex Gorbachev, who continues to keep the flame burning. Bloggers with expertise in any and all Oracle products and technologies are invited to attend. Please RSVP via the comments section in Alex's blog. Several blog aggregators are available that make it possible to track the activity of this community of bloggers. You'll find a list here (orafaq.com)


  • MYSQL backup and restore

    - by Codezy
    I am having trouble getting mysql backups to run properly when there are views in the database. I think this might have something to do with needing a placeholder object for it. In any event I run this command: mysqldump -u myuser -pmypassword mydatabase | mysql -u myuser -pmypassword -C mydatabase_Beta The user has full privileges and I get this: View mydatabase_beta.yadayada references invalid tables or columns or functions or definer/invoker or view lack rights to use them. How can I back it up so that it restores all of my database properly? In the example I am restoring it to a different name, but I do need to be able to restore a working copy. I think it is probably an additional mysqldump parameter or maybe hot copy would work better. Thoughts?
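    Not an authoritative fix, but one common workaround for view/definer errors during a dump is to write the dump to a file first (so individual errors stay visible) and let mysqldump continue past the objects it cannot export. The flags below are standard mysqldump options; the file name is just an example, and the restore line uses cmd/bash-style input redirection:

        # Dump to a file, including routines and triggers, and keep going past broken views
        mysqldump -u myuser -pmypassword --single-transaction --routines --triggers --force --result-file=mydatabase.sql mydatabase

        # Restore the dump into the copy
        mysql -u myuser -pmypassword mydatabase_Beta < mydatabase.sql

    If errors persist, they usually point at a view whose DEFINER no longer exists or lacks privileges; recreating the view with a valid definer (or dumping with an account that has the SHOW VIEW privilege) is typically the underlying fix.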


  • Best way to fix security problems caused by windows updates?

    - by Chris Lively
    I have a laptop running Windows 7 32-bit. Last night's security updates caused my Logitech mouse to stop working (specifically, they caused several USB ports to stop working altogether). After reviewing the system event log I found that the IPBusEnum component was failing due to an activation security error. A little more research and I found that this was caused by the TrustedInstaller replacing the security permissions on those keys and generally mucking them up. To fix this I had to open regedit, take ownership of ALL the keys related to IPBusEnum and force them to use the inherited permissions from the tree. Is there a better way to fix this when MS screws up the updates? I would hate to have to walk around to a number of machines and manually fix the registry key security settings.
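    For what it's worth, the manual steps described in the question (take ownership, then put the keys back on inherited permissions) can be scripted. This is only a rough, untested sketch: the key path is a placeholder, it assumes an elevated session, and it assumes ownership of the key has already been taken (in regedit or via SetOwner):

        # Placeholder path - substitute the actual IPBusEnum-related key
        $key = 'HKLM:\SYSTEM\CurrentControlSet\Enum\SomeDeviceKey'

        $acl = Get-Acl -Path $key
        # $false = stop protecting the ACL, i.e. inherit permissions from the parent key again
        $acl.SetAccessRuleProtection($false, $false)
        Set-Acl -Path $key -AclObject $acl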


  • How to create managed properties at site collection level in SharePoint2013

    - by ybbest
    In SharePoint2013, you can create managed properties at the site collection level. Today, I’d like to show you how to do so through PowerShell.

    1. Define your managed properties, crawled properties and managed property type in an external csv file. The PowerShell script will read this file and create the managed properties and the mappings.

    2. As you can see, I also defined the variant type; this is because you need the variant type to create the crawled property. In order to have the crawled properties, you need to do a full crawl and also make sure you have data populated for your custom column. However, if you do not want to do a full crawl to create those crawled properties, you can create them yourself using PowerShell; you just need to make sure the crawled properties you create have the same names as those a full crawl would create.
    Managed property types:
        Text = 1
        Integer = 2
        Decimal = 3
        DateTime = 4
        YesNo = 5
        Binary = 6
    Variant types:
        Text = 31
        Integer = 20
        Decimal = 5
        DateTime = 64
        YesNo = 11

    3. You can use the following script to create your managed properties at the site collection level; the difference when creating a managed property at the site collection level is that you pass in the site collection id.

        param(
            [string] $siteUrl = "http://SP2013/",
            [string] $searchAppName = "Search Service Application",
            $ManagedPropertiesList = (IMPORT-CSV ".\ManagedProperties.csv")
        )
        Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
        $searchapp = $null

        function AppendLog
        {
            param ([string] $msg, [string] $msgColor)
            $currentDateTime = Get-Date
            $msg = $msg + " --- " + $currentDateTime
            if (!($logOnly -eq $True))
            {
                # write to console
                Write-Host -f $msgColor $msg
            }
            # write to log file
            Add-Content $logFilePath $msg
        }

        $scriptPath = Split-Path $myInvocation.MyCommand.Path
        $logFilePath = $scriptPath + "\CreateManagedProperties_Log.txt"

        function CreateRefiner
        {
            param ([string] $crawledName, [string] $managedPropertyName, [Int32] $variantType, [Int32] $managedPropertyType, [System.GUID] $siteID)
            $cat = Get-SPEnterpriseSearchMetadataCategory -Identity SharePoint -SearchApplication $searchapp
            $crawledproperty = Get-SPEnterpriseSearchMetadataCrawledProperty -Name $crawledName -SearchApplication $searchapp -SiteCollection $siteID
            if ($crawledproperty -eq $null)
            {
                Write-Host
                AppendLog "Creating Crawled Property for $managedPropertyName" Yellow
                $crawledproperty = New-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $searchapp -VariantType $variantType -SiteCollection $siteID -Category $cat -PropSet "00130329-0000-0130-c000-000000131346" -Name $crawledName -IsNameEnum $false
            }
            $managedproperty = Get-SPEnterpriseSearchMetadataManagedProperty -Identity $managedPropertyName -SearchApplication $searchapp -SiteCollection $siteID -ErrorAction SilentlyContinue
            if ($managedproperty -eq $null)
            {
                Write-Host
                AppendLog "Creating Managed Property for $managedPropertyName" Yellow
                $managedproperty = New-SPEnterpriseSearchMetadataManagedProperty -Name $managedPropertyName -Type $managedPropertyType -SiteCollection $siteID -SearchApplication $searchapp -Queryable:$true -Retrievable:$true -FullTextQueriable:$true -RemoveDuplicates:$false -RespectPriority:$true -IncludeInMd5:$true
            }
            $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ? { $_.Name -eq $managedProperty.Name }
            if ($mappedProperty -eq $null)
            {
                Write-Host
                AppendLog "Creating Crawled -> Managed Property mapping for $managedPropertyName" Yellow
                New-SPEnterpriseSearchMetadataMapping -CrawledProperty $crawledproperty -ManagedProperty $managedproperty -SearchApplication $searchapp -SiteCollection $siteID
            }
            $mappedProperty = $crawledproperty.GetMappedManagedProperties() | ? { $_.Name -eq $managedProperty.Name }
            #Get-FASTSearchMetadataCrawledPropertyMapping -ManagedProperty $managedproperty
        }

        $searchapp = Get-SPEnterpriseSearchServiceApplication $searchAppName
        $site = Get-SPSite $siteUrl
        $siteId = $site.id
        Write-Host "Start creating Managed properties"
        $i = 1
        FOREACH ($property in $ManagedPropertiesList)
        {
            $propertyName = $property.managedPropertyName
            $crawledName = $property.crawledName
            $managedPropertyType = $property.managedPropertyType
            $variantType = $property.variantType
            Write-Host $managedPropertyType
            Write-Host "Processing managed property $propertyName $($i)..."
            $i++
            CreateRefiner $crawledName $propertyName $variantType $managedPropertyType $siteId
            Write-Host "Managed property created " $propertyName
        }

    Key Concepts
    Crawled Properties: Crawled properties are discovered by the search index service component when crawling content.
    Managed Properties: Properties that are part of the Search user experience, which means they are available for search results, advanced search, and so on, are managed properties.
    Mapping Crawled Properties to Managed Properties: To make a crawled property available for the Search experience - to make it available for Search queries and display it in Advanced Search and search results - you must map it to a managed property.

    References
    - Administer search in SharePoint 2013 Preview
    - Managing Metadata
    - New-SPEnterpriseSearchMetadataCrawledProperty
    - New-SPEnterpriseSearchMetadataManagedProperty
    - Remove-SPEnterpriseSearchMetadataManagedProperty
    - Overview of crawled and managed properties in SharePoint 2013 Preview
    - Remove-SPEnterpriseSearchMetadataManagedProperty
    - SharePoint 2013 – Search Service Application


  • Silverlight Cream for June 17, 2010 -- #885

    - by Dave Campbell
    In this Issue: Zoltan Arvai, Antoni Dol, Jeff Prosise, David Anson, and John Papa. Shoutouts: Rob Davis has a World Cup Football Stadium tour in Silverlight, Azure, and Bing Maps up: The World Cup Map... cruise around this... tons of features. The Silverlight Team Blog reports that NBC sports is streaming the US Open in Silverlight Adam Kinney announced Expression Studio 4 Launch keynote videos are available From SilverlightCream.com: Data Driven Applications with MVVM Part III: Validation, Bringing the UI Closer Zoltan Arvai's 3rd (and final) part of the Data-Driven MVVM apps is up at SilverlightShow. In this final section he is focusing on validation, and discussion of closer integration to the view. Focus on FocusVisualElement in Silverlight buttons Antoni Dol has a cool post up about the FocusVisualElement, and uses a button to demonstrate how it can be used. Dynamic XAP Discovery with Silverlight MEF Jeff Prosise is discussing Silverlight and MEF ... but better than the normal loading XAP files ... he's doing dynamic discovery of XAP files ... and makes it look easy! Updated analysis of two ways to create a full-size Popup in Silverlight David Anson revisits a prior post with an eye toward Silverlight 4. The feature he's discussing is that you can now hook the Resized event without having browser zoom disabled... and he demonstrates it's use in the code from the old post. Silverlight as a Transmedia Platform (Silverlight TV #33) Jesse Liberty joins John Papa this week with Silverlight TV #33, discussing Transmedia and Silverlight as a Transmedia Platform. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10


  • SQL SERVER – Integrate Your Data with Skyvia – Cloud ETL Solution

    - by Pinal Dave
    These days, data integration often becomes a key aspect of business success. For business analysts it’s very important to get integrated data from various sources, such as relational databases, cloud CRMs, etc., to make correct and successful decisions. There are various data integration solutions on the market, and today I will tell you about one of them – Skyvia. Skyvia is a cloud data integration service which allows integrating data in cloud CRMs and different relational databases. It is a completely online solution and does not require anything except for a browser. Skyvia provides powerful ETL tools for data import, export, replication, and synchronization for SQL Server and other databases and cloud CRMs.

    You can use Skyvia data import tools to load data from various sources to SQL Server (and SQL Azure). Skyvia supports such cloud CRMs as Salesforce and Microsoft Dynamics CRM and such databases as MySQL and PostgreSQL. You can even migrate data from SQL Server to SQL Server, or from SQL Server to other databases and cloud CRMs. Additionally Skyvia supports import of CSV files, either uploaded manually or stored on cloud file storage services, such as Dropbox, Box, Google Drive, or FTP servers.

    When data import is not enough, Skyvia offers bidirectional data synchronization. With this tool, you can synchronize SQL Server data with other databases and cloud CRMs. After performing the first synchronization, Skyvia tracks data changes in the synchronized data storages. In SQL Server databases (and other relational databases) it creates additional tracking tables and triggers. This allows synchronizing only the changed data. Skyvia also maps records by their primary key values to each other, so it does not require different sources to have the same primary key structure. It can still match the corresponding records without having to add any additional columns or change the data structure. The only requirement for synchronization is that primary keys must be autogenerated.

    With Skyvia it’s not necessary for data to have the same structure in integrated data storages. Skyvia supports powerful mapping mechanisms that allow synchronizing data with completely different structures. It provides support for complex mathematical and string expressions when mapping data, using lookups, etc. You may use data splitting – loading data from a single CSV file or source table to multiple related target tables. Or you may load data from several source CSV files or tables to several related target tables. In each case Skyvia preserves data relations. It builds corresponding relations between the target data automatically.

    When you often work with cloud CRM data, native CRM data reporting and analysis tools may not be enough for you. And there is a vast set of professional data analysis and reporting tools available for SQL Server. With Skyvia you can quickly copy your cloud CRM data to an SQL Server database and apply corresponding SQL Server tools to the data. In that case you can use Skyvia data replication tools. They allow you to quickly copy cloud CRM data to SQL Server or other databases without customizing any mapping. You just need to specify the columns to copy data from. Target database tables will be created automatically. Skyvia offers powerful filtering settings to replicate only the records you need. Skyvia also provides the capability to export data from SQL Server (including SQL Azure) and other databases and cloud CRMs to CSV files.
    These files can either be downloaded manually or loaded to cloud file storage or an FTP server. You can use export, for example, to back up SQL Azure data to Dropbox. Any data integration operation can be scheduled for automatic execution. Thus, you can automate your SQL Azure data backup or data synchronization – just configure it once, then schedule it, and benefit from automatic data integration with Skyvia. Currently, registration and use of Skyvia are completely free, so you can try it yourself and find out whether its data migration and integration tools suit you. Visit this link to register on Skyvia: https://app.skyvia.com/register Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Cloud Computing


  • User profile and desktop unclickable and unviewable from windows explorer

    - by Abel
    Situation: Windows Vista, latest updates. After restarting to complete an installation, I find myself looking at a totally black Windows desktop without any icons. The start menu and taskbar, including quickstart icons, appear. Some, but not all, taskbar tray icons appear. The system seems stable. When I open Windows Explorer and click "Desktop" in the folder treeview, the cursor immediately jumps back to the previously selected item. No error. Same when clicking on my user's profile or My Documents. When I try "Save As" in, say, Notepad, nothing happens; the dialog box (which defaults to "My Documents") doesn't even show. Again, no error. Nothing serious, afaict, in the event log. Typing something into Start Search shows "Search failed to initialize". Anybody ever encountered such an abomination?


  • Progressive Enhancement vs. Single Page Apps

    - by SeanPlusPlus
    I just got back from a conference in Boston called An Event Apart. A really popular theme amongst the speakers was the idea of progressive enhancement - a site's content should go in the HTML, and JavaScript should only be used to enhance behavior. The arguments that the speakers gave for progressive enhancement were very compelling. Not only is it a solid pattern for supporting older browsers, and devices on a network with low bandwidth, but HTML fails much more gracefully than JavaScript (i.e. markup that is not supported is just ignored, while if a browser throws an exception while executing your script - you are hosed). Jeremy Keith gave a particularly insightful talk about this. But what about single page web apps like Backbone and Angular? The whole design behind these frameworks seems to push the developer toward moving content out of the HTML, and into something like a JSON API. I can not seem to gel these two design patterns: progressive enhancement vs. single page web apps. Are there instances when one is better than the other? Or are they not even antagonistic technologies, and I am missing something here with my mental model?


  • Oracle Text query parser

    - by Roger Ford
    Oracle Text provides a rich query syntax which enables powerful text searches. However, this syntax isn't intended for use by inexperienced end-users. If you provide a simple search box in your application, you probably want users to be able to type "Google-like" searches into the box, and have your application convert that into something that Oracle Text understands. For example if your user types "windows nt networking" then you probably want to convert this into something like "windows ACCUM nt ACCUM networking". But beware - "NT" is a reserved word, and needs to be escaped. So let's escape all words: "{windows} ACCUM {nt} ACCUM {networking}". That's fine - until you start introducing wild cards. Then you must escape only non-wildcarded searches: "win% ACCUM {nt} ACCUM {networking}". There are quite a few other "gotchas" that you might encounter along the way.

    Then there's the issue of scoring. Given a query for "oracle text query syntax", it would be nice if we could score a full phrase match higher than a hit where all four words are present but not in a phrase. And then perhaps lower than that would be a document where three of the four terms are present. Progressive relaxation helps you with this, but you need to code the "progression" yourself in most cases.

    To help with this, I've developed a query parser which will take queries in Google-like syntax, and convert them into Oracle Text queries. It's designed to be as flexible as possible, and will generate either simple queries or progressive relaxation queries. The input string will typically just be a string of words, such as "oracle text query syntax", but the grammar does allow for more complex expressions:
        word : score will be improved if word exists
        +word : word must exist
        -word : word CANNOT exist
        "phrase words" : words treated as phrase (may be preceded by + or -)
        field:(expression) : find expression (which allows +, - and phrase as above) within "field".
    So for example if I searched for
        +"oracle text" query +syntax -ctxcat
    then the results would have to contain the phrase "oracle text" and the word syntax. Any documents mentioning ctxcat would be excluded from the results.

    All the instructions are in the top of the file (see "Downloads" at the bottom of this blog entry). Please download the file, read the instructions, then try it out by running "parser.pls" in either SQL*Plus or SQL Developer. I am also uploading a test file "test.sql". You can run this and/or modify it to run your own tests or run against your own text index. test.sql is designed to be run from SQL*Plus and may not produce useful output in SQL Developer (or it may, I haven't tried it).

    I'm putting the code up here for testing and comments. I don't consider it "production ready" at this point, but would welcome feedback. I'm particularly interested in comments such as:
        "The instructions are unclear - I couldn't figure out how to do XXX"
        "It didn't work in my environment" (please provide as many details as possible)
        "We can't use it in our application" (why not?)
        "It needs to support XXX feature"
        "It produced an invalid query output when I fed in XXXX"
    Downloads: parser.pls test.sql
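    Purely as an illustration of the basic transformation described above - this is not the author's PL/SQL parser, just the word-escaping and ACCUM-joining rule expressed in a few lines of PowerShell, with a deliberately simplified wildcard check:

        function ConvertTo-OracleTextQuery {
            param([string]$UserQuery)
            $terms = $UserQuery -split '\s+' | Where-Object { $_ }
            $escaped = $terms | ForEach-Object {
                if ($_ -match '[%_]') { $_ }        # leave wildcarded terms unescaped
                else { '{' + $_ + '}' }             # brace-escape everything else (handles reserved words like NT)
            }
            $escaped -join ' ACCUM '
        }

        # ConvertTo-OracleTextQuery 'windows nt networking'  ->  {windows} ACCUM {nt} ACCUM {networking}
        # ConvertTo-OracleTextQuery 'win% nt networking'     ->  win% ACCUM {nt} ACCUM {networking}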


  • Ignoring Robots - Or Better Yet, Counting Them Separately

    - by [email protected]
    It is quite common to have web sessions that are undesirable from the point of view of analytics. For example, when there are either internal or external robots that check the site's health, index it or just extract information from it. These robotic sessions do not behave like humans and if their volume is high enough they can sway the statistics and models.

    One easy way to deal with these sessions is to define a partitioning variable for all the models that is a flag indicating whether the session is "Normal" or "Robot". Then all the reports and the predictions can use the "Normal" partition, while the counts and statistics for Robots are still available. In order for this to work, though, it is necessary to have two conditions:
    1. It is possible to identify the Robotic sessions.
    2. No learning happens before the identification of the session as a robot.
    The first point is obvious, but the second may require some explanation. While the default in RTD is to learn at the end of the session, it is possible to learn in any entry point. This is a setting for each model. There are various reasons to learn in a specific entry point, for example if there is a desire to capture exactly and precisely the data in the session at the time the event happened, as opposed to including changes up to the end of the session. In any case, if RTD has already learned on the session before the identification of a robot was done, there is no way to retract this learning.

    Identifying the robotic sessions can be done through the use of rules and heuristics. For example we may use some of the following (a simple sketch of this kind of check is shown below):
    - Maintain a list of known robotic IPs or domains
    - Detect very long sessions, lasting more than a few hours or visiting more than 500 pages
    - Detect "robotic" behaviors like a methodic click on all the links of every page
    - Detect a session with 10 pages clicked at exactly 20 second intervals
    - Detect extensive non-linear navigation
    Now, an interesting experiment would be to use the flag above as an output of a model to see if there are more subtle characteristics of robots, such that a model can be used to detect robots even if they fall through the cracks of rules and heuristics. In any case, the basic technique of partitioning the models by the type of session is simple to implement and provides a lot of advantages.
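    Purely to make the rule-based checks above concrete (the thresholds and parameter names are invented for the example; in a real deployment the flag would be computed from the session data inside RTD itself), here are a couple of those heuristics expressed as a tiny PowerShell function:

        function Get-SessionPartition {
            param(
                [int]$PageCount,
                [timespan]$Duration,
                [double[]]$SecondsBetweenClicks,
                [string[]]$KnownRobotDomains = @(),
                [string]$ClientDomain = ''
            )
            # Known robotic IPs/domains
            if ($KnownRobotDomains -contains $ClientDomain) { return 'Robot' }
            # Very long sessions: more than a few hours, or more than 500 pages
            if ($PageCount -gt 500 -or $Duration.TotalHours -gt 3) { return 'Robot' }
            # 10+ clicks at (almost) exactly 20-second intervals looks scripted
            $uniform = @($SecondsBetweenClicks | Where-Object { [math]::Abs($_ - 20) -lt 0.5 })
            if ($uniform.Count -ge 10) { return 'Robot' }
            return 'Normal'
        }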


  • Logparser and Powershell

    - by Michel Klomp
    Logparser in PowerShell
    One of the few examples of how to use Log Parser in PowerShell is from the Microsoft.com Operations blog. This script is a good base to create more advanced Log Parser scripts:

        $myQuery = new-object -com MSUtil.LogQuery
        $szQuery = "Select top 10 * from r:\ex07011210.log";
        $recordSet = $myQuery.Execute($szQuery)
        for(; !$recordSet.atEnd(); $recordSet.moveNext())
        {
            $record = $recordSet.getRecord();
            write-host ($record.GetValue(0) + "," + $record.GetValue(1));
        }
        $recordSet.Close();

    Logparser input formats
    The previous example uses the default Log Parser object; you can extend this with the Log Parser input formats. With these formats you can get information from the event log, different types of logfiles, the Active Directory, the registry and XML files. Here are the different ProgIds you can use:

        ADS       MSUtil.LogQuery.ADSInputFormat
        BIN       MSUtil.LogQuery.IISBINInputFormat
        CSV       MSUtil.LogQuery.CSVInputFormat
        ETW       MSUtil.LogQuery.ETWInputFormat
        EVT       MSUtil.LogQuery.EventLogInputFormat
        FS        MSUtil.LogQuery.FileSystemInputFormat
        HTTPERR   MSUtil.LogQuery.HttpErrorInputFormat
        IIS       MSUtil.LogQuery.IISIISInputFormat
        IISODBC   MSUtil.LogQuery.IISODBCInputFormat
        IISW3C    MSUtil.LogQuery.IISW3CInputFormat
        NCSA      MSUtil.LogQuery.IISNCSAInputFormat
        NETMON    MSUtil.LogQuery.NetMonInputFormat
        REG       MSUtil.LogQuery.RegistryInputFormat
        TEXTLINE  MSUtil.LogQuery.TextLineInputFormat
        TEXTWORD  MSUtil.LogQuery.TextWordInputFormat
        TSV       MSUtil.LogQuery.TSVInputFormat
        URLSCAN   MSUtil.LogQuery.URLScanLogInputFormat
        W3C       MSUtil.LogQuery.W3CInputFormat
        XML       MSUtil.LogQuery.XMLInputFormat

    Using logparser to parse IIS logs
    If you use the IISW3CInputFormat you can use the field names instead of the row number to get the information from an IIS logfile; it also skips the comment rows in the logfile.

        $ObjLogparser = new-object -com MSUtil.LogQuery
        $objInputFormat = new-object -com MSUtil.LogQuery.IISW3CInputFormat
        $Query = "Select top 10 * from c:\temp\hb\ex071002.log";
        $recordSet = $ObjLogparser.Execute($Query, $objInputFormat)
        for(; !$recordSet.atEnd(); $recordSet.moveNext())
        {
            $record = $recordSet.getRecord();
            write-host ($record.GetValue("s-ip") + "," + $record.GetValue("cs-uri-query"));
        }
        $recordSet.Close();
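    Following the same pattern - and only as a hedged sketch, since the query below is my own illustration using the EVT ProgId from the table above rather than something from the original post - the Windows event log can be queried the same way:

        $logQuery = new-object -com MSUtil.LogQuery
        $eventLogFormat = new-object -com MSUtil.LogQuery.EventLogInputFormat
        $query = "Select top 10 TimeGenerated, SourceName, EventTypeName from System";
        $recordSet = $logQuery.Execute($query, $eventLogFormat)
        for(; !$recordSet.atEnd(); $recordSet.moveNext())
        {
            $record = $recordSet.getRecord();
            # print the timestamp, source and event type of each entry
            write-host $record.GetValue("TimeGenerated") $record.GetValue("SourceName") $record.GetValue("EventTypeName");
        }
        $recordSet.Close();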


  • Part 15: Fail a build based on the exit code of a console application

    In the series the following parts have been published:
    Part 1: Introduction
    Part 2: Add arguments and variables
    Part 3: Use more complex arguments
    Part 4: Create your own activity
    Part 5: Increase AssemblyVersion
    Part 6: Use custom type for an argument
    Part 7: How is the custom assembly found
    Part 8: Send information to the build log
    Part 9: Impersonate activities (run under other credentials)
    Part 10: Include Version Number in the Build Number
    Part 11: Speed up opening my build process template
    Part 12: How to debug my custom activities
    Part 13: Get control over the Build Output
    Part 14: Execute a PowerShell script
    Part 15: Fail a build based on the exit code of a console application

    When you have a Console Application or a batch file that has errors, the exit code is set to a value other than 0. You would expect that the build would see this and report an error. This is not true, however. First we set up the scenario.
    - Add a ConsoleApplication project to the solution you are building.
    - In the Main function, set the ExitCode to 1:

        class Program
        {
            static void Main(string[] args)
            {
                Console.WriteLine("This is an error in the script.");
                Environment.ExitCode = 1;
            }
        }

    - Check in the code. You can choose to include this Console Application in the build, or you can decide to add the exe to source control.
    Now modify the Build Process Template CustomTemplate.xaml:
    - Add an argument ErrornousScript
    - Scroll down beneath the TryCatch activity called "Try Compile, Test, and Associate Changesets and Work Items"
    - Add a Sequence activity to the template
    - In the Sequence, add a ConvertWorkspaceItem and an InvokeProcess activity (see Part 14: Execute a PowerShell script for more detailed steps)
    - In the FileName property of the InvokeProcess, use the ErrornousScript so the ConsoleApplication will be called.
    Modify the build definition and make sure that the ErrornousScript is executing the exe that is setting the ExitCode to 1. You have now set up a build definition that will execute the failing Console Application. When you run it, you will see that the build succeeds. This is not what you want! To solve this, you can make use of the Result property on the InvokeProcess activity. So let's change our Build Process Template:
    - Add new variables (scoped to the sequence where you run the Console Application) called ExitCode (type = Int32) and ErrorMessage
    - Click on the InvokeProcess activity and change the Result property to ExitCode
    - In the Handle Standard Output of the InvokeProcess, add a Sequence activity
    - In the Sequence activity, add an Assign primitive. Set the following properties: To = ErrorMessage, Value = If(Not String.IsNullOrEmpty(ErrorMessage), Environment.NewLine + ErrorMessage, "") + stdOutput
    - And add the default BuildMessage to the sequence that outputs the stdOutput
    - Add beneath the InvokeProcess activity an If activity with the condition ExitCode <> 0
    - In the Then section add a Throw activity and set the Exception property to New Exception(ErrorMessage)
    The complete workflow now looks like this. When you now check in the Build Process Template and run the build, you get the following result. And that is exactly what we want.
    You can download the full solution at BuildProcess.zip. It will include the sources of every part and will continue to evolve.


  • Financial planning among the works of Peggy Guggenheim

    - by user812481
    On June 22nd, in the wonderful setting of the Palazzo Venier dei Leoni in Venice, the CFO Executive meeting & event on Cash Flow Planning & Optimization took place. The event, which began with a networking lunch, allowed guests to enjoy the fantastic view from the palazzo's panoramic terrace overlooking the Grand Canal. During the sessions, Oracle and Reply Consulting, partners of the event, spoke about corporate finance strategy and the value of integrated economic, financial and balance-sheet planning. Thanks to the participation of Banca IMI, the topics of Business Plan, Sensitivity Analysis and Covenant Test in structured finance transactions could be explored in depth. AITI (Associazione Italiana Tesorieri d'Impresa, the Italian Association of Corporate Treasurers) closed the sessions with a 360° view of financial planning, explaining the strategic path required for the capital flows that support the business.
    Here is the list of talks:
    - The value of integrated economic, financial and balance-sheet planning for the CFO in corporate governance processes - Lorenzo Mariani, Partner - Reply Consulting
    - Business Plan, Sensitivity Analysis and Covenant Test in structured finance transactions: applications in the credit granting and risk monitoring phases - Gianluca Vittucci, Head of Structured Finance, Banca dei Territori - Banca IMI
    - From corporate finance strategy to operational planning: a complete and integrated view of the economic, financial and balance-sheet planning process - Edilio Rossi, EPM Business Development Manager, Italy - Oracle EMEA
    - Financial planning: a strategic path to optimise the capital flows that support the development of the company's business; a basic process in relations with the banking system - Giovanni Ceci, AITI Board Member and Temporary Finance Manager - Associazione Italiana Tesorieri d'Impresa
    To view all the presentations, follow us on slideshare. To view all the photos from the day, click here.


  • Looking for application performance tracking software

    - by JavaRocky
    I have multiple Java-based applications which produce statistics on how long method calls take. Right now the information is being written into a log file and I analyse performance that way. However, with multiple apps and more monitoring requirements this is becoming a bit overwhelming. I am looking for an application which will collect stats and graph them so I can analyse performance and be aware of performance degradation. I have looked at SolarWinds Application Performance Monitoring; however, that polls periodically to gather information. My applications are totally event based and we would like to graph and track this accordingly. I almost started hacking together some scripts to produce Google Charts, but surely there are applications which do this already. Suggestions?


  • New Release of Oracle EPM (Enterprise Performance Management)

    - by Theresa Hickman
    I'm a huge fan of Hyperion products and consider Hyperion to be one of the best acquisitions Oracle has made in terms of applications. So I am really excited to talk about their latest release, Release 11.1.2 of the Oracle EPM System. This is EPM's largest release in 2 years, and it's jam-packed with new modules and features. In terms of brand new products, there are three: 1. Public Sector Planning and Budgeting meets the needs of public sector agencies, higher education, governments, etc. that have complex budget requirements. It supports position or employee-based budgeting and integrates with MS Office and your ERP ledgers to perform commitment control. 2. Hyperion Financial Close Management is a complete financial close solution that orchestrates the entire close process from subledgers and general ledger to financial reporting and disclosure submissions. And of course, it is integrated with GL systems and consolidation systems. I saw a demo of this and it looked pretty slick. They have this unified close calendar that looks like a regular calendar that gives each person participating in the close process a task list. It comes with a Gantt chart that shows the relationships and dependencies among closing tasks. There are dashboards to allow you to track the close progress and completion of tasks as well as perform trend analysis and see how much time is being spent on different activities in the close process. This gives you visibility that you never had before to understand where the bottlenecks are and where improvements could be made. I think what I liked best about this product was that it provides a central place for all participants to communicate their progress. When I worked as an Accountant, we used ad hoc tools, such as spreadsheets, Word documents, emails, and phone calls during the close process. I like the idea of having a central system to track the overall progress as well as automate the entire financial close process. Who knows, maybe Accountants won't have to revolve their lives around the month end close anymore with a tool like this. Those periodic fire drills can become predictable, well managed processes. 3. Disclosure Management is an out-of-the-box, pre-packaged XBRL solution to meet statutory reporting requirements. This product is really going to help companies improve the timeliness of producing financial reports. Reports can be authored using MS Word and Excel and then XBRL instance documents can be produced with its embedded XBRL tags. It even supports footnotes and disclosures of non-financial information. With a product like this, companies no longer have to outsource their XBRL filing; they can bring it back in house to save costs and time. In terms of other enhancements, they have ERP Integrator that provides integration and drill downs from Hyperion products to source systems, such as Oracle E-Business Suite, PeopleSoft, and SAP. No other vendor offers this level of integration. There's also a new product that links Oracle Essbase directly to Hyperion Financial Management for internal financial reporting, and new integrations between Hyperion Financial Management and Oracle's GRC products. They also improved the usability of Oracle Hyperion Planning. They made it much easier for end users to use the system via the web or via MS Excel when submitting plans and budgets. It is also integrated with intelligent approval workflows that are data-driven, user-configurable, and scenario-specific to efficiently streamline the budgeting process. 
Here's the press release from April 7, 2010. Here's the pre-recorded web cast where you can see the demos. Just register and watch the hour long presentation. And finally, here's the newsletter


  • Exception Handling Frequency/Log Detail

    - by Cyborgx37
    I am working on a fairly complex .NET application that interacts with another application. Many single-line statements are possible culprits for throwing an Exception and there is often nothing I can do to check the state before executing them to prevent these Exceptions. The question is, based on best practices and seasoned experience, how frequently should I lace my code with try/catch blocks? I've listed three examples below, but I'm open to any advice. I'm really hoping to get some pros/cons of various approaches. I can certainly come up with some of my own (greater log granularity for the O-C approach, better performance for the Monolithic approach), so I'm looking for experience over opinion.
    EDIT: I should add that this application is a batch program. The only "recovery" necessary in most cases is to log the error, clean up gracefully, and quit. So this could be seen to be as much a question of log granularity as exception handling. In my mind's eye I can imagine good reasons for both, so I'm looking for some general advice to help me find an appropriate balance.

    Monolithic Approach

        class Program
        {
            public static void Main()
            {
                try { Step1(); Step2(); Step3(); }
                catch (Exception e) { Log(e); }
                finally { CleanUp(); }
            }
            public static void Step1() { ExternalApp.Dangerous1(); ExternalApp.Dangerous2(); }
            public static void Step2() { ExternalApp.Dangerous3(); ExternalApp.Dangerous4(); }
            public static void Step3() { ExternalApp.Dangerous5(); ExternalApp.Dangerous6(); }
        }

    Delegated Approach

        class Program
        {
            public static void Main()
            {
                try { Step1(); Step2(); Step3(); }
                finally { CleanUp(); }
            }
            public static void Step1()
            {
                try { ExternalApp.Dangerous1(); ExternalApp.Dangerous2(); }
                catch (Exception e) { Log(e); throw; }
            }
            public static void Step2()
            {
                try { ExternalApp.Dangerous3(); ExternalApp.Dangerous4(); }
                catch (Exception e) { Log(e); throw; }
            }
            public static void Step3()
            {
                try { ExternalApp.Dangerous5(); ExternalApp.Dangerous6(); }
                catch (Exception e) { Log(e); throw; }
            }
        }

    Obsessive-Compulsive Approach

        class Program
        {
            public static void Main()
            {
                try { Step1(); Step2(); Step3(); }
                finally { CleanUp(); }
            }
            public static void Step1()
            {
                try { ExternalApp.Dangerous1(); } catch (Exception e) { Log(e); throw; }
                try { ExternalApp.Dangerous2(); } catch (Exception e) { Log(e); throw; }
            }
            public static void Step2()
            {
                try { ExternalApp.Dangerous3(); } catch (Exception e) { Log(e); throw; }
                try { ExternalApp.Dangerous4(); } catch (Exception e) { Log(e); throw; }
            }
            public static void Step3()
            {
                try { ExternalApp.Dangerous5(); } catch (Exception e) { Log(e); throw; }
                try { ExternalApp.Dangerous6(); } catch (Exception e) { Log(e); throw; }
            }
        }

    Other approaches welcomed and encouraged. Above are examples only.


  • Remote Desktop Client Crashes following domain join

    - by Roberto Charlie Ciarleglio
    I recently joined my laptop to our Windows domain and now the Remote Desktop client crashes when I try to connect to any machine. It works if I run it as administrator, but not ordinarily. The domain join migrated my local profile to the domain profile, which I think is where the problem lies. I'm guessing it's a permissions thing, as I had a similar problem with Dropbox and had to delete reg keys and reinstall. I can't figure out how to fix this problem though. The event viewer shows this:
    Faulting application name: mstsc.exe, version: 6.1.7601.17514, time stamp: 0x4ce7ab44
    Faulting module name: FACredProv2.dll, version: 2.4.95.1, time stamp: 0x4bb8d766
    Exception code: 0xc0000005
    Fault offset: 0x00000000000025b2
    Faulting process id: 0xb24
    Faulting application start time: 0x01cd43fbd3a81fba
    Faulting application path: C:\Windows\System32\mstsc.exe
    Faulting module path: C:\Windows\System32\FACredProv2.dll
    Report Id: 154ee55a-afef-11e1-a443-b8ac6f704c5d
    Any help would be appreciated!


  • Middle Mouse Button does not work in XFCE / Arch Linux

    - by Alp
    I have the XFCE desktop manager installed on my Arch Linux system. With E17 (the Enlightenment desktop manager) I had no problems with my mouse: all buttons worked correctly out of the box. But in XFCE my middle mouse button does not fire an event at all (no output with xev). Evdev seems to identify my mouse correctly (Razer Deathadder) because it echoes its name in the Xorg logs. I have no idea what could cause this or how to debug the problem. I start both E17 and XFCE with startx. Here is my ~/.xinitrc:

        exec startxfce4 --with-ck-launch
        #exec enlightenment-start


  • Issue with IIS...SERVER APPLICATION UNAVAILABLE

    - by SVI
    My web application was running absolutely fine. Just 2 days back, I got an error saying: SERVER APPLICATION UNAVAILABLE. I am pretty certain that nothing was changed on IIS - unless my automatic Windows updates screwed it up completely. My event viewer had zillions of the following errors in the Application category:
    aspnet_wp.exe could not be started. The error code for the failure is C0000005. This error can be caused when the worker process account has insufficient rights to read the .NET Framework files. Please ensure that the .NET Framework is correctly installed and that the ACLs on the installation directory allow access to the configured account.
    I reinstalled IIS. After installing, I ran aspnet_regiis -i for framework v2 and now it throws an error saying: The application could not be initialized properly. Any ideas what's going on?


  • "Has Oracle written the script for CRM success?" - Anthony Lye on Customer Experience at BAFTA

    - by Richard Lefebvre
    Anthony Lye showcased Oracle Fusion CRM at a BAFTA gathering, and MyCustomer.com covered the story under the title of "Has Oracle written the script for CRM success?' According to MyCustomer.com, "Oracle's SVP of CRM Anthony Lye set the scene for the event, suggesting products are becoming commoditized, so that the only way to differentiate is through the relationship with the customer. But he warned that "customers are more and more in control of that relationship, so you have to provide great experiences for them." "The quickest win within your organization to create a single view is to connect your marketing organization with your selling organization, align goals, processes, people and technology," Anthony explained.   "And this is a transition that is already happening - "VPs of marketing have started turning up in the same meetings as VPs of sales, we have started to see that they want to work together" - but this convergence needs nurturing." "In Fusion there are capabilities to align the organisation - we enable marketing on the same platform to build campaigns connected to sales stages. It can affect leads and opportunities at the top end of the funnel. And the selling organisation can take advantage of marketing content - the materials that are exclusively within marketing can now be used by sales. Your sales teams have been campaigning forever, but it's usually by email, it isn't aligned with the corporate message and it's being sent to people it shouldn't. By aligning them we can increase output and the quality of that output." Anthony concluded: "Operating in a disconnected fashion having two distinct systems will cost you time and money. So we feel there's a material advantage in a solution like this." Enjoy the full story at http://www.mycustomer.com/topic/marketing/has-oracle-written-script-crm-success/139958


  • ArchBeat Link-o-Rama for 2012-06-19

    - by Bob Rhubart
    Discussion: Public, Private, and Hybrid Clouds A conversation about the similarities and differences between public, private, and hybrid clouds; the connection between cows, condos, and cloud computing; and what architects need to know in order to take advantage of cloud computing. (OTN ArchBeat Podcast transcript) InfoQ: Current Trends in Enterprise Mobility Interesting infographics that show current developments and major trends in enterprise mobility. Recap: EMEA User Group Leaders Meeting Latvia May 2012 Tom Scheirsen recaps the recent IOUC event in Riga. Oracle Fusion Middleware Summer Camps in Lisbon: Includes Advanced ADF Training by Oracle Product Management This is how IT people deal with the Summertime Blues. Enterprise 2.0 Conference: Building Social Business | Oracle WebCenter Blog Kellsey Ruppel shares a list of E2.0 conference sessions being presented by members of the Oracle community. Linux 6 Transparent Huge Pages and Hadoop Workloads | Structured Data Greg Rahn documents a problem. BPM Standard Edition to start your BPM project "BPM Standard Edition is an entry level BPM offering designed to help organisations implement their first few processes in order to prove the value of BPM within their own organisation." Troubleshooting ADF Security 11g Login Page Failure | Andrejus Baranovskis Oracle ACE Director Andrejus Baranovskis takes a deep dive into one of the most common ADF 11g Security issues. It's Alive! - The Oracle OpenWorld Content Catalog It's what you’ve been waiting for—the central repository for information on sessions, demos, labs, user groups, exhibitors, and more. 5 minutes or less: Indexing Attributes in OID | Andre Correa Fusion Middleware A-Team blogger Andre Correa offers help for those who encounter issues when running searches with LDAP filters against OID (Oracle Internet Directory). Condos and Clouds: Thinking about Cloud Computng by Looking at Condominiums | Pat Helland In part two of the OTN ArchBeat Podcast Public, Private, and Hybrid Clouds, Oracle Cloud chief architect Mark Nelson mentions an analogy by Pat Helland that compares condos to cloud computing. After some digging I found the October 2011 presentation in which Helland explains that analogy. Thought for the Day "I have always found that plans are useless, but planning is indispensable." — Dwight Eisenhower (October 14, 1890 – March 28, 1969) Source: Quotes for Software Engineers


  • Java Spotlight Episode 103: 2012 Duke Choice Award Winners

    - by Roger Brinkley
    Our annual interview with the 2012 Duke Choice Award Winners recorded live at JavaOne 2012. Right-click or Control-click to download this MP3 file. You can also subscribe to the Java Spotlight Podcast Feed to get the latest podcast automatically. If you use iTunes you can open iTunes and subscribe with this link: Java Spotlight Podcast in iTunes.

    Show Notes
    Events
    - Oct 13, Devoxx 4 Kids Nederlands
    - Oct 15-17, JAX London
    - Oct 20, Devoxx 4 Kids Français
    - Oct 22-23, Freescale Technology Forum - Japan, Tokyo
    - Oct 30-Nov 1, Arm TechCon, Santa Clara
    - Oct 31, JFall, Netherlands
    - Nov 2-3, JMagreb, Morocco
    - Nov 13-17, Devoxx, Belgium

    Feature Interview
    Duke Choice Award Winners 2012 - Show Presentation
    - London Java Community: The second user group receiving a Duke's Choice Award this year, the London Java Community (LJC) and its users have been active in the OpenJDK, the Java Community Process (JCP) and other efforts within the global Java community.
    - Student Nokia Developer Group: This year's student winner, Ram Kashyap, is the founder and president of the Nokia Student Network, and was profiled in the "The New Java Developers" feature in the March/April 2012 issue of Java Magazine. Since then, Ram has maintained a hectic pace, graduating from the People's Education Society Institute of Technology in Bangalore, India, while working on a Java mobile startup and training students on Java ME.
    - Jelastic, Inc.: Moving existing Java applications to the cloud can be a daunting task, but startup Jelastic, Inc. offers the first all-Java platform-as-a-service (PaaS) that enables existing Java applications to be deployed in the cloud without code changes or lock-in.
    - NATO: The first-ever Community Choice Award goes to the MASE Integrated Console Environment (MICE) in use at NATO. Built in Java on the NetBeans platform, MICE provides a high-performance visualization environment for conducting air defense and battle-space operations.
    - Duchess: Rather than focus on a specific geographic area like most Java User Groups (JUGs), Duchess fosters the participation of women in the Java community worldwide. The group has more than 500 members in 60 countries, and provides a platform through which women can connect with each other and get involved in all aspects of the Java community.
    - AgroSense Project: Improving farming methods to feed a hungry world is the goal of AgroSense, an open source farm information management system built in Java and the NetBeans platform. AgroSense enables farmers, agribusinesses, suppliers and others to develop modular applications that will easily exchange information through a common underlying NetBeans framework.
    - Apache Software Foundation Hadoop Project: The Apache Software Foundation's Hadoop project, written in Java, provides a framework for distributed processing of big data sets across clusters of computers, ranging from a few servers to thousands of machines. This harnessing of large data pools allows organizations to better understand and improve their business.
    - Parleys.com: E-learning specialist Parleys.com, based in Brussels, Belgium, uses Java technologies to bring online classes and full IT conferences to desktops, laptops, tablets and mobile devices. Parleys.com has hosted more than 1,700 conferences - including Devoxx and JavaOne - for more than 800,000 unique visitors.

    Winners not presenting at the JavaOne 2012 Duke Choice Awards BOF
    - Liquid Robotics: Robotics - Liquid Robotics is an ocean data services provider whose Wave Glider technology collects information from the world's oceans for application in government, science and commercial applications. The organization features the "father of Java" James Gosling as its chief software architect.
    - United Nations High Commissioner for Refugees: The United Nations High Commissioner for Refugees (UNHCR) is on the front lines of crises around the world, from civil wars to natural disasters. To help facilitate its mission of humanitarian relief, the UNHCR has developed a light-client Java application on the NetBeans platform. The Level One registration tool enables the UNHCR to collect information on the number of refugees and their water, food, housing, health, and other needs in the field, and combines that with geocoding information from various sources. This enables the UNHCR to deliver the appropriate kind and amount of assistance where it is needed.


  • SQL Server Restore from Backup, Just primary File Group

    - by bladefist
    Thankfully, this question is just a what-if, and I am not in an emergency right now. But I have created a file group in my database (SQL Server 2008) and moved some massive data tables over to it, leaving my website's central tables in the Primary file group. In the event of a restore, can I restore just the primary file group and have a working database? Or do I have to restore both file groups? I don't want my site down for ages while it restores the 2nd file group.
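    Not an authoritative answer, just a hedged sketch of what a partial ("piecemeal") restore of only the primary filegroup looks like in T-SQL - the server instance, database name and backup path below are placeholders, and under the full recovery model you would also need the subsequent log backups before the remaining filegroup can be brought online:

        # Placeholder names throughout; run from a machine with the SqlServer/SQLPS module installed
        $restore = "RESTORE DATABASE MyDb FILEGROUP = 'PRIMARY' FROM DISK = 'C:\Backups\MyDb_full.bak' WITH PARTIAL, RECOVERY;"
        Invoke-Sqlcmd -ServerInstance 'localhost' -Query $restore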

