Search Results

Search found 13889 results on 556 pages for 'results'.


  • Which index is used in select and why?

    - by Lukasz Lysik
    I have a table of zip codes with the following columns: id - PRIMARY KEY, code - NONCLUSTERED INDEX, city. When I execute the query SELECT TOP 10 * FROM ZIPCodes I get the results sorted by the id column. But when I change the query to SELECT TOP 10 id FROM ZIPCodes I get the results sorted by the code column. Again, when I change the query to SELECT TOP 10 code FROM ZIPCodes I get the results sorted by the code column. And finally, when I change it to SELECT TOP 10 id, code FROM ZIPCodes I get the results sorted by the id column. My question is in the title. I know which indexes are used in the queries, but why are those indexes used? In the second query (SELECT TOP 10 id FROM ZIPCodes), wouldn't it be faster if the clustered index was used? How does the query engine choose which index to use?
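    A minimal T-SQL sketch for investigating this, assuming hypothetical index names (PK_ZIPCodes for the clustered primary key, IX_ZIPCodes_Code for the nonclustered index), is to compare the optimizer's choice against forced indexes:

      SET STATISTICS IO ON;  -- show logical reads so the alternatives can be compared

      SELECT TOP 10 id FROM ZIPCodes;                                 -- optimizer's own choice
      SELECT TOP 10 id FROM ZIPCodes WITH (INDEX(PK_ZIPCodes));       -- force the clustered index
      SELECT TOP 10 id FROM ZIPCodes WITH (INDEX(IX_ZIPCodes_Code));  -- force the nonclustered index

      SET STATISTICS IO OFF;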

    Read the article

  • Is there a way to get each row's value from a database into an array?

    - by Guyver
    Say I have a query like the one below. What would be the best way to put each value into an array if I don't know how many results there will be? Normally I would do this with a loop, but I have no idea how many results there are. Would I need to run another query to count the results first? <CFQUERY name="alllocations" DATASOURCE="#DS#"> SELECT locationID FROM tblProjectLocations WHERE projectID = '#ProjectName#' </CFQUERY>
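    One hedged CFML sketch (assuming CF8+ for the array literal): a second COUNT query isn't needed, because the query object already knows its own row count.

      <cfset locationIDs = []>  <!--- on older ColdFusion versions use ArrayNew(1) --->
      <cfloop query="alllocations">
          <cfset ArrayAppend(locationIDs, alllocations.locationID)>
      </cfloop>
      <!--- alllocations.recordCount already holds the number of rows returned --->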

    Read the article

  • Can I use a single instance of a delegate to start multiple Asynchronous Requests?

    - by RobV
    Just wondered if someone could clarify the use of BeginInvoke on an instance of some delegate when you want to make multiple asynchronous calls since the MSDN documentation doesn't really cover/mention this at all. What I want to do is something like the following: MyDelegate d = new MyDelegate(this.TargetMethod); List<IAsyncResult> results = new List<IAsyncResult>(); //Start multiple asynchronous calls for (int i = 0; i < 4; i++) { results.Add(d.BeginInvoke(someParams, null, null)); } //Wait for all my calls to finish WaitHandle.WaitAll(results.Select(r => r.AsyncWaitHandle).ToArray()); //Process the Results The question is can I do this with one instance of the delegate or do I need an instance of the delegate for each individual call? Given that EndInvoke() takes an IAsyncResult as a parameter I would assume that the former is correct but I can't see anything in the documentation to indicate either way.
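    A hedged C# sketch of finishing the pattern above: a single delegate instance can back several BeginInvoke calls, because each call returns its own IAsyncResult and EndInvoke is matched up by that handle. The int return type is an illustrative assumption about TargetMethod.

      // Called once per BeginInvoke, on the same delegate instance.
      foreach (IAsyncResult r in results)
      {
          int value = d.EndInvoke(r);   // throws here if that particular call faulted
          Console.WriteLine(value);
      }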

    Read the article

  • What's wrong with my using PDO this way?

    - by user198729
    $sql = "SELECT * FROM table ORDER BY :sort :dir LIMIT :start, :results"; $stmt = $dbh->prepare($sql); $stmt->execute(array( 'sort' => $_GET['sort'], 'dir' => $_GET['dir'], 'start' => $_GET['start'], 'results' => $_GET['results'], ) ); I tried to use prepare to do the job,but $stmt->fetchAll(PDO::FETCH_ASSOC); returns nothing.

    Read the article

  • DataBind DataSource Result to ASP Labels

    - by Steven
    I have an AccessDataSource, TestSummaryADS. I can easily view the results in a GridView or DropDownList by setting its DataSourceID property. How do I bind a value from the results of TestSummaryADS to the text of a label? I'm just trying to populate labels on my page with results from the DB entry. C# or VB.NET answers are okay.
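    A hedged C# code-behind sketch, assuming the data source runs in its default DataSet mode; the label ID (SummaryLabel) and column name (TestCount) are placeholders.

      // Requires "using System.Data;" in the code-behind.
      protected void Page_Load(object sender, EventArgs e)
      {
          // In DataSet mode, AccessDataSource.Select() returns a DataView.
          DataView view = (DataView)TestSummaryADS.Select(DataSourceSelectArguments.Empty);
          if (view != null && view.Count > 0)
          {
              SummaryLabel.Text = view[0]["TestCount"].ToString();
          }
      }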

    Read the article

  • How to tell if there is an available thread in a thread pool in Java

    - by Gormcito
    I am trying to process a queue of tasks from a database table as fast as possible while also limiting the number of threads that process the tasks. I am using a fixed-size thread pool created with Executors.newFixedThreadPool(N). I want to know if there is a way of telling whether the thread pool is full (by that I mean, are there currently 50 threads running?); if so, I'll wait for a thread to become available before starting a new one, instead of sleeping the main thread. Code of what I would like to do: ExecutorService executor = Executors.newFixedThreadPool(N); ResultSet results; while( true ) { results = getWaitingTasksStmt.executeQuery(); while( results.next() && executor.notFull() ) { executor.submit( new thread( new runnableInheritedClass(results) ) ); } }
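    A hedged Java sketch of one common way to get this behaviour: bound the in-flight work with a Semaphore sized to the pool, so submission blocks while all N workers are busy. The loop and task body below are placeholders standing in for the database polling loop and runnableInheritedClass.

      import java.util.concurrent.ExecutorService;
      import java.util.concurrent.Executors;
      import java.util.concurrent.Semaphore;

      public class BoundedSubmit {
          public static void main(String[] args) throws InterruptedException {
              final int N = 50;
              ExecutorService executor = Executors.newFixedThreadPool(N);
              Semaphore slots = new Semaphore(N);

              for (int task = 0; task < 200; task++) {  // stands in for the DB polling loop
                  slots.acquire();                      // blocks while all N threads are busy
                  final int id = task;
                  executor.submit(() -> {
                      try {
                          System.out.println("processing task " + id);  // placeholder work
                      } finally {
                          slots.release();              // free a slot when the task finishes
                      }
                  });
              }
              executor.shutdown();
          }
      }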

    Read the article

  • Very weird jQuery/JSON problem...

    - by Scarface
    Hey guys, I have finally located the cause of this problem... I just have no idea how to fix it or why it is happening. I have a jQuery getJSON function and it returns 0 results every 2-5 clicks on new topics or refreshes. For some reason, if I change my query to sort results by ASC it always returns results, and much quicker, but this poses a problem since I need the results in DESC order. If anyone has any ideas, I would greatly appreciate it because I am dumbfounded at this point. Here is my query, but I need it by DESC: SELECT time, user, message FROM comments WHERE topic_id='$topic_id' ORDER BY time ASC LIMIT 10 $.getJSON(files+"comments.php?action=view&load=initial&topic_id="+topic_id+"&t=" + (new Date()), function(json) { if(json.length) { for(i=0; i < json.length; i++) { $('#comment-list').prepend(prepare(json[i])); $('#list-' + count).fadeIn(1500); } } }); I return the results like so: while($row = mysql_fetch_array($res)){ $data[] = $row; } $out = json_encode($data); print $out;
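    A hedged PHP sketch of one defensive change worth ruling out on the server side: if the DESC query happens to return no rows, $data is never initialized, so json_encode() receives an undefined variable and the client gets something other than an array. Initializing it, and ordering DESC in the SQL itself, keeps the response shape stable.

      $data = array();   // always send a JSON array, even when there are no rows
      $sql  = "SELECT time, user, message FROM comments
               WHERE topic_id = '" . mysql_real_escape_string($topic_id) . "'
               ORDER BY time DESC LIMIT 10";
      $res  = mysql_query($sql);
      while ($row = mysql_fetch_assoc($res)) {
          $data[] = $row;
      }
      print json_encode($data);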

    Read the article

  • Does TFS easily allow you to share test results with people who don't have TFS?

    - by jcollum
    I'd like to publish a set of test results to a place where team members who don't have TFS can see the results. Does TFS do anything like this easily? I see that the test results do get published, but without a TFS license I don't see any way for people to view them. I think I can do this easily by pulling the XML from the test results and then pushing that out to a central location, but I was wondering if there's anything like an "official" way to do this. TFS is hard to work with when you have some people who have access to it and others who don't... I really wish there was some sort of read-only access to the server (let people get files but not check them out, for instance).

    Read the article

  • How to log SQL output to text file on client from C#

    - by Rob Packwood
    I have a large auditing stored procedure that prints values and runs some SELECT statements. When running it within SQL Server Management Studio we use the "Results to Text" option so that all of the SQL results and print statements display in one place. Now I need to have some C# code also call this auditing procedure at the end of the process and basically store all the data that would be in the "Results to Text" window into a .txt file. How can this be done?
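    A hedged C# sketch of one way to capture both streams: the SqlConnection.InfoMessage event delivers the PRINT output while a data reader walks each SELECT result set. The connection string, procedure name, and file path are placeholders.

      using System;
      using System.Data;
      using System.Data.SqlClient;
      using System.IO;

      class AuditCapture
      {
          static void Main()
          {
              using (var writer = new StreamWriter(@"C:\temp\audit-output.txt"))
              using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
              {
                  conn.InfoMessage += (s, e) => writer.WriteLine(e.Message);  // PRINT output lands here

                  conn.Open();
                  var cmd = new SqlCommand("dbo.usp_Audit", conn) { CommandType = CommandType.StoredProcedure };
                  using (SqlDataReader reader = cmd.ExecuteReader())
                  {
                      do
                      {
                          while (reader.Read())                               // rows of the current SELECT
                          {
                              for (int i = 0; i < reader.FieldCount; i++)
                                  writer.Write(reader[i] + "\t");
                              writer.WriteLine();
                          }
                      } while (reader.NextResult());                          // next SELECT in the procedure
                  }
              }
          }
      }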

    Read the article

  • What does the `new` keyword do

    - by Mike
    I'm following a Java tutorial online, trying to learn the language, and it's bouncing between two semantics for using arrays. long results[] = new long[3]; results[0] = 1; results[1] = 2; results[2] = 3; and: long results[] = {1, 2, 3}; The tutorial never really mentioned why it switched back and forth between the two so I searched a little on the topic. My current understanding is that the new operator is creating an object of "array of longs" type. What I do not understand is why do I want that, and what are the ramifications of that? Are there certain "array" specific methods that won't work on an array unless it's an "array object"? Is there anything that I can't do with an "array object" that I can do with a normal array? Does the Java VM have to do clean up on objects initialized with the new operator that it wouldn't normally have to do? I'm coming from C, so my Java terminology, may not be correct here, so please ask for clarification if something's not understandable.
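    A hedged Java sketch illustrating that both forms build the same kind of object (Java arrays always live on the heap and are always objects); the brace form is just initializer shorthand, and a third, combined form exists for use outside declarations.

      public class ArrayForms {
          public static void main(String[] args) {
              long[] a = new long[3];          // explicit allocation; elements default to 0
              a[0] = 1; a[1] = 2; a[2] = 3;

              long[] b = {1, 2, 3};            // initializer shorthand, same kind of object

              long[] c = new long[]{1, 2, 3};  // combined form, usable in any expression

              System.out.println(a.length + " " + b.length + " " + c.length);  // arrays expose a length field
              System.out.println(java.util.Arrays.equals(a, b));               // true
          }
      }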

    Read the article

  • How to make shell output redirect (>) write while the script is still running?

    - by Noio
    I wrote a short script that never terminates. This script continuously generates output that I have to check on every now and then. I'm running it on a lab computer through SSH, and redirecting the output to a file in my public_html folder on that machine. python script.py > ~/public_html/results.txt However, the results don't show up immediately when I refresh the address. The results show up when I terminate the program, but as I said, it doesn't halt by itself. Is that redirect (>) being lazy with writing? Is there a way to continuously (or at an interval) update the results in the file? Or is it the webserver that doesn't update the file while it is still being written?
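    A hedged Python sketch of the usual cause and fix: the delay comes from stdout being block-buffered when it is redirected to a file, not from > itself. Flushing after each write (or running the script with python -u) makes results.txt update continuously.

      import sys
      import time

      while True:
          sys.stdout.write("result at %s\n" % time.time())
          sys.stdout.flush()   # push buffered output through the > redirect immediately
          time.sleep(5)        # placeholder for the real work between writes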

    Read the article

  • MySQL LIKE issue on partial match

    - by ubercooluk
    I'm having a MySQL query like this: SELECT group_name FROM t_groups WHERE group_name LIKE '%PCB%'; The results are group_name ------------ PCB Full size PCB Another query: SELECT group_name FROM t_groups WHERE group_name LIKE '%PCB-123%'; group_name ----------- PCB-123 How can I use a query that will show all three results? I mean, I need to get all the results that start with or contain PCB.

    Read the article

  • Imperative Programming v/s Declarative Programming v/s Functional Programming

    - by kaleidoscope
    Imperative Programming :: Imperative programming is a programming paradigm that describes computation in terms of statements that change a program's state. In much the same way as the imperative mood in natural languages expresses commands to take action, imperative programs define sequences of commands for the computer to perform. The focus is on what steps the computer should take rather than what the computer will do (e.g. C, C++, Java).

    Declarative Programming :: Declarative programming is a programming paradigm that expresses the logic of a computation without describing its control flow. It attempts to minimize or eliminate side effects by describing what the program should accomplish rather than how to go about accomplishing it. The focus is on what the computer should do rather than how it should do it (e.g. SQL).

    A C# example of declarative v/s imperative programming is LINQ. With imperative programming, you tell the compiler what you want to happen, step by step. For example, let's start with this collection and choose the odd numbers:

        List<int> collection = new List<int> { 1, 2, 3, 4, 5 };

    With imperative programming, we'd step through this and decide what we want:

        List<int> results = new List<int>();
        foreach(var num in collection)
        {
            if (num % 2 != 0)
                results.Add(num);
        }

    Here's what we are doing:
    * Create a result collection
    * Step through each number in the collection
    * Check the number; if it's odd, add it to the results

    With declarative programming, on the other hand, we write code that describes what we want, but not necessarily how to get it:

        var results = collection.Where( num => num % 2 != 0);

    Here, we're saying "Give us everything where it's odd", not "Step through the collection. Check this item; if it's odd, add it to a result collection."

    Functional Programming :: Functional programming is a programming paradigm that treats computation as the evaluation of mathematical functions and avoids state and mutable data. It emphasizes the application of functions. Functional programming has its roots in the lambda calculus. It is a subset of declarative languages with a heavy focus on recursion. Functional programming can be a mind-bender, which is one reason why Lisp, Scheme, and Haskell have never really surpassed C, C++, Java and COBOL in commercial popularity. But there are benefits to the functional way. For one, if you can get the logic correct, functional programming requires orders of magnitude less code than imperative programming. That means fewer points of failure, less code to test, and a more productive (and, many would say, happier) programming life. As systems get bigger, this has become more and more important.

    To know more:
    http://stackoverflow.com/questions/602444/what-is-functional-declarative-and-imperative-programming
    http://msdn.microsoft.com/en-us/library/bb669144.aspx
    http://en.wikipedia.org/wiki/Imperative_programming

    Read the article

  • User Story or User Stories for this specific requirement

    - by Maximus
    I have to write a user story for a requirement that involves passing search filters to the same URI and retrieving corresponding results. I have 5 filters. I plan to write 5 different stories of the type: As a URI user I can search by #filter1 so that I can retrieve results based on #filter1. And then a 6th story that involves searching by one or more (or all five) filters in conjunction. Is this a sensible route to take?

    Read the article

  • Tom Cruise: Meet Fusion Apps UX and Feel the Speed

    - by ultan o'broin
    Unfortunately, I am old enough to remember, and now to admit that I really loved, the movie Top Gun. You know the one - Tom Cruise, US Navy F-14 ace pilot, Mr Maverick, crisis of confidence, meets woman, etc., etc. Anyway, one of the more memorable lines (there were a few) was: "I feel the need, the need for speed." I was reminded of Tom Cruise recently. Paraphrasing a certain Senior Vice President talking about Oracle Fusion Applications and user experience at an all-hands meeting, I heard that: Applications can never be too easy to use. Performance can never be too fast. Developers, assume that your code is always "on". Perfect.

    You cannot overstate the user experience importance of application speed to users, or at least their perception of speed. We all want that super speed of execution and performance, and increasingly so as enterprise users bring the expectations of consumer IT into the work environment. Sten Vesterli (@stenvesterli), an Oracle Fusion Applications User Experience Advocate, also addressed the speed point artfully at an Oracle Usability Advisory Board meeting in Geneva. Sten asked us, the next time we Google something, to think about the message telling us that Google has found hundreds of thousands or millions of results for us in a split second (for example, About 8,340,000 results (0.23 seconds)). Now, how many results can we see and how many can we use immediately? Yet this simple message, communicating the total results available to us, works a special magic about speed, delight, and excitement that Google has made its own in the search space.

    And, guess what? The Oracle Application Development Framework table component relies on a similar "virtual performance boost", says Sten, when it displays the first 50 records in a table and uses a scrollbar indicating the total size of the data record set. The user scrolls and the application automatically retrieves more records as needed. Application speed and its perception by users is worth bearing in mind the next time you're at a customer site and the IT department demands that you retrieve every record from the database. Just think of... Dave Ensor: I'll give you all the rows you ask for in one second. If you promise to use them. (Again, hat tip to Sten.) And then maybe think of... Tom Cruise.

    And if you want to read about the speed of Oracle Fusion Applications, and what that really means in terms of user productivity for your entire business, then check out the Oracle Applications User Experience Oracle Fusion Applications white papers on the usable apps website.

    Read the article

  • SQL SERVER – DMV sys.dm_os_sys_info Column Name Changed in SQL Server 2012

    - by pinaldave
    Have you ever faced a situation where something does not work, and when you go to fix it you end up liking the fix and start to appreciate the breaking change? Well, this is exactly how I felt yesterday. Before I begin my story of yesterday, I want to state candidly that I do not encourage anybody to use * in the SELECT statement.

    One of my DBA friends, who always uses my performance tuning script, sent me an email yesterday asking the following question: "Every time I want to retrieve OS related information in SQL Server, I use the DMV sys.dm_os_sys_info. I just upgraded my SQL Server edition from 2008 R2 to SQL Server 2012 RC0 and it suddenly stopped working. Well, this is not the production server so the issue is not big yet, but eventually I need to resolve this error. Any suggestion?"

    The funny thing was that the original email was very long but it did not say what the exact error was, beside the query not working. I think this is the disadvantage of being too friendly on email sometimes. Nevertheless, I quickly looked at the DMV on my SQL Server 2008 R2 and SQL Server 2012 RC0 versions. To my surprise, I found out that a few columns were renamed in SQL Server 2012 RC0. Usually when people see breaking changes they do not like them, but when I saw these changes I was happy, as the new names are meaningful and, additionally, their new conversion is much more practical and useful. Here are the previous and new column names:

    physical_memory_in_bytes -> physical_memory_kb
    bpool_commit_target -> committed_target_kb
    bpool_visible -> visible_target_kb
    virtual_memory_in_bytes -> virtual_memory_kb
    bpool_commited -> committed_kb

    If you read them carefully you will notice that the new columns now display several results in KB, whereas earlier the results were in bytes. When I see results in bytes I always get confused, as I cannot guess what exactly they will convert to. I like to see results in KB and I am glad that the new columns now display the results in KB. I sent the details of the new columns to my friend and asked him to check the columns used in the application. From my comment he quickly realized why he was facing the error and fixed it immediately. Overall, all was well at the end and I learned something new. Reference: Pinal Dave (http://blog.SQLAuthority.com)
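    A small hedged sketch of the updated query for SQL Server 2012, using the renamed columns listed above (values now reported in KB):

      SELECT physical_memory_kb,
             virtual_memory_kb,
             committed_kb,
             committed_target_kb,
             visible_target_kb
      FROM sys.dm_os_sys_info;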

    Read the article

  • World Record Performance on PeopleSoft Enterprise Financials Benchmark on SPARC T4-2

    - by Brian
    Oracle's SPARC T4-2 server achieved world record performance on Oracle's PeopleSoft Enterprise Financials 9.1 benchmark, executing 20 million journal lines in 8.92 minutes on Oracle Database 11g Release 2 running on Oracle Solaris 11. This is the first result published on this version of the benchmark.

    The SPARC T4-2 server was able to process 20 million general ledger journal edit and post batch jobs in 8.92 minutes on this benchmark, which reflects a large customer environment that utilizes a back-end database of nearly 500 GB. This benchmark demonstrates that the SPARC T4-2 server with PeopleSoft Financials 9.1 can easily process 100 million journal lines in less than 1 hour. The SPARC T4-2 server delivered more than 146 MB/sec of I/O throughput with Oracle Database 11g running on Oracle Solaris 11.

    Performance Landscape

    Results are presented for PeopleSoft Financials Benchmark 9.1. Results obtained with PeopleSoft Financials Benchmark 9.1 are not comparable to the previous version of the benchmark, PeopleSoft Financials Benchmark 9.0, due to a significant change in the data model; version 9.1 supports only batch.

    PeopleSoft Financials Benchmark, Version 9.1
    SPARC T4-2 (2 x SPARC T4, 2.85 GHz) - Batch: 8.92 min

    Results from PeopleSoft Financials Benchmark 9.0
    SPARC Enterprise M4000 (Web/App) + SPARC Enterprise M5000 (DB) - Batch: 33.09 min, Batch with Online: 34.72 min
    SPARC T3-1 (Web/App) + SPARC Enterprise M5000 (DB) - Batch: 35.82 min, Batch with Online: 37.01 min

    Configuration Summary

    Hardware Configuration: 1 x SPARC T4-2 server, 2 x SPARC T4 processors, 2.85 GHz, 128 GB memory

    Storage Configuration: 1 x Sun Storage F5100 Flash Array (for database and redo logs), 2 x Sun Storage 2540-M2 arrays and 2 x Sun Storage 2501-M2 arrays (for backup)

    Software Configuration: Oracle Solaris 11 11/11 SRU 7.5, Oracle Database 11g Release 2 (11.2.0.3), PeopleSoft Financials 9.1 Feature Pack 2, PeopleSoft Supply Chain Management 9.1 Feature Pack 2, PeopleSoft PeopleTools 8.52 latest patch (8.52.03), Oracle WebLogic Server 10.3.5, Java Platform, Standard Edition Development Kit 6 Update 32

    Benchmark Description

    The PeopleSoft Enterprise Financials 9.1 benchmark emulates a large enterprise that processes and validates a large number of financial journal transactions before posting the journal entry to the ledger. The validation process certifies that the journal entries are accurate, ensuring that ChartFields values are valid, debits and credits equal out, and inter/intra-units are balanced. Once validated, the entries are processed, ensuring that each journal line posts to the correct target ledger, and then the journal status is changed to posted. In this benchmark, Journal Edit & Post is set up to edit and post both Inter-Unit and Regular multi-currency journals. The benchmark processes 20 million journal lines using AppEngine for edits and COBOL for the post process.

    See Also: Oracle PeopleSoft Benchmark White Papers (oracle.com), SPARC T4-2 Server (oracle.com OTN), PeopleSoft Financial Management (oracle.com OTN), Oracle Solaris (oracle.com OTN), Oracle Database 11g Release 2 Enterprise Edition (oracle.com OTN)

    Disclosure Statement: Copyright 2012, Oracle and/or its affiliates. All rights reserved. Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners. Results as of 1 October 2012.

    Read the article

  • Windows Azure Virtual Machine Readiness and Capacity Assessment for SQL Server

    - by SQLOS Team
    Windows Azure Virtual Machine Readiness and Capacity Assessment for Windows Server Machines Running SQL Server

    With the release of MAP Toolkit 8.0 Beta, we have added a new scenario to assess your Windows Azure Virtual Machine readiness. The MAP 8.0 Beta performs a comprehensive assessment of Windows Servers running SQL Server to determine your level of readiness to migrate an on-premise physical or virtual machine to Windows Azure Virtual Machines. The MAP Toolkit then offers suggested changes to prepare the machines for migration, such as upgrading the operating system or SQL Server. MAP Toolkit 8.0 Beta is available for download here. Your participation and feedback is very important to make the MAP Toolkit work better for you. We encourage you to participate in the beta program and provide your feedback at [email protected] or through one of our surveys.

    Now, let's walk through the MAP Toolkit tasks for completing the Windows Azure Virtual Machine assessment and capacity planning. The tasks include the following:
    - Perform an inventory
    - View the Windows Azure VM Readiness results and report
    - Collect performance data to determine VM sizing
    - View the Windows Azure Capacity results and report

    Perform an inventory:

    1. To perform an inventory against a single machine or across a complete environment, choose Perform an Inventory to launch the Inventory and Assessment Wizard.
    2. After the Inventory and Assessment Wizard launches, select either the Windows computers or SQL Server scenario to inventory Windows machines. HINT: If you don't care about completely inventorying a machine, just select the SQL Server scenario. Click Next to continue.
    3. On the Discovery Methods page, select how you want to discover computers and then click Next to continue. Description of discovery methods:
    - Use Active Directory Domain Services -- This method allows you to query a domain controller via the Lightweight Directory Access Protocol (LDAP) and select computers in all or specific domains, containers, or OUs. Use this method if all computers and devices are in AD DS.
    - Windows networking protocols -- This method uses the WIN32 LAN Manager application programming interfaces to query the Computer Browser service for computers in workgroups and Windows NT 4.0-based domains. If the computers on the network are not joined to an Active Directory domain, use only the Windows networking protocols option to find computers.
    - System Center Configuration Manager (SCCM) -- This method enables you to inventory computers managed by System Center Configuration Manager (SCCM). You need to provide credentials to the System Center Configuration Manager server in order to inventory the managed computers. When you select this option, the MAP Toolkit will query SCCM for a list of computers and then MAP will connect to these computers.
    - Scan an IP address range -- This method allows you to specify the starting address and ending address of an IP address range. The wizard will then scan all IP addresses in the range and inventory only those computers. Note: This option can perform poorly if many IP addresses aren't being used within the range.
    - Manually enter computer names and credentials -- Use this method if you want to inventory a small number of specific computers.
    - Import computer names from a file -- Using this method, you can create a text file with a list of computer names that will be inventoried.
    4. On the All Computers Credentials page, enter the accounts that have administrator rights to connect to the discovered machines. This does not need to be a domain account, but it needs to be a local administrator. I have entered my domain account, which is an administrator on my local machine. Click Next after one or more accounts have been added. NOTE: The MAP Toolkit primarily uses Windows Management Instrumentation (WMI) to collect hardware, device, and software information from the remote computers. In order for the MAP Toolkit to successfully connect and inventory computers in your environment, you have to configure your machines for inventory through WMI and also allow your firewall to enable remote access through WMI. The MAP Toolkit also requires remote registry access for certain assessments. In addition to enabling WMI, you need accounts with administrative privileges to access desktops and servers in your environment.
    5. On the Credentials Order page, select the order in which you want the MAP Toolkit to connect to the machine and SQL Server. Generally, just accept the defaults and click Next.
    6. On the Enter Computers Manually page, click Create to pull up a dialog to enter one or more computer names.
    7. On the Summary page, confirm your settings and then click Finish. After clicking Finish, the inventory process will start.

    Windows Azure Readiness results and report

    After the inventory has completed, you can review the results under the Database scenario. On the tile, you will see the number of Windows Server machines with SQL Server that were analyzed, the number of machines that are ready to move without changes, and the number of machines that require further changes. If you click this Azure VM Readiness tile, you will see additional details and can generate the Windows Azure VM Readiness Report. After the report is generated, select View | Saved Reports and Proposals to view the location of the report. Open the WindowsAzureVMReadiness* report in Excel. On the Windows tab, you can see the results of the assessment. This report has a column for the operating system and SQL Server assessment and provides a recommendation on how to resolve a component that is not supported.

    Collect performance data

    Launch the Performance Wizard to collect performance information for the Windows Server machines that you would like the MAP Toolkit to suggest a Windows Azure VM size for.

    Windows Azure Capacity results and report

    After the performance metrics are collected, the Azure VM Capacity tile will display the number of virtual machine sizes that are suggested for the Windows Server and Linux machines that were analyzed. You can then click on the Azure VM Capacity tile to see the capacity details and generate the Windows Azure VM Capacity Report. Within this report, you can view the performance data that was collected and the suggested virtual machine sizes.

    MAP Toolkit 8.0 Beta is available for download here. Your participation and feedback is very important to make the MAP Toolkit work better for you. We encourage you to participate in the beta program and provide your feedback at [email protected] or through one of our surveys.

    Useful References: Windows Azure Homepage, How-to guides for Windows Azure Virtual Machines, Provisioning a SQL Server Virtual Machine on Windows Azure, Windows Azure Pricing

    Peter Saddow, Senior Program Manager - MAP Toolkit Team

    Read the article

  • Benefits of Using Both SEO Techniques and Google AdWords

    As the name suggests, Search Engine Optimization is the process by which your web page is optimized to appear in the first few results of Google's search page, also known as the SERP (Search Engine Results Page). This is not as simple as it sounds, unfortunately. To get your page onto the first page, you must actively promote your site by legitimate and natural means through which the site gains more visitors, reciprocal links, backlinks and, ultimately, high-quality traffic.

    Read the article

  • Oracle ETPM v2.3.1 Examachine Performance Benchmark Data Sheet

    - by Paula Speranza-Hadley
    Oracle Tax is pleased to announce the exceptional results of the Oracle ETPM v2.3.1 Examachine performance benchmark. The benchmark achieved the following results:
    · Processed 8M outpayments and 2M payments in 6 hours
    · Processed 1M forms in 4 hours
    · Near-linear scalability of batch processing
    For the complete data sheet, please click on the following link: https://blogs.oracle.com/tax/resource/OracleETPMv231ExamachinePerformanceBenchmarkDataSheet.pdf

    Read the article

  • World Record Oracle E-Business Consolidated Workload on SPARC T4-2

    - by Brian
    Oracle set a world record for the Oracle E-Business Suite Standard Medium multiple-online-module benchmark using Oracle's SPARC T4-2 and SPARC T4-4 servers, which ran the application and database. Oracle's SPARC T4 servers demonstrate performance leadership and world-record results on the Oracle E-Business Suite Applications R12 OLTP benchmark by publishing the first result using multiple concurrent online application modules with Oracle Database 11g Release 2 running on Solaris.

    This result shows that a multi-tier configuration of SPARC T4 servers running the Oracle E-Business Suite R12.1.2 application and Oracle Database 11g Release 2 is capable of supporting 4,100 online users with outstanding response times, executing a mix of complex transactions consolidating 4 Oracle E-Business modules (iProcurement, Order Management, Customer Service and HR Self-Service). The SPARC T4-2 server in the application tier was utilized at about 65% and the SPARC T4-4 server in the database tier at about 30%, providing significant headroom for additional Oracle E-Business Suite R12.1.2 processing modules, more online users, and future growth. Oracle E-Business Suite Applications were run in Oracle Solaris Containers on SPARC T4 servers, which provides a consolidation platform for multiple E-Business instances.

    Performance Landscape

    Multiple Online Modules (Self-Service, Order-Management, iProcurement, Customer-Service), Medium Configuration:
    System: SPARC T4-2 - Users: 4,100 - Average Response Time: 2.08 sec - 90th Percentile Response Time: 2.52 sec

    Configuration Summary

    Application Tier Configuration: 1 x SPARC T4-2 server, 2 x SPARC T4 processors, 2.85 GHz, 256 GB memory, 3 x 300 GB internal disks, Oracle Solaris 10, Oracle E-Business Suite 12.1.2

    Database Tier Configuration: 1 x SPARC T4-4 server, 4 x SPARC T4 processors, 3.0 GHz, 256 GB memory, 2 x 300 GB internal disks, Oracle Solaris 10, Oracle Solaris Containers, Oracle Database 11g Release 2

    Storage Configuration: 1 x Sun Storage F5100 Flash Array (80 x 24 GB flash modules)

    Benchmark Description

    The Oracle R12 E-Business Suite Standard Benchmark combines online transaction execution by simulated users with multiple online concurrent modules to model a typical scenario for a global enterprise. The online component exercises the common UI flows which are most frequently used by a majority of our customers. This benchmark utilized four concurrent flows of OLTP transactions, for Order to Cash, iProcurement, Customer Service and HR Self-Service, and measured the response times. The selected flows model simultaneous business activities inclusive of managing customers, services, products and employees.

    See Also: Oracle R12 E-Business Suite Standard Benchmark Results, Oracle R12 E-Business Suite Standard Benchmark Overview, Oracle R12 E-Business Benchmark Description, E-Business Suite Applications R12 (R12.1.2) Online Benchmark - Using Oracle Database 11g on Oracle's SPARC T4-2 and Oracle's SPARC T4-4 Servers, SPARC T4-2 Server (oracle.com OTN), SPARC T4-4 Server (oracle.com OTN), Oracle E-Business Suite (oracle.com OTN), Oracle Solaris (oracle.com OTN), Oracle Database 11g Release 2 Enterprise Edition (oracle.com OTN)

    Disclosure Statement: Oracle E-Business Suite R12 medium multiple-online module benchmark, SPARC T4-2, SPARC T4, 2.85 GHz, 2 chips, 16 cores, 128 threads, 256 GB memory; SPARC T4-4, SPARC T4, 3.0 GHz, 4 chips, 32 cores, 256 threads, 256 GB memory; average response time 2.08 sec, 90th percentile response time 2.52 sec; Oracle Solaris 10, Oracle Solaris Containers, Oracle E-Business Suite 12.1.2, Oracle Database 11g Release 2. Results as of 9/30/2012.

    Read the article

  • Increase Google SERP Rank by Improving Your PageRank

    Having a good search engine results page (SERP) rank is the goal of SEO; it means higher organic traffic, more visitors and an increased conversion rate. But how do we improve our SERP rank, especially with Google, which provides more than 70% of relevant search results? Well, one way is improving your website's PageRank. If you think SERP and PR are not related, keep on reading to find out.

    Read the article

  • Generating Report for NUnit

    - by thangchung
    All source code for this post can be found at my GitHub. Some time ago, I was asked how to generate a report of the results of testing with NUnit. In fact, I had never done this. In my little programming world, I only cared about the test results, red-green-refactoring, and that was it. The question was quite unexpected. I knew that I could use NCover to generate reports, but NCover's reports are too simple; they do not give details on the number of test cases, test methods, and so on. So I began looking at how to create an interesting report for NUnit. I was lucky to find an open source project here. Its authors call it NUnit2Report, but one disadvantage is that it only runs on .NET 1.0, which is really too old compared to the current version 4.0. I downloaded the preview, but I could not run it. I opened its source code and found that it uses XSLT to convert the output of NUnit results from XML to HTML. Nothing really special, because I also knew that an XML output file is created after NUnit runs. The author simply uses this file to produce HTML with XSLT. And I decided to convert it to .NET 4.0, because then I would not have to code it from scratch. The conversion took me some time, but I was lucky to end up with what I wanted. Thanks, Gilles, for this OSS; I will send him a mail to thank him for his effort in releasing it as open source. Now I will show how to do it. I used NAnt for the automated build and NUnit for running the test cases, and I use the Selenium testing framework. After writing three test cases using Selenium, I ran NUnit and got the following results: 1 failure and 2 successes. The bin directory of this project then contains the NUnit output file. Next I create a build file, and a bat file for easy running (PowerShell could be used here as well). Double-click the bat file to create the report, then open the index.html file in the output folder to view it. As everyone can see, the test cases are broken out very clearly, which meets my requirements. This is really good. Once again I really thank Gilles for NUnit2Report. You can contact him via the mail address [email protected] or the website http://nunit2report.sourceforge.net. It is really useful to those committed to QA. Hopefully this post will help anyone interested in producing reports for NUnit.

    Read the article

  • ColdFusion Search with Excel Sheet creation

    - by user71602
    I created a search form with two buttons - one to show results on the web page and one to create an Excel sheet. Problem: when a user conducts a search with results on the web page, the user has to select the criteria again and then click the Create Excel button to create the sheet. How can I make it so that, after conducting a search, the user can just click one button to create the Excel sheet? (The form uses "Post".)

    Read the article
