Search Results

Search found 40567 results on 1623 pages for 'database performance'.

Page 129 of 1623

  • The meaning of thermal throttle counters and package power limit notifications in Linux

    - by Trustin Lee
    Whenever I do some performance testing on my Linux-installed MacBook Pro, I often see the following messages in dmesg:

        Aug 8 09:29:31 infinity kernel: [79791.789404] CPU1: Package power limit notification (total events = 40365)
        Aug 8 09:29:31 infinity kernel: [79791.789408] CPU3: Package power limit notification (total events = 40367)
        Aug 8 09:29:31 infinity kernel: [79791.789411] CPU2: Package power limit notification (total events = 40453)
        Aug 8 09:29:31 infinity kernel: [79791.789414] CPU0: Package power limit notification (total events = 40453)

    I also see the throttle counters in sysfs increase over time:

        trustin@infinity:/sys/devices/system/cpu/cpu0/thermal_throttle
        $ ls
        core_power_limit_count  package_power_limit_count  core_throttle_count  package_throttle_count
        $ cat core_power_limit_count
        0
        $ cat core_throttle_count
        41912
        $ cat package_power_limit_count
        67945
        $ cat package_throttle_count
        67565

    What do these counters mean? Do they affect the performance of the CPU or the system? Do they result in increased deviation in my performance numbers (i.e. do they prevent me from getting reliable measurements)? If so, how do I avoid these messages and the increasing counters? Would running the performance tests on a well-cooled desktop system help?
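    A minimal sketch (not from the original question) of how one might snapshot these sysfs counters before and after a benchmark run to see whether throttling occurred during the measurement. It assumes the sysfs layout shown above and Java 11+ (for Files.readString); the sleep is a placeholder for the real workload.

        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.util.LinkedHashMap;
        import java.util.Map;

        /** Snapshots the thermal_throttle counters for one CPU before and after a
         *  workload, so you can tell whether throttling happened during a benchmark. */
        public class ThrottleDelta {
            private static final String[] COUNTERS = {
                "core_power_limit_count", "core_throttle_count",
                "package_power_limit_count", "package_throttle_count"
            };

            private static Map<String, Long> snapshot(int cpu) throws Exception {
                Map<String, Long> values = new LinkedHashMap<>();
                for (String name : COUNTERS) {
                    Path p = Path.of("/sys/devices/system/cpu/cpu" + cpu
                                     + "/thermal_throttle/" + name);
                    values.put(name, Long.parseLong(Files.readString(p).trim()));
                }
                return values;
            }

            public static void main(String[] args) throws Exception {
                int cpu = 0;
                Map<String, Long> before = snapshot(cpu);

                // Placeholder for the actual benchmark; replace with the real workload.
                Thread.sleep(10_000);

                Map<String, Long> after = snapshot(cpu);
                before.forEach((name, start) ->
                    System.out.printf("%s: +%d%n", name, after.get(name) - start));
            }
        }

    Non-zero deltas during a run mean the package was throttled or power-limited while the test was in progress, which is one plausible source of run-to-run variation.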

    Read the article

  • Oracle Database As Seen at Sapphire 2010

    - by jenny.gelhausen
    Seen around the Orange County Convention Center in Orlando, Florida, at the May 16-19 SAPPHIRE 2010 conference: Oracle Database is the #1 database for SAP applications, and Oracle Database 11g Release 2 is now available for SAP. By upgrading you can lower the cost of your SAP applications infrastructure and improve your quality of service, so we encourage you to consider the upgrade.

    Read the article

  • Web Site Performance and Assembly Versioning

    - by capgpilk
    I originally wanted to write this as one post, but there is quite a large amount of information which can be broken down into different areas, so I am going to publish it in three posts: Minification and Concatenation of JavaScript and CSS Files (this post), Versioning Combined Files Using Subversion (published shortly), and Versioning Combined Files Using Mercurial (published shortly).

    Website Performance. There are many ways to improve web site performance; two areas are reducing the amount of data that is served up from the web server and reducing the number of files that are requested. Here I will outline the process of minifying and concatenating your JavaScript and CSS files automatically at build time of your Visual Studio web site/application.

    To edit the project file in Visual Studio, you need to first unload it by right-clicking the project in Solution Explorer. I prefer to do this in a third-party tool such as Notepad++ and save it there, forcing VS to reload it each time I make a change, as the whole process in Visual Studio can be a bit tedious. Once you have the project file open, you will notice that it is an MSBuild project file. I am going to use a fantastic utility from Microsoft called Ajax Minifier. This tool minifies both JavaScript and CSS.

    1. Import the tasks for AjaxMin, choosing the location you installed to. I keep all third-party utilities in a Tools directory within my solution structure and source control. This way I know I can get the entire solution from source control without worrying about what other tools I need to get the project to build locally.

        <Import Project="..\Tools\MicrosoftAjaxMinifier\AjaxMin.tasks" />

    2. Now create ItemGroups for all your JS and CSS files like this, separating out your non-minified files and your minified files. This can go in the AfterBuild container.

        <Target Name="AfterBuild">

          <!-- Javascript files that need minimizing -->
          <ItemGroup>
            <JSMin Include="Scripts\jqModal.js" />
            <JSMin Include="Scripts\jquery.jcarousel.js" />
            <JSMin Include="Scripts\shadowbox.js" />
          </ItemGroup>
          <!-- CSS files that need minimizing -->
          <ItemGroup>
            <CSSMin Include="Content\Site.css" />
            <CSSMin Include="Content\themes\base\jquery-ui.css" />
            <CSSMin Include="Content\shadowbox.css" />
          </ItemGroup>

          <!-- Javascript files to combine -->
          <ItemGroup>
            <JSCat Include="Scripts\jqModal.min.js" />
            <JSCat Include="Scripts\jquery.jcarousel.min.js" />
            <JSCat Include="Scripts\shadowbox.min.js" />
          </ItemGroup>
          <!-- CSS files to combine -->
          <ItemGroup>
            <CSSCat Include="Content\Site.min.css" />
            <CSSCat Include="Content\themes\base\jquery-ui.min.css" />
            <CSSCat Include="Content\shadowbox.min.css" />
          </ItemGroup>

    3. Call AjaxMin to do the crunching.

          <Message Text="Minimizing JS and CSS Files..." Importance="High" />
          <AjaxMin JsSourceFiles="@(JSMin)" JsSourceExtensionPattern="\.js$"
                   JsTargetExtension=".min.js" JsEvalTreatment="MakeImmediateSafe"
                   CssSourceFiles="@(CSSMin)" CssSourceExtensionPattern="\.css$"
                   CssTargetExtension=".min.css" />

    This will create the *.min.css and *.min.js files in the same directory as the original files.

    4. Now concatenate the minified files into one file for JavaScript and another for CSS. Here we write out the files with a default file name; in later posts I will cover versioning these files the same as your project assembly, again to help performance.

          <Message Text="Concat JS Files..." Importance="High" />
          <ReadLinesFromFile File="%(JSCat.Identity)">
            <Output TaskParameter="Lines" ItemName="JSLinesSite" />
          </ReadLinesFromFile>
          <WriteLinesToFile File="Scripts\site-script.combined.min.js" Lines="@(JSLinesSite)"
                            Overwrite="true" />
          <Message Text="Concat CSS Files..." Importance="High" />
          <ReadLinesFromFile File="%(CSSCat.Identity)">
            <Output TaskParameter="Lines" ItemName="CSSLinesSite" />
          </ReadLinesFromFile>
          <WriteLinesToFile File="Content\site-style.combined.min.css" Lines="@(CSSLinesSite)"
                            Overwrite="true" />

        </Target>

    5. Save the project file. If you have Visual Studio open, it will ask you to reload the project. You can now run a build and these minified and combined files will be created automatically.

    6. Finally, reference these minified, combined files in your web page. In the next two posts I will cover versioning these files to match your assembly.

    Read the article

  • Supercharging the Performance of Your Front-Office Applications @ OOW'12

    - by Sanjeev Sharma
    You can increase customer satisfaction, brand equity, and ultimately top-line revenue by deploying Oracle ATG Web Commerce, Oracle WebCenter Sites, Oracle Endeca applications, Oracle's Siebel applications, and other front-office applications on Oracle Exalogic, Oracle's combination of hardware and software for applications and middleware. Join me (Sanjeev Sharma) and my colleague, Kelly Goetsch, at the following conference session at Oracle Open World to find out how Customer Experience can be transformed with Oracle Exalogic:

        Session: CON9421 - Supercharging the Performance of Your Front-Office Applications with Oracle Exalogic
        Date: Wednesday, 3 Oct 2012
        Time: 10:15 am - 11:15 am (PST)
        Venue: Moscone South (309)

    Read the article

  • Improve Performance of char.IsWhiteSpace for ASCII inputs in .NET 3.5

    - by Tanzim Saqib
    IsNullOrWhiteSpace is a new method introduced in the string class in .NET 4.0. While this is a very useful method in string-based processing, I attempted to implement it in .NET 3.5 using char.IsWhiteSpace(). I found a significant performance penalty using this method, which I later replaced with my own version. The following code takes about 20.6074219 seconds on my machine, whereas my implementation of char.IsWhiteSpace takes about a quarter less time, only 15.8271485 seconds. In many scenarios, e.g. string...(read more)
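    The article's code is C#; as a rough Java analogue of the same idea, here is a hedged sketch of an ASCII fast path that checks the common whitespace characters directly and only falls back to the full Unicode-aware check. The exact set of characters .NET treats as whitespace differs slightly from Java's, and any actual speedup would need to be measured on the workload in question.

        /** Rough Java analogue of the article's approach (the original is C#):
         *  handle the common ASCII whitespace characters inline and fall back
         *  to the general Unicode-aware check only for other code points. */
        public final class FastWhitespace {
            private FastWhitespace() {}

            /** Fast path for ASCII input; falls back to Character.isWhitespace. */
            public static boolean isWhiteSpace(char c) {
                if (c < 128) {
                    return c == ' ' || c == '\t' || c == '\n'
                        || c == '\r' || c == '\f' || c == 0x0B; // vertical tab
                }
                return Character.isWhitespace(c);
            }

            /** Rough equivalent of .NET 4.0's string.IsNullOrWhiteSpace for older runtimes. */
            public static boolean isNullOrWhiteSpace(String s) {
                if (s == null) return true;
                for (int i = 0; i < s.length(); i++) {
                    if (!isWhiteSpace(s.charAt(i))) return false;
                }
                return true;
            }
        }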

    Read the article

  • jMonkey Quest Database

    - by theJollySin
    I am building a game in jMonkey (Java) and I have so far only used default quest text. But now I need to start populating a lot of quests with text, and my design requires a LOT of quest texts. What is the best way to build a database of quest texts in jMonkey? I don't have a lot of real experience with databases. Is there a database that integrates well with jMonkey? Here are the ideal properties I want in my database, in order of priority: a reasonably light learning curve; easy portability (in Java) to Windows, Linux, and Mac OS X; a good interface with Java; a good interface with jMonkey; and the ability to add properties to the quests (ID, level, gender, quest chain ID, etc.). Or am I wrong in thinking I need to use some giant monster like SQL? I haven't been able to find much information on this, so are people using some non-database methods for storing things like quest text in jMonkey?
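    One option that fits these constraints is an embedded SQLite file accessed through plain JDBC, which is portable across Windows, Linux, and Mac OS X and needs no server. The sketch below is illustrative only: it assumes an SQLite JDBC driver (for example the xerial sqlite-jdbc jar) is on the classpath, and the table and column names are hypothetical. A flat JSON or properties file per quest chain is an even lighter alternative if querying is not needed.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.Statement;

        /** Minimal embedded quest-text store kept in a single SQLite file. */
        public class QuestStore {
            public static void main(String[] args) throws Exception {
                try (Connection db = DriverManager.getConnection("jdbc:sqlite:quests.db")) {
                    try (Statement s = db.createStatement()) {
                        s.execute("CREATE TABLE IF NOT EXISTS quest ("
                                + "id INTEGER PRIMARY KEY, "
                                + "chain_id INTEGER, "
                                + "min_level INTEGER, "
                                + "gender TEXT, "
                                + "text TEXT NOT NULL)");
                    }

                    // Insert one quest text (placeholder values).
                    try (PreparedStatement ins = db.prepareStatement(
                            "INSERT INTO quest (chain_id, min_level, gender, text) VALUES (?, ?, ?, ?)")) {
                        ins.setInt(1, 1);
                        ins.setInt(2, 5);
                        ins.setString(3, "any");
                        ins.setString(4, "Bring me ten wolf pelts.");
                        ins.executeUpdate();
                    }

                    // Look up all quests in chain 1 that a level-5 character can take.
                    try (PreparedStatement q = db.prepareStatement(
                            "SELECT id, text FROM quest WHERE chain_id = ? AND min_level <= ?")) {
                        q.setInt(1, 1);
                        q.setInt(2, 5);
                        try (ResultSet rs = q.executeQuery()) {
                            while (rs.next()) {
                                System.out.println(rs.getInt("id") + ": " + rs.getString("text"));
                            }
                        }
                    }
                }
            }
        }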

    Read the article

  • Oracle Database 11g Release 2 for Windows available!

    - by Mike Dietrich
    Hi there, just returned from vacation - and the Easter bunny (was its name Tux??) just delivered the Windows release (32-bit and 64-bit) of Oracle Database 11g Release 2. It's available for download from edelivery.oracle.com or OTN: Oracle Database 11g Release 2 for Windows 32-bit, and Oracle Database 11g Release 2 for Windows 64-bit. And if you wonder why it took sooooooo long to release Oracle Database 11g Release 2 on the Windows platform: the developers have incorporated a lot of the available fixes on top of 11.2.0.1.0 - so it's more an 11.2.0.1.1/2 ;-) And don't forget to download the newest version of the upgrade slides: http://apex.oracle.com/folien Use the keyword (Schluesselwort): upgrade112

    Read the article

  • Downloading large files hangs system

    - by Igor
    When I'm trying to download large files, i.e. 1 GB or more, in Firefox, the download starts at a very high speed and within a few seconds almost reaches the maximum (~11 MB/s). It downloads very fast, but when the downloaded size gets to around 700-800 MB or more, my system almost completely hangs, so I can do nothing - I just have to wait until it finishes downloading. Also, when it hangs I can't see the download progress - it looks like it has completely frozen. Sometimes, however, if the file size is near 1 GB, the system comes back from the hang and finishes the download, but sometimes I just can't wait for the system to come back and have to kill Firefox from top (which takes me 2 minutes because of the very slow system performance). I use Firefox as my primary browser. If I use wget with a direct link to the file, everything is fine: speed at max, no performance decrease. So what can I do?

    Read the article

  • Windows Azure Recipe: High Performance Computing

    - by Clint Edmonson
    One of the most attractive ways to use a cloud platform is for parallel processing. Commonly known as high-performance computing (HPC), this approach relies on executing code on many machines at the same time. On Windows Azure, this means running many role instances simultaneously, all working in parallel to solve some problem. Doing this requires some way to schedule applications, which means distributing their work across these instances. To allow this, Windows Azure provides the HPC Scheduler. This service can work with HPC applications built to use the industry-standard Message Passing Interface (MPI). Software that does finite element analysis, such as car crash simulations, is one example of this type of application, and there are many others. The HPC Scheduler can also be used with so-called embarrassingly parallel applications, such as Monte Carlo simulations. Whatever problem is addressed, the value this component provides is the same: it handles the complex problem of scheduling parallel computing work across many Windows Azure worker role instances.

    Drivers: elastic compute and storage resources; cost avoidance.

    Solution: Here's a sketch of a solution using our Windows Azure HPC SDK.

    Ingredients:
    Web Role – this hosts an HPC Scheduler web portal to allow web-based job submission and management. It also exposes an HTTP web service API to allow other tools (including Visual Studio) to post jobs as well.
    Worker Role – typically multiple worker roles are enlisted, including at least one head node that schedules jobs to be run among the remaining compute nodes.
    Database – stores state information about the job queue and resource configuration for the solution.
    Blobs, Tables, Queues, Caching (optional) – many parallel algorithms persist intermediate and/or permanent data as a result of their processing. These fast, highly reliable, parallelizable storage options are all available to all the jobs being processed.

    Training: Here is a link to online Windows Azure training labs where you can learn more about the individual ingredients described above. (Note: the entire Windows Azure Training Kit can also be downloaded for offline use.) Windows Azure HPC Scheduler (3 labs): The Windows Azure HPC Scheduler includes modules and features that enable you to launch and manage high-performance computing (HPC) applications and other parallel workloads within a Windows Azure service. The scheduler supports parallel computational tasks such as parametric sweeps, Message Passing Interface (MPI) processes, and service-oriented architecture (SOA) requests across your computing resources in Windows Azure. With the Windows Azure HPC Scheduler SDK, developers can create Windows Azure deployments that support scalable, compute-intensive, parallel applications.

    See my Windows Azure Resource Guide for more guidance on how to get started, including links to web portals, training kits, samples, and blogs related to Windows Azure.
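    To give a feel for the "embarrassingly parallel" pattern mentioned above, here is a generic Java sketch, independent of the HPC Scheduler SDK, that splits a Monte Carlo estimate of pi across local worker threads. On Azure, the same independent chunks of work would be farmed out by the head node to compute node instances rather than to threads.

        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.Callable;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.Future;
        import java.util.concurrent.ThreadLocalRandom;

        /** Generic embarrassingly-parallel example: estimate pi by splitting
         *  fully independent sampling tasks across a pool of workers. */
        public class MonteCarloPi {
            public static void main(String[] args) throws Exception {
                int workers = Runtime.getRuntime().availableProcessors();
                long samplesPerWorker = 10_000_000L;

                ExecutorService pool = Executors.newFixedThreadPool(workers);
                List<Future<Long>> results = new ArrayList<>();
                for (int i = 0; i < workers; i++) {
                    results.add(pool.submit((Callable<Long>) () -> {
                        long inside = 0;
                        ThreadLocalRandom rnd = ThreadLocalRandom.current();
                        for (long n = 0; n < samplesPerWorker; n++) {
                            double x = rnd.nextDouble(), y = rnd.nextDouble();
                            if (x * x + y * y <= 1.0) inside++;
                        }
                        return inside; // each task is independent of every other task
                    }));
                }

                long totalInside = 0;
                for (Future<Long> f : results) totalInside += f.get();
                pool.shutdown();

                double pi = 4.0 * totalInside / (workers * samplesPerWorker);
                System.out.println("pi ~= " + pi);
            }
        }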

    Read the article

  • Oracle Database 11g Release 2 Now Available on 32-bit and x64 Windows

    - by jenny.gelhausen
    Oracle Database 11g Release 2 provides the foundation for IT to successfully deliver more information with higher quality of service, reduce the risk of change within IT, and make more efficient use of their IT budgets. By deploying Oracle Database 11g Release 2 as their data management foundation, organizations can utilize the full power of the world's leading database. Now Oracle Database 11g Release 2 is available for organizations using 32-bit and x64 Windows. Download either on OTN.

    Read the article

  • Restore Gene : Automating SQL Server Database Restores

    Restore Gene is a simple two-script framework, one PowerShell script and one SQL stored procedure, which will speed up the production of restore scripts for manual disaster recovery, as well as help automate log shipping.

    Read the article

  • Will Unity stop being a plugin for Compiz?

    - by Murphy1138
    I ask this because, with the Unity desktop running, when I try any games on my Ubuntu 12.04.1 I get a lot of frame rate drop with Unity and Compiz. If I switch to GNOME Classic, which uses Mutter, I get a vast boost in performance. My system is an 8-core AMD with an Nvidia 460 SE that can play anything I chuck at it in Windows, and I'm using the latest Nvidia drivers, but even simple games like the Humble Bundle ones get serious lag with Unity, and the only cause of this I can guess is Compiz. When Steam comes to Ubuntu, how will this performance loss be addressed?

    Read the article

  • Larry Ellison Unveils Oracle Database In-Memory

    - by jgelhaus
    A Breakthrough Technology, Which Turns the Promise of Real-Time into a Reality. Oracle Database In-Memory delivers leading-edge in-memory performance without the need to restrict functionality or accept compromises, complexity, and risk. Deploying Oracle Database In-Memory with virtually any existing Oracle Database-compatible application is as easy as flipping a switch; no application changes are required. It is fully integrated with Oracle Database's scale-up, scale-out, storage tiering, availability, and security technologies, making it the most industrial-strength offering in the industry. Learn more: read the press release, get product details, or view the webcast on-demand replay. Follow the conversation: #DB12c #OracleDBIM

    Read the article

  • Cloud computing - database loading question

    - by workwise
    The situation is as follows; I want to know whether what I want is possible in cloud computing and whether it is the best way for me: 1) My main site has a database with tables with millions of rows, and entries are added almost every second. 2) I will set up a MySQL mirror, so there will be a backup database always in sync with the main one. 3) There are a few tens of thousands of images, and growing, so the total size of the images is a few tens of gigabytes. I will keep the image data in sync on the backup server as well. 4) There can be short periods where traffic goes up to 100x the average. 5) I will be using memcache heavily - most database data and even frequently used disk files/images will be in RAM. I want the main site to run on a dedicated server. The backup server is, say, an Amazon EC2 instance. Note that since it is a live backup, I need to run a small instance continuously. When I anticipate high traffic, I want to be able to run a large instance on the cloud and transfer the traffic there. The main point is that I do not want to spend time "loading" the database on the large instance, as that can typically take a few minutes or even hours (from experience). So is it possible to just scale the memory/CPU on demand, without having to load the database or sync up the filesystem? I want to set up my backup scripts etc. just ONCE. Thanks, JP

    Read the article

  • High Performance Storage Systems for SQL Server

    Rod Colledge turns his pessimistic mindset to storage systems, and describes the best way to configure the storage systems of SQL Servers for both performance and reliability. Even Rod gets a glint in his eye when he then goes on to describe the dazzling speed of solid-state storage, though he is quick to identify the risks...

    Read the article

  • Compelling Reasons For Migrating to Oracle Database 11g

    - by margaret hamburger
    IDC's white paper, Maximizing Your Investment in Oracle Application Software: The Case for Migrating to Oracle Database 11g, describes compelling business, operational, and technical reasons for Oracle application customers to upgrade to Oracle Database 11g. In researching this paper, IDC found that the upgrade process is smooth, the latest version offers key benefits over older versions, and the new features and procedures are easy to learn and well worth implementing. The paper highlights users of Oracle applications, such as Oracle's PeopleSoft Enterprise applications, Siebel Customer Relationship Management (CRM) applications, and Oracle E-Business Suite. It includes a review of Oracle Database 11g improvements and new features, along with best practices from Oracle users who upgraded to Oracle Database 11g.

    Read the article

  • Lower SAP Apps Infrastructure Cost w/Oracle Database 11g

    - by john.brust
    Register today for this live webcast to learn about the #1 Database for Deploying SAP Applications. Webcast date: Tuesday, May 11, 2010, at 9:00 am PT (or your local time). Oracle Database 11g is now available for SAP applications. By upgrading your SAP applications to Oracle Database 11g Release 2, you can significantly reduce infrastructure costs and improve performance, availability, and security at the same time. Our expert guest will be Gerhard Kuppler, Oracle's Director of SAP Alliances.

    Read the article
