Search Results

Search found 13151 results on 527 pages for 'performance counters'.


  • Is sending data to a server via a script tag an outdated paradigm?

    - by KingOfHypocrites
    I inherited some old JavaScript code for a website tracker that submits data to the server using a script URL:

        var src = "http://domain.zzz/log/method?value1=x&value2=x";
        var e = document.createElement('script');
        e.src = src;

    I guess the idea was that cross-domain requests didn't have to be enabled. Also, it was written back in 2005, and I'm not sure how well XMLHttpRequest was supported at the time. Anyone could stick this on their website and send data to our server for logging, and it would ideally work in almost any browser with JavaScript. The main limitation is that all the server can do is send back JavaScript code, and each request has to wait for a response from the server (in the form of a generic acknowledgement JavaScript method call) to know it was received before sending the next one. I can't find anyone doing this online, or any metrics as to whether this is faster or more secure than XMLHttpRequest. I don't know if this is just an old way of doing things, or still the best way to send data to the server when you are mostly sending data one way and you need the best performance possible. So, in summary: is sending data via a script tag an outdated paradigm? Should I abandon it in favor of XMLHttpRequest?
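
    For reference, here is a minimal sketch of the pattern the question describes, completed so it actually fires the request. The endpoint and the logReceived acknowledgement callback are assumptions for illustration, not the original tracker's real names:

        // Sketch of the script-tag beacon: build the query string, append the
        // <script> element (which triggers the GET), and wait for the server to
        // acknowledge by calling a known global before sending the next request.
        function sendLog(values, done) {
          var pairs = [];
          for (var key in values) {
            if (values.hasOwnProperty(key)) {
              pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(values[key]));
            }
          }
          var e = document.createElement('script');
          e.src = 'http://domain.zzz/log/method?' + pairs.join('&');
          window.logReceived = function () {   // hypothetical acknowledgement callback
            document.body.removeChild(e);      // clean up the used tag
            if (done) done();                  // safe to queue the next beacon
          };
          document.body.appendChild(e);        // appending starts the request
        }

        // Usage: send the next payload only after the previous one is acknowledged.
        sendLog({ value1: 'x', value2: 'x' }, function () {
          // next sendLog(...) goes here
        });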


  • Less graphics power all of a sudden (Intel HD 3000)

    - by queueoverflow
    I have an Intel Sandy Bridge i5 with the HD 3000 graphics card. I used to be able to play Urban Terror and Nexuiz comfortably at 85 and 60 frames per second until mid-to-late October 2012, the former even on a full HD display at that frame rate. Now I get around 30 to 45 on the smaller laptop screen and around 20 to 30 on the external monitor. Did something happen to Kubuntu 12.04 so that it has less graphics performance than before? Update: I looked in the system monitor and could not detect anything being at its maximum. The four CPU cores were pretty much bored, and maybe 2 GB of the 8 GB of RAM were in use. I also ran intel_cpu_top and did not notice anything at its limit. See the output. After kernel bisecting: I have now done a kernel bisect and tried 3.2.0-23, 3.2.0-27, 3.2.0-29 and 3.2.0-30, and all had full graphics power. Interestingly, I then had full power when I booted back into the regular 3.2.0-32 kernel. This does not make sense to me …


  • OBIEE 11.1.1.6.2 BP1 patchset is released

    - by THE
    OBIEE 11.1.1.6.2 BP1 is available on My Oracle Support. The Oracle Business Intelligence 11.1.1.6.2 BP1 patchset comprises the following patches:

      - 14223977: Patch 11.1.1.6.2 BP1 (1 of 7) Oracle Business Intelligence Installer
      - 14226980: Patch 11.1.1.6.2 BP1 (2 of 7) Oracle Real Time Decisions
      - 13960955: Patch 11.1.1.6.2 BP1 (3 of 7) Oracle Business Intelligence Publisher
      - 14226993: Patch 11.1.1.6.2 BP1 (4 of 7) Oracle Business Intelligence ADF Components
      - 14228505: Patch 11.1.1.6.2 BP1 (5 of 7) Enterprise Performance Management Components Installed from BI Installer 11.1.1.6.x
      - 13867143: Patch 11.1.1.6.2 BP1 (6 of 7) Oracle Business Intelligence
      - 14142868: Patch 11.1.1.6.2 BP1 (7 of 7) Oracle Business Intelligence Platform Client Installers and MapViewer

    Note:
      - The Readme files for the above patches describe the bugs fixed in each patch, and any known bugs with the patch.
      - The instructions to apply the above patches are identical, and are contained in the Readme file for Patch 14223977.

    To obtain the patchset, navigate to http://support.oracle.com, select the "Patches & Updates" tab, and type in the patch number.


  • Would this data requirement suit a Document-Oriented database?

    - by codecowboy
    I have a requirement to let users fill in journal/diary entries per day. I want to provide a handful of known journal templates with x columns to fill in. An example might be a thought diary: a user has to record a thought in one column, describe the situation, rate how they felt, etc. The other requirement is that a user should be able to create their own diary templates. They might need a 10-column diary entry per day, and might need to rate some aspect out of 50 instead of 10. In an RDBMS, I can see this getting quite complicated. I could have individual tables for the known templates, as their fields are fixed. But for custom diary templates, I imagine I would need a table storing the custom field types (the diary columns), a table storing entries that reference their field types (custom_entries), and then a third table, custom_diary, which would store rows matching custom_entries to diaries. Leaving performance and scaling aside, would it be any simpler, or make more sense, to use a document-oriented database like MongoDB to store this data? This is for a web application which might later need an API for mobile devices.
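
    For comparison, a hedged sketch of how one user-defined template and one day's entry might look as MongoDB documents. Every field name here is illustrative, not a recommendation:

        // A user-defined template stores its own column definitions...
        var thoughtDiaryTemplate = {
          _id: "tpl-thought-diary",
          owner: "user123",
          columns: [
            { name: "thought",   type: "text" },
            { name: "situation", type: "text" },
            { name: "feeling",   type: "rating", scale: 50 }  // user-chosen scale
          ]
        };

        // ...and each entry simply mirrors the columns of its template,
        // so a 10-column custom diary needs no extra join tables.
        var entry = {
          templateId: "tpl-thought-diary",
          date: "2013-01-15",
          values: { thought: "...", situation: "...", feeling: 37 }
        };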


  • New Workstation – Lenovo W530 Core i7 32GB 256GB SSD Win8Pro

    - by Brian Lanham
    So I pretty much have my new machine up and running full-time. I will still have to hit my old workstation for some things, but I am more or less working on my new machine. It's really fast. And Bret was right: so far I'm not using all the RAM. 16 GB would have been enough, but as @CodeMonkeyJava put it, "go big or go home". Windows 8 is… interesting. So far I still seem to do most of my work in the Desktop. However, I like the Store concept and I like the Metro UX. Live tiles are also nice, and I really like how easily I can switch between Desktop and Metro. Overall, I think Microsoft has done a great job of combining the needed experience for touch and mouse. My overall Windows 8 rating is 5.9 because of the video card; otherwise I'm hitting 7.8. The system boots from cold in about 11 seconds, performs a complete shutdown in 4.7 seconds, and wakes from sleep in less than 1 second. VS 2012 starts and restarts almost instantly; in fact, I find myself staring at the start page without realizing it. Build time doesn't seem to be dramatically reduced, but it is faster. I already feel reinvigorated for work with this new machine, and I'm looking forward to the performance.


  • Create Math Game with PHP, Ajax, jQuery

    - by Sambucasun
    I am developing a website where users can create their own games, which other users can join. It's a simple maths game which fires off equations based on a specified time or count. I want that, the moment a user creates a game, it is listed in a "Current Games" section. Other users can check out the list and select a game to join. After the game is created, the creator should see a screen showing his name and display picture. Then, gradually, as others start joining the game, the list should update automatically; the same list is shown to the other users who join. Once enough users are there, I will start the game, and once the game is over, everyone is shown a summary list. I have gone through a couple of threads but could not get a clear idea: do I need to use Comet or some other technology to create such a game, or will plain PHP, Ajax, and jQuery suffice? Also, I want my website to be mobile compatible, so I am designing it in HTML5. If I build this game with just Ajax, will there be any performance issues when playing on a mobile device? I am not very experienced, so I just need guidance on what is appropriate for my requirement.
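
    For what it's worth, a minimal sketch of the plain-Ajax polling option. The /lobby.php endpoint, its JSON response shape, and the 3-second interval are all assumptions for illustration:

        // Poll a hypothetical endpoint for the current player list and redraw it.
        function pollLobby(gameId) {
          $.getJSON('/lobby.php', { game: gameId }, function (players) {
            var items = $.map(players, function (p) {
              return '<li>' + p.name + '</li>';
            });
            $('#players').html(items.join(''));
          });
        }

        // Repeat every 3 seconds; a Comet/long-polling setup would push updates
        // to the client instead of having every client poll like this.
        setInterval(function () { pollLobby(42); }, 3000);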


  • Spend Analytics on a Grand Scale

    - by jacqueline.coolidge(at)oracle.com
    The Wall St. Journal reports in "Billions in Bloat Uncovered in Beltway" that the Government Accountability Office (GAO) has released a massive study of programs and agencies that cost U.S. taxpayers billions of dollars each year. The report could help save $100 billion to $200 billion by identifying duplicate spending and ineffective programs that can be consolidated or eliminated. Now, that is spend analytics on a massive scale! It remains to be seen how actionable that information will be. Certainly, there have been studies before that identify wasteful spending. But it's a great case of the power of business intelligence and spend analytics. Many companies do find significant savings when they implement spend and procurement analytics. What makes for an excellent spend analysis? It should be:

      - Objective, providing visibility across programs and/or divisions
      - A cross-functional analysis that links financial with performance metrics
      - Prescriptive and actionable

    Spend and procurement analytics have been HOT during the economic downturn! I expect 2011 will see many more companies get serious about spend analytics, and I would love to hear from companies who are willing to share their experience.


  • Sprite batching in OpenGL

    - by Roy T.
    I've got a Java-based game with an OpenGL rendering front end that draws a large number of sprites every frame (during testing it peaked at 700). Right now the game is completely unoptimized. There is no spatial partitioning (so a sprite is drawn even if it isn't on screen), and every sprite is drawn separately like this:

        graphics.glPushMatrix();
        {
            graphics.glTranslated(x, y, 0.0);
            graphics.glRotated(degrees, 0, 0, 1);
            graphics.glBegin(GL2.GL_QUADS);
            graphics.glTexCoord2f(1.0f, 0.0f);
            graphics.glVertex2d(half_size, half_size); // upper right
            // same for upper left, lower left, lower right
            graphics.glEnd();
        }
        graphics.glPopMatrix();

    Currently the game runs at around 25 FPS and is CPU bound. I would like to improve performance by adding spatial partitioning (which I know how to do) and sprite batching. Not drawing sprites that aren't on screen will help a lot; however, since players can zoom out, it won't help enough, hence the need for batching. But sprite batching in OpenGL is a bit of a mystery to me. I usually work with XNA, where a few classes to do this are built in, but in OpenGL I don't know what to do. As for further optimization, the game I'm working on has a few interesting characteristics: a lot of sprites share the same texture, and all the sprites are square. Maybe these characteristics will help determine an efficient batching technique?
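
    As a starting point, here is a hedged batching sketch in the same immediate-mode style the snippet uses: rotate and translate the corners on the CPU and emit every sprite between a single glBegin/glEnd pair, eliminating the per-sprite matrix push/pop. The Sprite fields are assumptions, and a fuller solution would move on to vertex arrays or VBOs:

        // Assumes all sprites share one texture (bound once, outside the loop)
        // and that each sprite exposes x, y, degrees and halfSize.
        graphics.glBegin(GL2.GL_QUADS);
        for (Sprite s : sprites) {
            double cos = Math.cos(Math.toRadians(s.degrees));
            double sin = Math.sin(Math.toRadians(s.degrees));
            double h = s.halfSize;
            // corner offsets before rotation, matching the original winding
            double[][] corners = { { h, h }, { -h, h }, { -h, -h }, { h, -h } };
            float[][]  uv      = { { 1f, 0f }, { 0f, 0f }, { 0f, 1f }, { 1f, 1f } };
            for (int i = 0; i < 4; i++) {
                double cx = corners[i][0], cy = corners[i][1];
                graphics.glTexCoord2f(uv[i][0], uv[i][1]);
                // rotate on the CPU, then translate to the sprite's position
                graphics.glVertex2d(s.x + cx * cos - cy * sin,
                                    s.y + cx * sin + cy * cos);
            }
        }
        graphics.glEnd();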


  • Simplifying data search using .NET

    - by Peter
    A tutorial on the asp.net site gives an example of using LINQ to build a search feature on a music album site using MVC. The code looks like this:

        public ActionResult Index(string movieGenre, string searchString)
        {
            var GenreLst = new List<string>();
            var GenreQry = from d in db.Movies
                           orderby d.Genre
                           select d.Genre;
            GenreLst.AddRange(GenreQry.Distinct());
            ViewBag.movieGenre = new SelectList(GenreLst);

            var movies = from m in db.Movies select m;
            if (!String.IsNullOrEmpty(searchString))
            {
                movies = movies.Where(s => s.Title.Contains(searchString));
            }
            if (!string.IsNullOrEmpty(movieGenre))
            {
                movies = movies.Where(x => x.Genre == movieGenre);
            }
            return View(movies);
        }

    I have seen similar examples in other tutorials, and I have tried them in a real-world business app that I develop and maintain. In practice this pattern doesn't seem to scale well, because as the search criteria expand I keep adding more and more conditions, which looks and feels unpleasant and repetitive. How can I refactor this pattern? One idea I have is to create a "searchable" column in every table, which could be a computed column that concatenates the data from the different columns (SQL Server 2008). So instead of filtering on movie genre and title it would be something like:

        if (!String.IsNullOrEmpty(searchString))
        {
            movies = movies.Where(s => s.SearchColumn.Contains(searchString));
        }

    What are the performance/design/architecture implications of doing this? I have also tried using procedures with dynamic queries, but then I have just moved the ugliness to the database. E.g.:

        CREATE PROCEDURE [dbo].[search_music]
            @title as varchar(50),
            @genre as varchar(50)
        AS
            -- set the variables to null if they are empty
            IF @title = '' SET @title = null
            IF @genre = '' SET @genre = null
            SELECT m.*
            FROM view_Music as m
            WHERE (title = @title OR @title IS NULL)
              AND (genre LIKE '%' + @genre + '%' OR @genre IS NULL)
            ORDER BY Id desc
            OPTION (RECOMPILE)

    Any suggestions? Tips?
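
    One common refactoring, sketched here against the question's own model (the shape is an assumption, not the tutorial's code): treat each optional criterion as a deferred IQueryable transform, so adding a criterion means adding one list entry rather than another if-block:

        // Inside the Index action; requires System, System.Collections.Generic,
        // and System.Linq. LINQ defers execution, so the database still sees a
        // single composed query when the result is enumerated.
        var filters = new List<Func<IQueryable<Movie>, IQueryable<Movie>>>();

        if (!String.IsNullOrEmpty(searchString))
            filters.Add(q => q.Where(m => m.Title.Contains(searchString)));
        if (!string.IsNullOrEmpty(movieGenre))
            filters.Add(q => q.Where(m => m.Genre == movieGenre));
        // each new search criterion is one more Add(...) here

        IQueryable<Movie> movies = db.Movies;
        foreach (var f in filters)
            movies = f(movies);
        return View(movies);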


  • What's the best way to use requestAnimationFrame and fixed frame rates

    - by m90
    I recently got into using the HTML5 requestAnimationFrame API a lot on animation-heavy websites, especially after seeing the Jank Busters talk. This seems to work pretty well and really improve performance in many cases. Yet one question still persists for me: when you want to use an animation that is NOT entirely calculated (think spritesheets, for example), you have to aim for a fixed frame rate. Of course one could go back to using setInterval again, but maybe there are other ways to tackle this. The two ways I could think of to use requestAnimationFrame with a fixed frame rate are:

        var fps = 25; // frames per second
        function animate(){
            // actual drawing goes here
            setTimeout(function(){
                requestAnimationFrame(animate);
            }, 1000 / fps);
        }
        animate();

    or

        var fps = 25; // frames per second
        var lastExecution = new Date().getTime();
        function animate(){
            var now = new Date().getTime();
            if ((now - lastExecution) > (1000 / fps)){
                // do actual drawing
                lastExecution = new Date().getTime();
            }
            requestAnimationFrame(animate);
        }
        animate();

    Personally, I'd opt for the second option (the first one feels like cheating), yet it seems to be more buggy in certain situations. Is this approach really worth it (especially at low frame rates like 12.5)? Are there things to be improved? Is there another way to tackle this?
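
    One more variant worth sketching (an addition for illustration, not from the question): keep requestAnimationFrame driving the loop but accumulate elapsed time, so a late frame advances the spritesheet by however many fixed steps it missed instead of drifting:

        var fps = 25;                    // fixed spritesheet rate
        var frameDuration = 1000 / fps;
        var accumulator = 0;
        var lastTime = new Date().getTime();

        function animate() {
            var now = new Date().getTime();
            accumulator += now - lastTime;
            lastTime = now;
            while (accumulator >= frameDuration) {
                // advance to the next spritesheet frame here (may run twice
                // after a slow frame, so the animation catches up)
                accumulator -= frameDuration;
            }
            // draw the current frame here
            requestAnimationFrame(animate);
        }
        animate();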


  • So You Want To Build a SPARC Cloud

    - by user12601629
    Did you ever wish you could get the industrial-strength power of UNIX/RISC with the flexibility of cloud computing? Well, now you can! With recent advances from Oracle, it's possible to build an incredibly high-performance, flexible, available virtualized infrastructure based on Solaris and SPARC. Here's the recipe! Authored in collaboration across the Oracle "Systems Group" team, we now have a complete best practice guide for you. Click below to download it: Best Practices for Building a Virtualized SPARC Computing Environment. Inside you'll find recommendations for how and when to leverage technologies like:

      - SPARC T4
      - OVM for SPARC hypervisor (version 2.2 and newer)
      - Solaris 11
      - Ops Center 12c
      - ZFS Storage Appliance
      - Oracle network switches

    By following these best practices, you'll be able to construct a dynamic, virtualized infrastructure that allows for:

      - Easy, GUI-based provisioning of new VMs
      - Automated HA failover in the event of physical server failures
      - Automatic load balancing across a cluster of VM hosts
      - Complete end-to-end monitoring

    You should download this paper and check it out. Even if you aren't planning on buying all new hardware, and instead want to transform some existing gear into a dynamic virtualized environment, this paper will give you concrete info on what to do and the trade-offs you'll make. Have fun getting started on your journey to build a SPARC cloud!


  • AngularJS dealing with large data sets (Strategy)

    - by Brian
    I am working on developing a personal temperature-logging viewer based on my Raspberry Pi curling data into my web server's API. Temperatures are taken every 2 seconds, and I can have several temperature sensors posting data, so needless to say I will have a lot of data to handle even within the scope of an hour. I have implemented a very simple paging API on the server so the server doesn't time out; it currently returns data in units of 1,000 per call and pages through it. My idea was to initially show, say, the last 20 minutes of data from a sensor (or all sensors, depending on user choices), then allow the user to select other timeframes to show data from. The issue comes in when you want to view all sensors or an extended time period (say 24 hours). Is there a best practice for handling this large amount of data? Would it be useful to load those first 20 minutes into the live view and then cache something like the last 24 hours into local storage? I haven't been able to find a decent example of this in use, even though there are many ways to approach the problem. I am just looking for suggestions on what might strike a good balance between good performance and not caching the entire data set on the client side (as beyond a week of data this might not be feasible).
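
    A hedged sketch of the load-recent-then-cache idea in AngularJS: the endpoint, the paging parameters, and the cache keys are all assumptions, and $http is Angular's injected HTTP service:

        // Fetch one page of readings, preferring a localStorage copy when present.
        function loadPage(sensor, page, callback) {
          var key = 'readings:' + sensor + ':' + page;
          var cached = localStorage.getItem(key);
          if (cached) {
            callback(JSON.parse(cached));    // serve from the client-side cache
            return;
          }
          $http.get('/api/readings', { params: { sensor: sensor, page: page, limit: 1000 } })
            .success(function (data) {
              try {
                localStorage.setItem(key, JSON.stringify(data));
              } catch (e) {
                // quota exceeded: skip caching, still render the data
              }
              callback(data);
            });
        }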


  • Efficient algorithm for Virtual Machine(VM) Consolidation in Cloud

    - by devansh dalal
    PROBLEM: We have N physical machines (PMs), each with RAM Ri and CPU Ci, and a set of currently scheduled VMs, each with RAM requirement ri and CPU requirement ci respectively. Moving (migrating) any VM from one PM to another has an associated cost which depends on its RAM ri. A PM with no VMs is shut down to save power. Our target is to minimize the weighted sum of N and the migration cost, i.e. minimize the number of working PMs while not degrading the service level through excessive migrations. My approach: the brute-force approach is to choose the minimum-loaded PM and try to fit its VMs onto other PMs with the First Fit Decreasing algorithm; alternatively, we can select victim PMs and target PMs based on their load level and shut down victims where possible by moving their VMs to targets. I tried this greedy approach on data from Baadal (the IIT-D cloud), but it isn't giving promising results. I have also tried to study ant colony optimization for dynamic VM consolidation but was unable to understand very much. These are the links I used: http://dumas.ccsd.cnrs.fr/docs/00/72/52/15/PDF/Esnault.pdf and http://hal.archives-ouvertes.fr/docs/00/72/38/56/PDF/RR-8032.pdf. Would anyone please clarify the solution or suggest a new approach/resources for better performance? I am basically looking for the algorithms, not the physical optimizations, and I also know that many commercial organizations provide these solutions; I just want to know more about the underlying algorithms. Thanks in advance.
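
    For reference, a minimal sketch of the First Fit Decreasing placement step that the brute-force approach mentions (the data shapes are illustrative):

        // Try to drain one victim host: place each of its VMs, largest RAM first,
        // onto the first other host with enough free RAM and CPU.
        function firstFitDecreasing(vms, hosts) {
          var sorted = vms.slice().sort(function (a, b) { return b.ram - a.ram; });
          var placement = [];
          for (var i = 0; i < sorted.length; i++) {
            var vm = sorted[i], target = null;
            for (var j = 0; j < hosts.length; j++) {
              if (hosts[j].freeRam >= vm.ram && hosts[j].freeCpu >= vm.cpu) {
                target = hosts[j];             // first fit wins
                break;
              }
            }
            if (!target) return null;          // victim cannot be fully drained
            target.freeRam -= vm.ram;
            target.freeCpu -= vm.cpu;
            placement.push({ vm: vm, host: target });
          }
          return placement;                    // migrations to perform
        }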


  • Best method to implement a filtered search

    - by j0N45
    I would like to ask your opinion on how best to implement a filtered search form. Let's imagine the following case: one big table with lots of columns (it might be important to say that this is SQL Server). You need to implement a form to search data in this table, and in this form you'll have several check boxes that allow you to customize the search. Now my question is: which of the following would be the best way to implement the search?

      - Create a stored procedure with a query inside. The stored procedure checks whether each parameter is given by the application, and if one is not given, a wildcard is put into the query.
      - Create a dynamic query that is built according to what is given by the application.

    I am asking this because I know that SQL Server creates an execution plan for a stored procedure in order to optimize its performance. However, by building a dynamic query inside the stored procedure, do we sacrifice the optimization gained from the execution plan? Please tell me what would be the best approach in your opinion.
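
    For illustration, a sketch of the first option written as a single catch-all query (the table and columns are made up); the OPTION (RECOMPILE) hint asks SQL Server to compile a fresh plan for the actual parameter values on each call, which is the usual answer to the plan-reuse worry:

        -- Each predicate collapses away when its parameter is NULL, and
        -- RECOMPILE trades a little compile time for a value-specific plan.
        CREATE PROCEDURE dbo.search_big_table
            @name varchar(50) = NULL,
            @city varchar(50) = NULL
        AS
        BEGIN
            SELECT *
            FROM dbo.big_table
            WHERE (name = @name OR @name IS NULL)
              AND (city = @city OR @city IS NULL)
            OPTION (RECOMPILE);
        END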


  • How "commercially savvy" should software developers be? [closed]

    - by mattnz
    I have been watching answers to many questions on this site and have come to the conclusion that commercial pragmatism does not factor into many software development discussions. As a result, I seriously question the commercial skills within the industry, specifically the ability to deliver projects on time and to a budget. I see no indication from the site that commercially successful project delivery is a serious concern, yet the industry has a reputation for poor performance in this. Rarely, if ever, does the cost of time factor into discussions. I have never seen concepts such as opportunity cost, time to market, competitive advantage, or cash flow mentioned, let alone discussed, in technical answers to questions. How can you answer virtually any question without understanding the commercial background against which it is asked? Even open-source projects need to operate efficiently and deploy their limited resources to provide the most value for effort. Typically, small start-ups have cash-flow issues that outweigh longevity concerns, yet they are typically still advised to build for a future they probably won't have if they do. Is it fair to say that these problems are solely the managers' and project managers' to solve, or are we, as developers, also responsible for ensuring the successful on-time, within-budget delivery of projects, even if those budgets do not allow us to achieve engineering excellence?


  • Successful with References

    - by A&C Redaktion
    Successfully completed projects are the best advertising! Take, for example, the customer success story featuring the DAW-Gruppe, maker of the well-known "Alpinaweiß" brand and many other paints, together with Oracle partner Freudenberg IT. Read how DAW used Oracle Database Enterprise Edition 11g and Oracle Advanced Compression to increase storage performance by 40% and achieve up to 20% more database performance. Would you like a success story for one of your completed customer projects, too? Here's how: simply send a few key facts about the project, provided your end customer agrees, to Marion Aschenbrenner; Oracle takes care of everything else, including the funding! Please include the following details:

      - Customer name & contact person (with phone and email address)
      - Oracle products used
      - Short description of the project (3-4 sentences covering the main benefit of the Oracle solution)
      - Your in-house contact for the customer project, plus anything else that is important to know (go-live date, suggested dates, etc.)

    Once your nomination passes review, you and the end customer will be interviewed by phone and a detailed success story will be written. Naturally, you and your customer will receive it for approval before publication. The stories are published here on the blog and on other social media, among other channels.


  • A Myriad of Options

    - by Mark Hesse
    I am currently working with a customer that is close to outgrowing their Exadata X2-2 half rack in both compute and storage capacity.  The platform is used for one of their larger data warehouse applications and the move to Exadata almost two years ago has been a resounding success, forcing them to grow the platform sooner than anticipated. At a recent planning meeting, we started looking at the options for expansion and have developed five alternatives, all of which meet or exceed their growth requirements, yet have different pros and cons in terms of the impact to their production and test environments. The options include an in-rack upgrade to a full rack of Exadata using the recently released X3-2 platform (an option that even applies to an older V2 rack), multi-rack cabling the existing X2-2 to another full rack or half rack X2-2 (and utilizing both compute and storage capacity in the other rack), or simply adding a new X3-2 half rack (and taking advantage of the added compute and flash performance in the X3-2). While the decision is yet to be made, it had me thinking that one of the benefits of Exadata over a traditional database deployment is that when the time comes to expand the platform, there are a myriad of options.


  • Scala - learning by doing

    - by wrecked
    Coming from the PHP framework symfony (with Apache/MySQL), I would like to dive into the Scala programming language. I have already followed a few tutorials and had a look at Lift and Play, and Java isn't a stranger to me either. However, the past has shown that it's easiest to learn things by just doing them. Currently we have a little, mostly Ajax-driven, application built on symfony at my company. My idea is to build a little project similar to this one, which might go into production in the future. The app should feature:

      - High scalability and performance
      - A backend server
      - A web interface and a GUI client

    Plenty of questions arise when I think about this. First of all: what's the best way to achieve an easy-to-maintain, structured base for it? Would it be best to establish socket-based communication between a pure-Scala server and client and access that data via Lift, or is building a Lift app that serves as the server and connecting the GUI client (via REST?) the better way? Furthermore, I wonder which database to choose. I'm familiar with (My)SQL but feel like a fool being confronted with all these things like NoSQL, MongoDB, and more. Thanks in advance!


  • Acer Aspire V3-771G Ubuntu 13.04

    - by Jos
    Good day. I have this Acer and a lot of boot problems (I suspect Windows 8), and now I want to try Ubuntu. But when I use a USB to "try" Ubuntu, I get a black screen after boot. I've read some of the forums and found something about nomodeset; I have not tried this, as I don't know exactly what it does. I have also found this wiki entry: https://wiki.ubuntu.com/Bumblebee . I am by no means a programmer, and reading all those commands has always kept me off Linux, because I'm scared I will mess things up. Is there any way I can set nomodeset in the Ubuntu trial, and can I also include the Bumblebee features? And how will this affect my laptop's performance? Reading the Bumblebee entry, it seems to be about NVIDIA Optimus; I don't care much about the power saving, but will it affect performance? I'm not a heavy PC gamer, but I like to do some gaming and streaming, also on a rather big TV, on which this laptop already shows its flaws, with some games not running properly at 65 inches. If this doesn't work, or you advise me not to do it, what else can I do to fix Windows 8, or should I try some other Linux version? Thank you in advance.
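
    For reference, this is roughly how nomodeset is set for a single live-session boot; the exact menu varies by release, so treat the path below as typical rather than exact:

        # At the live USB menu, highlight "Try Ubuntu" and press e to edit the
        # boot entry, then append nomodeset to the line starting with "linux":
        linux /casper/vmlinuz boot=casper quiet splash nomodeset --
        # Press F10 (or Ctrl+X) to boot. The change lasts for this one boot only,
        # so nothing is permanently altered.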


  • Why does my flash video get stuck? (not lag?)

    - by Sk606
    After moving to Ubuntu 10.10 Netbook Edition, I've noticed a strange problem that manifests itself most obviously with streaming videos like YouTube and Hulu. The video will load and begin playing fine, then eventually stop, as though it is buffering; it usually happens between 40 and 60 seconds into the clip. The loading indicator, however, shows that it isn't actually buffering any more of the data; it will just sit there. If I force the clip forward (e.g. click to start the movie somewhere ahead of the stall point), the clip will usually reload quickly and begin playing as expected. Because of the quick load times and the consistent stopping point, I don't think the problem is related to hardware performance or network throughput. Also, I have none of these problems in 10.04; rebooting into 10.04 makes the problem disappear. However, one more clue: I have also noticed a higher occurrence of web pages that simply stop loading; say, 1 in 10 clicks while browsing, I have to hit reload to get the page to load, because it timed out partway through. This seems strangely related to the video problem, though it is a lot less infuriating. Help! Where do I begin to look with a problem like this?


  • Problems with ATI proprietary driver in Ubuntu 12.10, window trails and screen does not update

    - by crystallero
    I have problems with the ATI proprietary driver (fglrx). I have an iMac (mid 2011) with a Radeon HD 6900M [1002:6720]. I did not have any problems under Ubuntu 12.04, but since I updated to 12.10 I get some annoying graphics corruption. The worst is that sometimes the screen does not update with new information. It happens a lot when I change between tabs in Chrome or Sublime Text; it usually gets updated when I scroll the page. Sometimes, when I type, I have to wait a little bit to see the new characters. And I get trails when I move windows too (like a part of the window); after a while, the trail disappears. I tried installing fglrx, fglrx-updates, and the new beta driver downloaded from ATI (12.11 Beta, 11/16/2012), with no luck: the same thing happens with all of them. I tried to mess with the Compiz config, but it didn't fix anything. The open-source driver does not suffer from this problem, but I need the performance of the proprietary driver. Do you have a clue? Thanks.


  • Database Insider - December 2012 issue

    - by Javier Puerta
    The December issue of the Database Insider newsletter is now available. (Full newsletter here)

      - Big Data: From Acquisition to Analysis. 2012 will likely be remembered as the year of big data, as a new generation of technologies enables organizations to acquire, organize, and analyze the exponentially growing and typically less-structured data generated from a variety of new sources. Oracle has produced a series of five short videos that offer a quick and compelling high-level introduction to big data.
      - Total Cost of Ownership Comparison: Oracle Exadata vs. IBM P-Series. Read the research that found that, over three years, the IBM hardware running Oracle Database cost 31 percent more in total cost of ownership than Oracle Exadata.
      - Webcast: Oracle Exadata Database Machine X3. Learn about Oracle's next-generation database machine, Oracle Exadata X3, which combines massive memory and low-cost disks to deliver the highest performance at the lowest cost. Available in an eight-rack configuration, it allows you to start small and grow.
      - Maximum Availability with Oracle GoldenGate. Discover how to eliminate not only unplanned downtime but also planned downtime resulting from database upgrades, migrations, and consolidation. Thursday, December 13, 19:00 CET / 6 pm UK.


  • Appcache and jquery mobile on a CMS powered site?

    - by user793011
    Has anyone used the cache manifest to make a CMS site work offline? I've made a demo with static HTML files which seems to work fine, so I'm assuming it wouldn't be too hard to achieve the same thing with a CMS. The way you tell browsers that files have changed (and so need to be downloaded again) is by adding a comment to the cache manifest file so its byte size changes. I'm not quite sure how to do this with a CMS, but maybe some sort of server cron could run periodically? Personally, I'm more interested in having a site that works offline than in achieving ideal performance, so if the file were modified every hour rather than when content actually changed, that would be fine for me. If anyone has used appcache with a CMS, has anyone done so with jQuery Mobile at the same time? What I'm after is a fully native feel for a site that's accessible offline; in other words, I want to mimic a native app. My static demo does this perfectly with jQuery Mobile, so again I would have thought this would be achievable in a CMS.
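
    For illustration, a minimal cache manifest of the kind described, with the version comment that a server cron or CMS save-hook would rewrite to force a re-download (the file names are made up):

        CACHE MANIFEST
        # version 2012-11-05-01  (bump this comment to invalidate the cache)

        CACHE:
        index.html
        css/mobile.css
        js/jquery.mobile.js

        NETWORK:
        *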


  • Extremely slow desktop and laggy. Need help with graphics driver

    - by user171624
    I am a fresh Ubuntu newbie and I just installed my first Ubuntu 13.04 onto my HP Slate 2. I ran a live CD from my USB drive and installed everything perfectly fine: nice and smooth, not a trace of lag. Then I rebooted into Ubuntu itself on the computer, and it was extremely slow and laggy. Icons and buttons don't trigger right away, and the performance of the entire thing looks like somewhere between 0.25 and 1 fps. My HP Slate 2 information:

      - Processor: Intel Atom Z670, 1.5 GHz
      - Memory (RAM): 2.0 GB
      - Video card: Intel GMA 600 (PowerVR SGX535)
      - Solid-state drive (SSD): 32 GB

    I tried installing the Intel Linux graphics driver, and it failed to install because it said I don't have any Intel-based graphics card. Well, I do, as you see above. What can I do? I can't get on the internet on it; I'm using my primary computer (Windows 7) to do all the searching and to put the files onto the USB to move them over to my tablet. I simply don't get it: using the live CD on USB, it was all nice and smooth; then, after the installation, boom! Slow, laggy, etc. Can anyone help me? Thanks!

