Search Results

Search found 5887 results on 236 pages for 'cpu temperature'.

Page 166/236 | < Previous Page | 162 163 164 165 166 167 168 169 170 171 172 173  | Next Page >

  • How to build the mainline kernel source package?

    - by Maxime R.
    Ubuntu's kernel PPA only provides linux-headers*.deb and linux-image*.deb packages. How can I build the corresponding linux-source*.deb package?

    Context: I'm currently running Ubuntu 11.10 with the mainline kernel (3.2-rc6 at the moment) to get better support for the Sandy Bridge IGP in my Dell E6420 laptop (Intel i5-2520M CPU). I'd like to install this touchpad driver, since ALPS touchpads are badly supported (see the bug report in the previous link), while waiting for upstream support in kernel 3.3. The problem is that DKMS keeps complaining about not finding the full kernel source: "Module build for the currently running kernel was skipped since the kernel source for this kernel does not seem to be installed." I may not actually need the full source, but I'd still like to have it installed to see if it solves my problem.

    What I tried:
    - Uncompressing the kernel.org source archive into /usr/src/. DKMS still complained.
    - Manually updating the kernel source package with uupdate and the mainline source package, as explained here. That did not succeed.
    - Manually building the linux-source package following @roadmr's and @elmicha's instructions. I eventually managed to build it, but DKMS still complained about the missing source.

    At last I noticed an error I had missed while reinstalling the kernel headers: the .deb I had downloaded appears to have been corrupted, and downloading it again did the trick :) Alas, once DKMS agreed to compile the module, I ran into the following error, which appears to have already been reported. That issue isn't solved yet, but I won't pursue it, because in the end I decided to test the Precise kernel (3.2-rc6) through the xorg-edgers PPA, which appears to be correctly patched: it works. Nevertheless, it might still be useful to know how to build the mainline linux-source package, as the Ubuntu Kernel Team doesn't provide it. Not to mention that I learned a lot in the process ^^
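    A minimal sketch of one way to build installable kernel .debs straight from an unpacked mainline tree (assumptions: build dependencies are installed and the tree is a mainline tarball from kernel.org; note that make deb-pkg produces linux-image and linux-headers packages rather than an Ubuntu-style linux-source package, but the headers are usually what DKMS compiles against):

        # run inside the unpacked mainline tree, e.g. linux-3.2-rc6/
        sudo apt-get install build-essential fakeroot libncurses5-dev
        cp /boot/config-$(uname -r) .config    # start from the running kernel's configuration
        yes "" | make oldconfig                # accept defaults for any new options
        fakeroot make -j"$(nproc)" deb-pkg     # builds linux-image-*.deb and linux-headers-*.deb in ..
        sudo dpkg -i ../linux-image-3.2*.deb ../linux-headers-3.2*.deb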

    Read the article

  • Accessing second hard drive

    - by Jonathan
    Hi, I recently installed Ubuntu 10.10 64-bit on my computer. I installed it on my 60 GB SSD, and during the installation it never acknowledged the existence of my second hard drive. That drive, which holds all my files and which I'd like to make my home folder if I can, is a Western Digital Caviar Black 1TB SATA 6Gb/s 64MB cache (WD1002FAEX). I've read https://help.ubuntu.com/community/Mount but honestly cannot work out how to access the drive from my Ubuntu installation. I did have Windows 7 64-bit prior to installing Ubuntu. I have backed up all the files on the drive, but if I could just access them straight off that would be super cool. Does anyone know how I can use the second hard drive? Thank you for your help.

    EDIT: The following directories are currently in my /dev/ folder: ati/, block/, bsg/, bus/, char/, cpu/, disk/, input/, mapper/, net/, pktcdvd/, pts/, shm/, snd/, and usb/

    EDIT: Result from sudo fdisk -l:

        Disk /dev/sda: 60.0 GB, 60022480896 bytes
        255 heads, 63 sectors/track, 7297 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x000d2dfd

           Device Boot      Start         End      Blocks   Id  System
        /dev/sda1   *           1        6994    56174592   83  Linux
        /dev/sda2            6994        7298     2438145    5  Extended
        /dev/sda5            6994        7298     2438144   82  Linux swap / Solaris
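    For reference, a minimal sketch of mounting a second data disk once it shows up in sudo fdisk -l (the output above only lists /dev/sda, so the device name /dev/sdb1 below is an assumption, as is the NTFS filesystem left over from Windows 7):

        sudo fdisk -l                                  # the 1TB drive should appear as a second disk, e.g. /dev/sdb
        sudo mkdir -p /media/data
        sudo mount -t ntfs-3g /dev/sdb1 /media/data    # one-off mount; the files appear under /media/data
        # to mount it automatically at every boot, add a line like this to /etc/fstab:
        #   /dev/sdb1  /media/data  ntfs-3g  defaults  0  0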

    Read the article

  • Using EC2 instance as main development platform

    - by David
    My problem: I work as a consultant for various companies. Each company provides me with a laptop with their software on it, and I also have my own, where I keep my development environment. I tend to buy a new laptop every second year and find myself spending lots of time configuring and installing software. I also sometimes spend a lot of time waiting for my laptop to process things. To solve all these issues, I am now considering using EC2 (running Windows instances) as my main development platform and just accessing it from whatever PC I happen to be at. I calculated that running a High-CPU On-Demand Instance (medium) for 8 hours a day for a year costs me about $580, which is acceptable. I imagine that as I approach the workplace each day, I will make a single tap on my phone to fire up the instance, so it is ready when I get to work. I could have different icons on my phone to fire up the various instance types. The same software would of course be loaded automatically on the various hardware (sometimes I would even need the instance with 68.4 GB of memory). Another advantage is that if I am having a specific problem with my instance, I could fire up another instance and have someone look into the problem and update the image. My question: does anyone have experience with such a setup on EC2? What kind of problems do you foresee?
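    A rough sketch of the start/stop automation described, using the AWS command line tools (the instance ID is a placeholder, and the same commands can sit behind a one-tap shortcut on a phone):

        aws ec2 start-instances --instance-ids i-0abc1234        # placeholder instance ID
        aws ec2 wait instance-running --instance-ids i-0abc1234  # block until the box is up
        aws ec2 describe-instances --instance-ids i-0abc1234 \
            --query 'Reservations[0].Instances[0].PublicDnsName' --output text   # address to RDP into
        # ...at the end of the day, stop paying for compute:
        aws ec2 stop-instances --instance-ids i-0abc1234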

    Read the article

  • How to reduce the fan noise and how to increase battery life in ubuntu 11.10/12.04?

    - by mehdi
    I have a brand new Sony Vaio S series laptop (VPCSA2DGX). It came factory-installed with Windows 7 Professional 64-bit and runs an Intel Core i5, 500 GB HDD, 4 GB RAM. First I installed Ubuntu 11.10 64-bit alongside Windows to dual boot. Later, since the problem was not solved, I installed Ubuntu 12.04 64-bit alongside Windows instead. However, the problem keeps annoying me.

    Problem: when running Ubuntu 11.10/12.04, the battery lasts only about 1.5 hours, the fan runs loud and continuously, and a lot of heat is generated, even though System Monitor shows less than 5% CPU usage. My laptop has hybrid graphics, and I tried turning the AMD graphics card off and keeping the Intel one on, but I cannot get the fan noise or heat to go away, and consequently the battery drain continues. BTW, in Windows the laptop gives 4-5 hours of battery power, the fan is silent and there is no heat problem. Any ideas on how to reduce the fan noise and increase battery life in Ubuntu 11.10/12.04?
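    On hybrid-graphics laptops of this era, the usual way to confirm the discrete AMD GPU is actually powered down is the vgaswitcheroo interface; a minimal sketch (assuming the open radeon driver is in use and debugfs is mounted, which is the 12.04 default):

        sudo cat /sys/kernel/debug/vgaswitcheroo/switch              # shows which GPU is active and which is powered
        echo OFF | sudo tee /sys/kernel/debug/vgaswitcheroo/switch   # power down the inactive (discrete) GPU
        sudo apt-get install powertop && sudo powertop               # see what else is waking the CPU and draining the battery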

    Read the article

  • How can I tell GoogleBot that a subdirectory is now a subdomain?

    - by cwd
    I had about a million pages of a catalog indexed under a subdirectory, and that content has now moved to a subdomain. GoogleBot is crawling each of those pages and getting a 301 redirect to the new location. Even though the redirect rule is set up in the Apache sites-enabled configuration file (i.e. Apache issues the redirect early on, before PHP is even loaded), the server isn't handling the load well. GoogleBot is making around 5 requests per second, and on top of my normal traffic that hikes up the CPU for a few hours at a time. I checked Webmaster Tools and the corresponding documentation for a way to let Google know that the content had moved from a subdirectory to a subdomain, but with little luck: basically the most helpful thing I found said to just send 301 headers for the new location. How can I tell GoogleBot that a subdirectory is now a subdomain? If that is not an option, how can I send 301 redirects for a particular subdomain more efficiently? I was thinking perhaps of the Nginx server, but I'm not sure that I can run both Apache and Nginx side by side on port 80 for different subdomains.
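    A minimal sketch of the kind of early Apache-level redirect described (the directory, hostname and file name are placeholders); mod_alias answers the 301 before PHP is ever invoked:

        echo 'RedirectMatch permanent ^/catalog/(.*)$ http://catalog.example.com/$1' \
            | sudo tee /etc/apache2/conf.d/catalog-redirect.conf
        sudo service apache2 reload
        # GoogleBot now gets the 301 from mod_alias with no application code in the path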

    Read the article

  • Retrofit Certification

    - by Bill Evjen
    Impact of Regulations on Cabin Systems Installation (John Courtright, Structural Integrity Engineering)

    There is "heightened" FAA attention to technical issues related to IFE and Wi-Fi system installations, driven by the Aging Aircraft Safety Rule (EWIS and Damage Tolerance Analysis). The challenge: maximize flight safety while minimizing costs, through issue papers and testing, testing, testing. Airworthiness Directives (ADs) shape the design of many IFE systems and all antenna systems. The goal is safety AND cost-effective maintenance intervals and inspection techniques.

    The STC Process Briefly Stated
    - Type Certifications (TC) and Supplemental Type Certifications (STC).
    - The STC process is driven by a Project Specific Certification Plan (PSCP), managed by the FAA Aircraft Certification Office (ACO), and depends on the type of project (electrical/mechanical systems or structural), the specific type of aircraft being modified, the schedule, and the design and installation location.

    What does the STC Plan (PSCP) cover?
    - System description: what does the system do?
    - System qualification: are the components qualified?
    - Certification requirements: which FARs are applicable?
    - Installation detail: what is being modified?
    - Prototype installation: what is new?
    - Functional Hazard Assessment (FHA): is it safe?
    - EZAP/EWIS requirements: any aging-aircraft issues?
    - Certification data: how is compliance achieved?
    - Delegation and FAA involvement: who is doing the work?
    - Proposed certification schedule: when is the installation?
    - Certification documentation: what the FAA expects to see.

    Cabin Systems Certification Concerns
    In addition to meeting the requirements of DO-160, cabin system certification needs to address power management: IFE and Wi-Fi systems are generally classified as "non-essential equipment" from a certification viewpoint, are connected to non-essential power buses, and must be sheddable in a smoke/fire event or other electrical emergency (FAA Policy 00-111-160). The FAA has become more relaxed about Wi-Fi testing: it used to require around 150 seats with laptops running Wi-Fi, but that is now down to around 50. Aging-aircraft concerns (electrical and structural) are addressed in issue papers covering "Structural Certification Criteria for Large Antenna Installations" and antenna "Vibration/Buffeting Compliance Criteria".

    DO-160: Environmental Test Procedures
    DO-160, "Environmental Conditions and Test Procedures for Airborne Equipment", issued by RTCA, provides guidance to equipment manufacturers on testing requirements: temperature (-40C to +55C), vibration and shock, contaminant susceptibility (fluids and dust), and electromagnetic interference. Cabin systems are generally classified as "non-essential". Swissair 111 crashed (in part) due to non-standard wiring practices.

    EWIS Design Implications
    Installation design must take EWIS requirements into account. This generally means:
    - Aircraft surveys are needed to identify proper wire routing.
    - Ensure existing wiring diagrams are correct.
    - Identify primary/secondary/tertiary bus locations.
    - Verify that proper separation of wire bundles exists, including the required separation from the fuel quantity indicator system (FQIS) to prevent fuel tank ignition.
    - Perform an Enhanced Zonal Analysis Procedure (EZAP). EZAP was developed by the Aging Transport Systems Rulemaking Advisory Committee (ATSRAC) and is the method for analyzing airplane zones with an emphasis on evaluating wiring systems and the existence of combustibles in the cabin.

    Certification Considerations for Wi-Fi Systems
    - Electrical: all existing DO-160 testing is required, plus issue papers and onboard EMI testing (is there any interference with aircraft systems when multiple Wi-Fi users are logged on?).
    - Vibration/buffeting compliance criteria: what is the effect of the antenna on aircraft flight characteristics?
    - Structural certification criteria: what are the stress loads on the aircraft at the antenna location, and what is the impact on maintenance inspection criteria for the airline? Damage tolerance analysis is required.
    - Goal: minimize maintenance inspection intervals.

    Read the article

  • Internet is far slower in Ubuntu than Windows 7 on dual-booted machine

    - by Tim
    Edit: I'll leave the original post as-is, but after further investigation, it appears that the problem is something to do with my wi-fi card. Speeds are normal when I connect via cable.
    Edit 2: Problem solved. It was something to do with the wireless card drivers.

    I normally use Windows 7 on my laptop and have internet speeds of about 15-20 Mb/s. I have recently dual-booted with Ubuntu 12.10 and have noticed that internet speeds are drastically slower in Ubuntu. When tested, speeds range from 0.2-2 Mb/s, although occasionally they are significantly faster than that, or even stop completely for short periods of time. I've also noticed that when first booting into Ubuntu, speeds start fairly fast and then drop to incredibly slow within a few seconds to a few minutes. There's still some possibility that the issue is with my ISP, as things seem slower than usual even in Windows, but I suspect it is related to Ubuntu, as things are far slower in Ubuntu than in Windows. I'm wondering, what could be the cause of this?

    Potentially relevant information: I've dual booted before on this machine with earlier versions of Ubuntu (different ISP at the time) with no problem. ISP: Rogers (a major Canadian ISP).

    System info (Gateway NV53a laptop):
    Operating System: MS Windows 7 Home Premium 64-bit
    CPU: AMD Phenom II N970, Caspian, 45nm technology
    RAM: 6.00 GB Dual-Channel DDR3 @ 664MHz (9-9-9-24)
    Motherboard: Gateway SJV51_DN (Socket S1G4)
    Graphics: Generic PnP Monitor (1366x768@60Hz), ATI Mobility Radeon HD 4250 (Acer Incorporated [ALI])
    Hard drive: 733GB TOSHIBA MK7559GSXP ATA Device (SATA)
    Networking: connected through Wi-Fi, Atheros AR5B97 Wireless Network A
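    For anyone hitting the same symptoms, a short diagnostic sketch for the wireless side (the interface name wlan0 is an assumption; the AR5B97 is normally driven by ath9k):

        lspci -nnk | grep -iA3 network       # confirm which kernel driver claims the Atheros card
        iwconfig wlan0                       # check link quality and whether power management is enabled
        sudo iwconfig wlan0 power off        # temporarily disable wireless power saving to rule it out
        dmesg | grep -i ath9k                # look for driver errors or resets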

    Read the article

  • How to Print From Metro Apps in Windows 8

    - by Taylor Gibb
    Printing has become an application-aware feature in Metro applications. This makes the outcome of a print job differ from application to application, but the question remains: how do you print?

    Using the keyboard: not all apps support printing in Windows 8; a good example of one that does is Mail. So fire up the default Mail app and select an email you want to print. When you are ready, press the Ctrl + P keyboard combination. This brings up a list of available print devices on the right-hand side, and you can use the up and down arrows to select a printer. You will get most of the options you are used to when printing, so once you have set up your preferences, go ahead and hit the print button.

    Using the mouse: if you would rather use your mouse, move it to the bottom right-hand corner of your screen to bring up the Charms bar, then click on the Devices charm. This will list your printers as well as other devices, so make sure you select a printer. That's all there is to it.

    Read the article

  • Price Drop for Processor based License on Exalytics

    - by Mike.Hallett(at)Oracle-BI&EPM
    · A 33% reduction in the list "per processor" license pricing for the Oracle BI Foundation Suite.
    · New capacity-based licensing which allows customers to think big and start small, significantly lowering the entry price point for an Exalytics.

    Oracle BI software list price changes: in response to new powerful platforms like the in-memory Oracle Exalytics with 40 CPU cores (counted under Oracle pricing policy as 20 "processors"), the list price of the Oracle BI Foundation Suite (BIFS) is reduced by 33%, from $450K per processor to $300K per processor.

    Capacity-based licensing on Exalytics (Trusted Partitions): capacity-based pricing for the BIFS, Endeca, Essbase and TimesTen for Exalytics software is now available for Exalytics systems. This is delivered using Oracle VM (OVM). Oracle still ships a full Exalytics machine to all customers, but they may choose to use and license only a subset of the processors installed in the machine. Customers can license Exalytics software in units of 5 "processors": 5, 10, 15 or the full capacity of 20. As the customer's implementation and workload grow, it is a simple matter to license additional processors and, using OVM, make them available to the BI or EPM application.

    Endeca Information Discovery now available on Exalytics: Oracle has also announced the certification of Oracle Endeca Information Discovery (EID) on the Exalytics machine. EID can be licensed alone or in combination with the BIFS and TimesTen for an Exalytics stack, and it also participates in the capacity-based pricing outlined above. The Exalytics hardware is the perfect platform for EID, and provides superb power and performance for this in-memory hybrid of text search and analytics.

    For more information: the Oracle price lists, the Oracle Partitioning Policy, and the discussion by Mark Rittman (Rittman Mead Consulting Ltd.) on Oracle Trusted Partitions for Oracle Engineered Systems, Oracle Exalytics and updated BI Foundation pricing.

    Read the article

  • Partner Training for Oracle Business Intelligence Applications 4-Day Bootcamp

    - by Mike.Hallett(at)Oracle-BI&EPM
    Partners: 4-day training from 15th to 18th October 2012, at Oracle Reading (UK). The Oracle Business Intelligence Applications provide pre-built operational BI solutions for eBusiness Suite, PeopleSoft, Siebel, JDE and SAP, offering out-of-the-box integration. This FREE-for-partners 4-day training will give attendees an in-depth working understanding of the architecture and the technical and functional content of the Oracle Business Intelligence Applications, while also providing an understanding of their installation, configuration and extension.

    The course will cover the following topics:
    - Overview of Oracle Business Intelligence Applications
    - Oracle BI Applications fundamentals and features
    - Configuring BI Applications for Oracle E-Business Suite
    - Understanding BI Applications architecture
    - Fundamentals of BI Applications security

    REGISTER HERE NOW (acceptance is subject to availability and your place will be confirmed within two weeks; for help see the Partner Registration Guide).

    Location: Bray Room, Oracle Corporation UK Ltd, Oracle Parkway, Thames Valley Park, Reading, Berkshire RG6 1RA. 15th to 18th October 2012, 4 days, 9:30 am to 5:00 pm BST.

    Audience: the seminar is aimed at BI consultants and implementation consultants within Oracle's Gold and Platinum partners, with:
    - a good understanding of basic data warehousing concepts;
    - hands-on experience in Oracle Business Intelligence Enterprise Edition;
    - hands-on experience in Informatica;
    - some understanding of Oracle BI Applications (see Sales & Technical Tutorials for OBI, BI-Apps and Hyperion EPM);
    - a good understanding of any of the following Oracle EBS modules: General Ledger, Accounts Receivables, Accounts Payables.

    Please note that attendees are required to bring a laptop:
    - 4 GB RAM
    - 64-bit Windows
    - 80 GB free space on the hard drive or an external device
    - CPU: Core 2 Duo or higher
    - Windows 7, Windows XP or Windows 2003 (Windows Vista is NOT allowed)
    - an administrator user

    For more information please contact [email protected].

    Read the article

  • How do I find out which process is eating up my bandwidth?

    - by Bruce Connor
    I think I'm being the victim of a bug here. Sometimes while I'm working (I still don't know why), my network traffic goes up to 200 KB/s and stays that way, even though I'm not doing anything internet-related. This sometimes happens to me with CPU usage too; when it does, I just run top to find out which process is responsible and then kill it. The problem is that I have no way of knowing which process is responsible for my high network usage. Both the resource monitor and top only show my total network usage; neither gives me process-specific network info. Is there another command I can use to find out which process is getting out of hand? I've already tried killing all the obvious ones (firefox, update-manager, pidgin, etc.) with no luck. So far, restarting the machine is the only way I've found of getting rid of the issue.

    EDIT (just to be clear): I've found questions here about monitoring total bandwidth usage, but, as I mentioned, that's not what I need.

    UPDATE: The iftop command gives results that disagree entirely with the information reported by System Monitor: while the latter claims there's high network traffic, the former claims there's barely 1 KB/s. Thanks
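    A minimal sketch of getting per-process bandwidth figures with nethogs (the interface name is an assumption; substitute eth0, wlan0, etc. as appropriate):

        sudo apt-get install nethogs
        sudo nethogs wlan0                     # live table of sent/received KB/s per process
        # alternatively, map established connections to their owning PIDs:
        sudo netstat -tupn | grep ESTABLISHED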

    Read the article

  • Investigating on xVelocity (VertiPaq) column size

    - by Marco Russo (SQLBI)
    In January I published an article about how to optimize high-cardinality columns in VertiPaq. In the meantime, VertiPaq has been rebranded to xVelocity: the official name is now "xVelocity in-memory analytics engine (VertiPaq)", but using xVelocity or VertiPaq when we talk about Analysis Services has the same meaning. In this post I'll show how to investigate the column sizes of an existing Tabular database so that you can find the most important columns to optimize.

    A first approach can be to look in the DataDir of Analysis Services for the folder containing the database, then look for the biggest files in all subfolders: you will find the name of a file that contains the name of the most expensive column. However, this heuristic process is not very optimized. A better approach is to use a DMV that provides the exact information. For example, by using the following query (open SSMS, open an MDX query on the database you are interested in, and execute it) you will see all database objects sorted by used size in descending order:

        SELECT * FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS ORDER BY used_size DESC

    You can look at the first rows to understand which are the most expensive columns in your tabular model. The interesting data provided are:
    - TABLE_ID: the name of the object (it can also be a dictionary or an index)
    - COLUMN_ID: the column the object belongs to (you may also see ID_TO_POS and POS_TO_ID in case they refer to internal indexes)
    - RECORDS_COUNT: the number of rows in the column
    - USED_SIZE: the memory used by the object

    By looking at the ratio between USED_SIZE and RECORDS_COUNT you can understand what you can do to optimize your tabular model. Your options are:
    - Remove the column. Yes, if it contains data you will never use in a query, simply remove the column from the tabular model.
    - Change granularity. If you are tracking time and you included milliseconds but seconds would be enough, round the data source column to the nearest second. If you have a floating-point number but two decimals are good enough (e.g. a temperature), round the number to the nearest decimal that is relevant to you.
    - Split the column. Create two or more columns that have to be combined together in order to produce the original value. This technique is described in the VertiPaq optimization article.
    - Sort the table by that column. When you read the data source, you might consider sorting data by this column, so that the compression will be more efficient. However, this technique works better on columns that don't have too many distinct values, and you will probably move the problem to another column. Sorting data starting from the lower-density columns (those with few distinct values) and going to higher-density columns (those with high cardinality) is the technique that provides the best compression ratio.

    After the optimization you should be able to reduce the used size and improve the count/size ratio you measured before. If you are interested in a longer discussion about internal storage in VertiPaq and want to understand why this approach can save you space (and time), you can attend my 24 Hours of PASS session "VertiPaq Under the Hood" on March 21 at 08:00 GMT.

    Read the article

  • Why does video playback lag/freeze when I go into full-screen mode?

    - by RanRag
    When I try to play my video files in SMPlayer it works fine, but as soon as I switch to fullscreen mode (16:9) the following things happen:
    1) Video starts lagging.
    2) Audio and video go out of sync.
    3) CPU usage rises to ~50%.
    4) SMPlayer starts to hang.

    My current SMPlayer configuration:
    1) Video output driver = x11 (slow)
    2) Audio output driver = alsa (0.0-HDA Intel)
    3) Cache = 8192 KB
    4) Threads for decoding (MPEG-1/2 and H.264 only) = 2

    Things I tried to solve this problem:
    1) Changing the video output driver to xv or gl.
    2) Changing the audio output driver to pulse.
    3) Increasing the cache size, and also trying nocache.

    Everything works fine on Windows, but I don't want to switch to Windows just to play video files.

    My system config: Acer Aspire One D270, Atom N2600 (Cedar Trail) 1.6GHz, 2GB memory, Intel GMA 3600 graphics, Ubuntu 12.04, kernel release 3.2.0-23-generic-pae.

    Everything else is working fine: I have no resolution issue, and bluetooth and wireless also work. Just ask me to submit any other log file and I will be happy to post it: SMPlayer log, MPlayer terminal output, codec information (currently playing file).
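    To isolate whether the chosen output driver is the bottleneck, one quick check is to run mplayer from a terminal against the same file with different -vo drivers and watch the console for dropped frames (the file name is a placeholder; the vaapi output only exists in VA-API-enabled mplayer builds):

        mplayer -vo help                           # list the output drivers this build supports
        mplayer -vo xv -framedrop somefile.mkv     # XVideo driver with frame dropping allowed
        mplayer -vo gl -framedrop somefile.mkv     # OpenGL driver
        mplayer -vo vaapi -framedrop somefile.mkv  # only with a VA-API capable build and driver stack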

    Read the article

  • Intel 5100 AGN disconnects and then disables my Verizon FiOS Actiontec router

    - by Anthony
    I am a new user of Ubuntu (this is my first trial) and booted my Windows Vista laptop up with the Ubuntu CD. I was able to connect wirelessly to my router and stay connected for about a minute. After that it would disconnect and my router would be disabled: none of my other wireless devices would see the signal any longer, as if it had disappeared. I would have to cycle the router's power toggle off and on to get it to come back and put out a signal again. This happened repeatedly. I experienced no other problems trying the software (i.e., I accessed my files without issue). I did not attempt to connect with an Ethernet cable.

    Here are the specs on my system:
    - Laptop: HP Pavilion dv5 Notebook PC
    - System type: 64-bit operating system
    - CPU: Intel Core 2 Duo P8600 2.4GHz
    - RAM: 4 GB
    - Network card: Intel WiFi Link 5100 AGN

    I read elsewhere in this forum that 64-bit systems may not be compatible with Ubuntu. Can anyone help me with this? I'd really like to be able to use this operating system.
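    One workaround often suggested for Intel 5100-series cards dropping connections is to disable 802.11n in the driver; a sketch for this era's iwlagn module (the file name is arbitrary, and in a live-CD session the change only lasts until reboot):

        echo "options iwlagn 11n_disable=1" | sudo tee /etc/modprobe.d/intel-5100.conf
        sudo modprobe -r iwlagn && sudo modprobe iwlagn   # reload the driver with 802.11n disabled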

    Read the article

  • How can I set my resolution to 1280x1024 on an Acer Aspire Revo 3700?

    - by torbengb
    I've just set up a new nettop computer (Acer Aspire Revo 3700: CPU: Atom D525, GPU: Nvidia ION2) and made a clean install of Ubuntu 10.10 using the standard USB pendrive method. Almost everything works OK, but the graphics do not: the recommended Nvidia driver is activated, but the monitor is not detected, so the resolution is wrong. How can I make Ubuntu detect my monitor, and how can I get the proper resolution (1280x1024)? My monitor is not a CRT but an LCD: a BenQ model T905, with 1280x1024 resolution at 60Hz, connected via a normal VGA cable. DVI or HDMI is not an option.

    When I go to System > Preferences > Monitors, I get: "It appears that your graphics driver does not support the necessary extensions to use this tool. Do you want to use your graphics driver vendor's tool instead? YES / NO". Whichever I answer, the window I get does not let me fix the problem. The main reason for getting this new computer was that I was sick of having graphics problems on the old one, with a very ugly workaround that didn't give me hardware support, but at least I got the resolution. Why is this so difficult... sigh!
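    When a VGA-attached monitor is not detected, the generic route is to add the mode by hand with cvt and xrandr; a sketch (the output name VGA-0 is an assumption, check xrandr -q, and with the proprietary NVIDIA driver the equivalent is usually done through nvidia-settings or an xorg.conf modeline instead):

        cvt 1280 1024 60       # prints a modeline for 1280x1024 at 60 Hz
        xrandr --newmode "1280x1024_60.00" 109.00 1280 1368 1496 1712 1024 1027 1034 1063 -hsync +vsync
        xrandr --addmode VGA-0 "1280x1024_60.00"
        xrandr --output VGA-0 --mode "1280x1024_60.00"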

    Read the article

  • HP Pavilion 15 with AMD dual graphics - Ubuntu live environment not starting

    - by creepus
    I've had this laptop for about a day now and have decided to try Ubuntu on it and determine whether I want to install it. I created a USB stick and it booted (Secure Boot was on; I also tried with Secure Boot off, to no effect), and then the problem occurred. The screen turned off for a second, turned back on to a black screen, shut off again and turned back on with a dialogue box telling me that the system had to use low-graphics mode. I clicked OK, selected low-graphics mode from the menu and clicked OK. The screen switched to the boot messages and did not go any further, although Ctrl+Alt+Del did reboot the laptop.

    I tried booting again, this time editing the boot options in GRUB to add nomodeset, but the laptop only booted to a black screen. Ctrl+Alt+F2 took me to a prompt; I tried startx from there, but X didn't start, complaining that it wanted kernel mode setting back. I cannot find any option in the UEFI setup menus to disable one graphics chip or the other.

    Laptop: HP Pavilion 15-E004AU
    CPU: AMD A6-4400M APU with Radeon(tm) HD Graphics
    Graphics chip: AMD Radeon HD 7520G + 8670M Dual Graphics
    Ubuntu version: 13.10, 64-bit
    Thanks.

    EDIT: I tried 12.04.3 LTS, and it managed to bring the desktop up, although there are severe graphics glitches after about two minutes.

    Read the article

  • Using Google App Engine to Perform World Updates vs an Authoritative Server

    - by Error 454
    I am considering different game server architectures that use GAE. The types of games I am considering are turn-based, where the world state would need to be updated about once per minute. I am looking for an answer that persuades me to either perform the world update on the Google servers OR on an authoritative server that syncs with the datastore. The main goal here is to minimize GAE daily quotas. For some rough numbers, I am assuming 10,000 entities requiring updates. Each entity update would require:
    - reading 5 private entity variables (fetched from the datastore);
    - fetching as many as 20 static variables (from the datastore, or persisted in server memory);
    - writing 5 entity variables.
    Clients of the game would authenticate and set state directly against GAE, as well as pull the latest world state from GAE. Running the update on GAE would consist of a cron job launched every minute, which would update all of the entities and save the results to the datastore; this would be more CPU-intensive for GAE. Running the update on an authoritative server would consist of fetching entity data from the GAE datastore, calculating the new entity states and pushing the new state variables back to the datastore; this would be more bandwidth-intensive for the datastore.
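    For the GAE-side variant, the per-minute trigger is just an App Engine cron entry; a minimal sketch of the configuration (the handler URL is a placeholder):

        # hypothetical cron.yaml: App Engine calls the update handler once a minute
        {
          echo 'cron:'
          echo '- description: per-minute world update'
          echo '  url: /tasks/update_world'
          echo '  schedule: every 1 minutes'
        } > cron.yaml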

    Read the article

  • How to document requirements for an API systematically?

    - by Heinrich
    I am currently working on a project where I have to analyze the requirements that two given IT systems, which use cloud computing, place on a cloud API. In other words, I have to analyze what requirements these systems have for a cloud API, such that they could switch to it while still being able to accomplish their current goals. Let me give you an example of some informal requirements of Project A:
    - When starting virtual machines in the cloud through the API, it must be possible to specify the memory size, CPU type, operating system and an SSH key for the root user.
    - It must be possible to monitor the inbound and outbound network traffic per hour per virtual machine.
    - The API must support the assignment of public IPs to a virtual machine and the retrieval of those public IPs.
    - ...
    In a later stage of the project I will analyze some cloud computing standards that standardize cloud APIs, to find out where possible shortcomings in the current standards are. A finding could, and probably will, be that a certain standard does not support monitoring resource usage and is thus not currently usable. I am now trying to find a way to systematically write down and classify my requirements. I feel that the way I currently have them written down (like the three points above) is too informal. I have read a couple of requirements engineering and software architecture books, but they all focus too much on details and implementation. I really only care about the functionality provided through the API/interface, and I don't think UML diagrams etc. are the right choice for me. I think the requirements I have collected so far can be described as user stories, but is that already enough for a sophisticated requirements analysis? Probably I should go "one level deeper"... Any advice or learning resources for me?

    Read the article

  • Javascript Canvas Drawing Efficiency

    - by jujumbura
    I have just recently started experimenting with game development in Javascript/HTML5, and so far it has been going pretty well. I have a simple test scene running with some basic input handling and a hundred-ish drawImage() calls with a few transforms. This all runs great in Chrome, but unfortunately it already chugs in Firefox. I am using a very large canvas (1920 x 1080), but it doesn't seem like I should be hitting my limit already. So on that note, I was hoping to ask a few questions:
    1) What exactly is done on the CPU vs. the GPU in terms of canvas and drawImage()? I'm afraid the answer is probably "it depends on the browser", but can anybody give me some rules of thumb? I naively imagined that each drawImage call results in a textured quad on the GPU, with the canvas effectively being a render target, but I'm wondering if I'm pretty far off base there...
    2) I have seen posts here and there with people saying not to use the translate(), rotate() and scale() functions when drawing on the canvas. Am I adding a lot of overhead just by adding a translate() call, as opposed to passing the x,y to drawImage()? Some people suggest using "translate3d", etc., which are CSS properties, but I'm not sure how to use them within a scene. Can they be used for animated sprites within a single canvas?
    3) I have also seen a lot of posts with people mentioning that pre-building canvases and then re-using them is a lot faster than issuing all the individual draw calls again. I am guessing that my background should definitely be pre-built into a canvas, but how far should I take this? Should I maintain an individual canvas for each sprite, to cache all static image data when not animating?
    Thank you much for your advice!

    Read the article

  • Ubuntu 12.04 server froze during the first boot after it was installed.

    - by user69021
    I installed Ubuntu Server 12.04 on my new server and it failed on the first boot: it just stopped and I can't proceed any further.

    Server specifications:
    - Dell PowerEdge T620
    - CPU: Xeon E5-2665 2.4GHz x 2
    - RAM: 8GB RDIMM, 1333MHz x 12
    - HDD: 3TB Near Line SAS 7.2K x 8
    - RAID controller: PERC H710
    - GPU: NVIDIA Tesla C2075 x 4

    I have a screenshot of the screen it stopped on, but I cannot attach it because my privilege level is currently too low. Here are the last messages while booting:

        [5.048743] Freeing unused kernel memory : 920k freed
        [5.049046] Write protecting the kernel read-only data : 12288k
        [5.052973] Freeing unused kernel memory : 1608k freed
        [5.056132] Freeing unused kernel memory : 1196k freed
        Loading, please wait...
        [5.070236] udevd[218]: starting version 175
        Begin: Loading essential drivers ... done.
        Begin: Running /scripts/init-premount ... done.
        Begin: Mounting root file system ... Begin: Running /scripts/local-top ... done.
        [5.089030] megasas: 00.00.06.12-rc1 Wed. Oct. 5 17:00:00 PDT 2011
        [5.089518] megasas: 0x1000:0x005b:0x1028:0x1f35: bus 1:slot 0:func 0
        [5.089739] megaraid_sas 0000:01:00.0: PCI INT A -> GSI 34 (level, low) -> IRQ 34
        [5.089937] megasas: FW now in Ready state
        [5.090427] dca service started, version 1.12.1
        [5.091463] Intel(R) Gigabit Ethernet Network Driver - version 3.2.10-k
        [5.091578] Copyright (c) 2007-2011 Intel Corporation.
        [5.091712] igb 0000:06:00.0: PCI INT A -> GSI 16 (level,low) -> IRQ 16
        [5.111090] megasas:IOC Init cmd success
        [5.123124] usb 1-1:new high-speed USB device number 2 using ehci_hcd

    What can I do about this?
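    One commonly suggested ruling-out step on GPU-heavy servers like this, assuming the machine can still reach recovery mode, is to keep the nouveau driver from loading on the Tesla cards before the next boot (a diagnostic sketch, not a confirmed fix for this particular hang):

        echo "blacklist nouveau" | sudo tee /etc/modprobe.d/blacklist-nouveau.conf
        sudo update-initramfs -u      # rebuild the initramfs so the blacklist applies at early boot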

    Read the article

  • Docker vs ESXi for Startup Projects - Deploying Code for Dev Testing

    - by JasonG
    Why hello there little programmer dude! I have a question for you and all of your experience and knowledge. I have an ESXi whitebox that I built, an 8 dude that sits in the corner. I made a mistake recently and took the key that had ESXi on it, formatted it and used it for something else. No big deal, because the last project I worked on had stalled out. I'm about to pick up another project and now I need to spin up a whole bunch of stuff for CI, QA plus a database, a ticket tracker, wikis, etc. I've been hearing a lot about Docker recently, and as this is just a consumer-grade machine, I'm wondering if it may make more sense to run Docker on CoreOS and put everything there: Bamboo or Hudson, JIRA, Confluence, Postgres for the tools to use, then a QA environment. I can't really find any documents that directly compare traditional VM infrastructure with Docker solutions, and I'm wondering if it is fair to compare them. Is there any reason why CoreOS with containers would be a strictly worse solution? Or do you have any insight into why I may want to stick with ESXi? I've looked on multiple occasions and can't find a good reason not to switch. I'm not going to run a production environment on the server, so I don't need HA for things like security or OS updates, where ESXi would let me restart one VM at a time; I can just shut the whole thing down and bring it back up if I need a reboot. So what's up with this container stuff? Is it a fair replacement for ESXi? I'm guessing the Atlassian products would run much better and my RAM would go a lot farther using Docker. The CPU would probably run much cooler too, and my expensive HDD space would be better utilized.
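    For a feel of the container side of that comparison, spinning up a couple of the supporting services under Docker is a one-liner each; a sketch using stock Docker Hub images (image names, ports and the password are illustrative):

        docker run -d --name qa-db -e POSTGRES_PASSWORD=secret -p 5432:5432 postgres   # database for the tools
        docker run -d --name ci -p 8080:8080 -p 50000:50000 jenkins                    # CI server instead of a full VM
        docker ps                                  # both run as plain processes, with no hypervisor layer in between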

    Read the article

  • Can't run minecraft on ubuntu 12.04 lts [duplicate]

    - by user170011
    This question already has an answer here: How to correctly install and troubleshoot Minecraft (Client) (3 answers)

    I was trying to run Minecraft on my laptop with Ubuntu 12.04 LTS 64-bit. I have a Lenovo IdeaPad P580 with 7.7 GB of RAM and an Intel® Core™ i7-3520M CPU @ 2.90GHz × 4 processor. Under the graphics section of the system overview in Ubuntu it says I have none installed. My computer comes with an NVIDIA GeForce graphics card, but it isn't recognized. When I start Minecraft I get this crash report:

        ---- Minecraft Crash Report ----
        // Shall we play a game?

        Time: 24/06/13 7:23 PM
        Description: Failed to start game

        org.lwjgl.LWJGLException: Could not init GLX
            at org.lwjgl.opengl.LinuxDisplayPeerInfo.initDefaultPeerInfo(Native Method)
            at org.lwjgl.opengl.LinuxDisplayPeerInfo.<init>(LinuxDisplayPeerInfo.java:52)
            at org.lwjgl.opengl.LinuxDisplay.createPeerInfo(LinuxDisplay.java:684)
            at org.lwjgl.opengl.Display.create(Display.java:854)
            at org.lwjgl.opengl.Display.create(Display.java:784)
            at org.lwjgl.opengl.Display.create(Display.java:765)
            at net.minecraft.client.Minecraft.a(SourceFile:235)
            at avv.a(SourceFile:56)
            at net.minecraft.client.Minecraft.run(SourceFile:507)
            at java.lang.Thread.run(Thread.java:679)

        A detailed walkthrough of the error, its code path and all known details is as follows:

        -- System Details --
        Details:
        Minecraft Version: 1.5.2
        Operating System: Linux (amd64) version 3.5.0-34-generic
        Java Version: 1.6.0_27, Sun Microsystems Inc.
        Java VM Version: OpenJDK 64-Bit Server VM (mixed mode), Sun Microsystems Inc.
        Memory: 406175448 bytes (387 MB) / 514523136 bytes (490 MB) up to 1908932608 bytes (1820 MB)
        JVM Flags: 2 total; -Xmx2048M -Xms512M
        AABB Pool Size: 0 (0 bytes; 0 MB) allocated, 0 (0 bytes; 0 MB) used
        Suspicious classes: No suspicious classes found.
        IntCache: cache: 0, tcache: 0, allocated: 0, tallocated: 0
        LWJGL: 2.4.2
        OpenGL: ~~ERROR~~ NullPointerException: null
        Is Modded: Probably not. Jar signature remains and client brand is untouched.
        Type: Client (map_client.txt)
        Texture Pack: Default
        Profiler Position: N/A (disabled)
        Vec3 Pool Size: ~~ERROR~~ NullPointerException: null

    I can run it on different versions of Linux such as Fedora.
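    The "Could not init GLX" error generally means there is no working 3D driver; a short diagnostic/repair sketch for an Optimus (Intel + NVIDIA) laptop on 12.04 (package and PPA names are the ones commonly used at the time, and the jar path is a placeholder):

        sudo apt-get install mesa-utils
        glxinfo | grep -i "direct rendering"      # needs to say "Yes" for Minecraft/LWJGL
        lspci | grep -i vga                       # confirm the Intel HD 4000 + GeForce combination
        sudo add-apt-repository ppa:bumblebee/stable && sudo apt-get update
        sudo apt-get install bumblebee bumblebee-nvidia
        optirun java -jar ~/minecraft.jar         # run the game on the discrete NVIDIA GPU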

    Read the article

  • Ubuntu 12.04 Frequent Kernel Panics

    - by Jordan Johns
    I'm dealing with what seems to be quantum behavior here: I'm having frequent kernel panics and the problem is very hard to locate. To cut to the chase, I get a kernel panic screen that always appears after login (the login screen is fine, and I can go into my TTYs and everything), but once I log in I have anywhere from 0 (instant) to 10 seconds before everything grinds to a halt and freezes. The only way I found out that it was a kernel panic was going into a TTY while it happens.

    At first I figured something must be wrong with memory, but I ran the full standard memtest86 test and got no errors. A CPU problem? Windows works fine, and I ran prime95 and a bunch of other torture tests for many hours. A drive problem? Nope: fsck reports no problems and the SMART status is OK. An overclocking problem? Nope: Windows is working fine, and I also reset the BIOS to prove the point that it was not the cause.

    I'm at a loss here. Even trying to reinstall gives me the same problem when attempting to install Ubuntu (right after I select "Install Ubuntu" or "Try Ubuntu before installing"). Anyone have any thoughts? It's very bizarre; it's as if my system hates Linux and is trying to tell me not to use it (that would be a terrible reality). Thanks!
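    If the panic trace makes it to disk before the freeze, it can be pulled out of the kernel log after a reboot; a small sketch (with a hard panic the log often never gets written, so this may come up empty):

        dmesg | tail -n 50                                 # messages from the current boot
        grep -iA20 "panic\|oops\|BUG" /var/log/kern.log*   # look for a trace from the previous boot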

    Read the article

  • invitation: Oracle Endeca Information Discovery Bootcamp

    - by mseika
    The Oracle Endeca Information Discovery (OEID) Boot Camp is designed to give partners an understanding of OEID's features and how it complements the existing Oracle Business Intelligence suite. Participants will learn how to develop and implement solutions using a data discovery method. Training is in English.

    What will be covered? The OEID Boot Camp is a three-day class with a combination of lecture and hands-on exercises, tailored to make participants aware of the Oracle Endeca Information Discovery platform and to help them gain valuable skills for the implementation of projects. The course follows a combination of lectures and hands-on lab sessions, allowing participants to apply the knowledge they have gained by extracting from sample data sources and creating an end-user application that will be used to answer several business questions.

    What you will learn:
    - Architecture: OEID components, use of graphs, overview of clustering
    - OEID installation: architecture planning, infrastructure requirements, installation process, production hints & tips
    - OEID administration: data store management, administrative operations, portal configuration, data sources, system monitoring
    - Indexing: Integration Suite, data source analysis, graph (ETL) creation, record design techniques
    - Portlets: Studio portlets, custom portlet development, querying functions
    - Reporting: Studio applications & best practices, visualizations, EQL

    Prerequisites: you must bring a laptop with you for the hands-on labs.

    Environment / laptop requirements: participants will perform the hands-on lab exercises using a virtual machine image. These virtual machines will be provided within a cloud environment, requiring participants to bring a laptop that can access a Windows server using Microsoft RDP. Participants will not need to install any software onto their laptops, but must ensure that they have the proper software installed for their OS to connect through RDP to a server.
    Hardware:
    - CPU: dual-core, x64, 1.8GHz or higher
    - RAM: 2GB
    Software:
    - Microsoft Remote Desktop Client
    - Internet Explorer 7, Firefox, or Google Chrome

    This boot camp is intended for prospective implementers of Oracle Endeca Information Discovery (OEID), or those in a presales role looking to gain insight into the technical benefits of this new package. Attendees should have experience and familiarity with the basic concepts of business intelligence.

    Where and when? Monday, October 15th until Wednesday, October 17th included, 9:00 - 18:00, Oracle France, 15, boulevard Charles de Gaulle, 92715 Colombes. Register here. Limited number of seats!

    Read the article
