Search Results

Search found 7129 results on 286 pages for 'battery usage'.


  • How to maximize Java resources for gaming?

    - by Keidax
    I notice that when I play Minecraft, my CPU usage is only around 15 to 20 percent at most. Is it possible to force my computer to allocate as much RAM, CPU time, etc. to the game as possible, in order to get the best experience? Or is the low CPU usage due to limitations in the JVM or the game code itself? I have OptiFine installed, and I've already tried using renice to change the process priority and -Xmx to allocate more memory to Java. What else can I do?
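
    For reference, a hedged sketch of the usual knobs: a fixed heap and a higher scheduling priority. The jar name and heap sizes here are illustrative; the real launcher may wrap the java invocation differently:

        # Pin the heap so the JVM never pauses to grow it (sizes are examples):
        java -Xms4G -Xmx4G -jar minecraft.jar

        # Raise priority of an already-running JVM (lower nice = higher priority):
        sudo renice -n -5 -p $(pidof java)

    Worth noting: if the game's main loop is effectively single-threaded, it can only saturate one core, so 15 to 20 percent of total CPU on a multi-core machine may simply be one core running flat out. In that case no amount of OS-side tuning will raise the number.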

    Read the article

  • Reasons for either 32-bit or 64-bit as development machine

    - by vartec
    I'm about to do a new Linux install, which will be used primarily for programming. I've seen benchmarks showing speed improvements for the 64-bit version; however, I have a hard time telling how much those benchmarks translate into improvement in everyday usage. And of course there are other aspects to consider. The usage I have in mind: mainly programming Python, with occasional C, C++ and Java; IDEs that run on the Java platform (Eclipse and IntelliJ); on very rare occasions, compiling for the 32-bit platform; not planning to have more than 64GB of RAM anytime soon (and I don't mind using PAE kernels); the machine in question has 4GB RAM and an Athlon II X2. What are the pros and cons of choosing either an i386 or an x86_64 distro?
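
    A couple of quick checks (standard on any mainstream distro) can anchor the comparison on the hardware at hand:

        # Word size of the currently running userland:
        getconf LONG_BIT     # prints 32 or 64
        uname -m             # i686 vs x86_64

        # Does the CPU support PAE (required for >4GB on a 32-bit kernel)?
        grep -ow pae /proc/cpuinfo | head -1

    With 4GB of RAM, a PAE-enabled 32-bit kernel or a 64-bit kernel can each address all of it, so the decision comes down to the software trade-offs above rather than memory limits.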

    Read the article

  • Dealing with coworkers when developing, need advice [closed]

    - by Yippie-Kai-Yay
    I developed our current project architecture and started developing it on my own (reaching something like revision 40). We're developing a simple subway routing framework, and my design seemed to be done extremely well: several main models, corresponding views, the main logic and data structures were modeled "as they should be" and fully separated from rendering; the algorithmic part was also implemented apart from the main models and had a minor number of intersection points. I would call that design scalable, customizable, easy to implement, interacting mostly through "black box" interfaces and, well, very nice. Now, what was done: I started some implementations of the corresponding interfaces, ported some convenient libraries and wrote implementation stubs for some application parts. I had a document describing the coding style and examples of that coding style's usage (my own written code). I enforced the usage of more or less modern C++ development techniques, including no-delete code (wrapped in smart pointers), etc. I documented the purpose of concrete interface implementations and how they should be used. I wrote unit tests (mostly integration tests, because there wasn't a lot of "actual" code) and a set of mocks for all the core abstractions. Then I was absent for 12 days. What do we have now (the project was developed by 4 other members of the team): 3 different coding styles all over the project (I guess two of them agreed to use the same style); the same applies to the naming of our abstractions (e.g. CommonPathData.h, SubwaySchemeStructures.h), which are basically headers declaring some data structures. An absolute lack of documentation for the recently implemented parts. What I could recently call a single-purpose abstraction now handles at least 2 different types of events, has tight coupling with other parts, and so on. Half of the used interfaces now contain member variables (sic!). Raw pointer usage almost everywhere. Unit tests disabled, because "(Rev. 57) They are unnecessary for this project". ... (and that's probably not everything). The commit history shows that my design was interpreted as overkill, and people started combining it with their own reinvented wheels, and then had problems integrating the resulting code chunks. Now the project still does only a small amount of what it has to do, we have severe integration problems, and I suspect some memory leaks. Is there anything I can do in this case? I realize that all my efforts brought no benefit, but the deadline is pretty soon and we have to do something. Has anyone had a similar situation? Basically, I thought that a good start for the project (well, I did everything that I could) would probably lead to something nice; however, I understand that I was wrong. Any advice would be appreciated, and sorry for my bad English.

    Read the article

  • Nominations for Oracle's Eco-Enterprise Innovation Awards - Due July 17, 2012

    - by swalker
    Are you working with a customer that is using any of Oracle's products to reduce their environmental footprint while improving their operational efficiency? Reducing energy usage? Reducing gas usage? Going paperless? Both you and your customer may be eligible for Oracle's Eco-Enterprise Innovation Award, part of the Oracle Excellence awards. Get more details and submit a nomination form here by July 17. These awards will be presented during Oracle OpenWorld by Jeff Henley, Oracle Chairman of the Board, in a special conference session. Winning customers will receive a free Oracle OpenWorld registration pass.

    Read the article

  • Frequent GUI pauses in Ubuntu 13.04 / Unity / Intel HD4000

    - by Simon
    I'm experiencing very frequent (and regular) GUI pauses on my system. Every 30 seconds (pretty much exactly) the GUI will freeze for maybe 0.25 to 0.5 seconds. The mouse stops moving, keys stop echoing, and a stopwatch timer briefly pauses. I'm using the Intel graphics driver available from: https://download.01.org/gfx/ubuntu/13.04/main I've looked in a few places and tried a few things for a solution: I've checked cron and anacron for scheduled processes. I've disabled background processes (e.g. mysql, postgres, apache), not that these were doing anything anyway. I've checked the following posts and tried the suggestions there: Unity GUI pauses/freezes for less than a few seconds, How to go about troubleshooting frequent system pauses. I've watched the system using top and System Monitor, and there are no spikes (or even blips) of CPU usage when the pauses occur. There are no obvious error messages in dmesg or syslog. There is loads of free RAM (8GB+) and no swap usage. If it helps, it's a ZooStorm i5 laptop with an HD4000 GPU, 16GB RAM and an SSD. Any help / suggestions would be very gratefully received.
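
    One way to catch a strictly periodic culprit is to log the system faster than the pause recurs, then inspect the samples on either side of a freeze. A sketch using standard tools:

        # Sample the full process table twice a second for a minute (batch mode):
        top -b -d 0.5 -n 120 > /tmp/top.log

        # In parallel, watch for interrupt/context-switch spikes (in/cs columns):
        vmstat 1 60 > /tmp/vmstat.log

    If nothing stands out in either log at the 30-second marks, the stall is more likely inside the GPU driver than in a userspace process.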

    Read the article

  • Issues with LVM partition size in Server 13.04

    - by Michael
    I am new to Ubuntu and a little confused about how hard drive partitions and LVM work. I remember setting up Ubuntu Server 13.04 and telling it to use 1TB of a 3TB server. Well, I have maxed that out with Blu-ray rips and want the rest of the drive for space. On log-in it says:

        System load:  2.24               Processes:           179
        Usage of /:   88.7% of 912.89GB  Users logged in:     0
        Memory usage: 6%                 IP address for p5p1: 192.168.0.100
        Swap usage:   0%

        => / is using 88.7% of 912.89GB

    lvdisplay outputs:

        --- Logical volume ---
        LV Path                /dev/DeathStar-vg/root
        LV Name                root
        VG Name                DeathStar-vg
        LV Write Access        read/write
        LV Creation host, time DeathStar, 2013-05-18 22:21:11 -0400
        LV Status              available
        # open                 1
        LV Size                2.70 TiB
        Current LE             707789
        Segments               2
        Allocation             inherit
        Read ahead sectors     auto
        - currently set to     256
        Block device           252:0

        --- Logical volume ---
        LV Path                /dev/DeathStar-vg/swap_1
        LV Name                swap_1
        VG Name                DeathStar-vg
        LV Write Access        read/write
        LV Creation host, time DeathStar, 2013-05-18 22:21:11 -0400
        LV Status              available
        # open                 2
        LV Size                3.75 GiB
        Current LE             959
        Segments               1
        Allocation             inherit
        Read ahead sectors     auto
        - currently set to     256
        Block device           252:1

    vgdisplay outputs:

        VG Name               DeathStar-vg
        System ID
        Format                lvm2
        Metadata Areas        1
        Metadata Sequence No  4
        VG Access             read/write
        VG Status             resizable
        MAX LV                0
        Cur LV                2
        Open LV               2
        Max PV                0
        Cur PV                1
        Act PV                1
        VG Size               2.73 TiB
        PE Size               4.00 MiB
        Total PE              715335
        Alloc PE / Size       708748 / 2.70 TiB
        Free PE / Size        6587 / 25.73 GiB

    df outputs:

        Filesystem                     1K-blocks      Used Available Use% Mounted on
        /dev/mapper/DeathStar--vg-root 957238932 848972636  59634696  94% /
        none                                   4         0         4   0% /sys/fs/cgroup
        udev                             1864716         4   1864712   1% /dev
        tmpfs                             374968      1060    373908   1% /run
        none                                5120         4      5116   1% /run/lock
        none                             1874824       148   1874676   1% /run/shm
        none                              102400        24    102376   1% /run/user
        /dev/sda2                         234153     56477    165184  26% /boot

    And fdisk /dev/sda -l outputs:

        Disk /dev/sda: 3000.6 GB, 3000592982016 bytes
        255 heads, 63 sectors/track, 364801 cylinders, total 5860533168 sectors
        Units = sectors of 1 * 512 = 512 bytes
        Sector size (logical/physical): 512 bytes / 4096 bytes
        I/O size (minimum/optimal): 4096 bytes / 4096 bytes
        Disk identifier: 0x00000000

           Device Boot      Start         End      Blocks   Id  System
        /dev/sda1               1  4294967295  2147483647+  ee  GPT
        Partition 1 does not start on physical sector boundary.

    I just don't know what to make of all this and am not sure how I can make it use all 2.73TB. Thanks in advance for any help. EDIT: Yes, I did make changes to the LVM config, but it didn't do anything. As requested, output of parted -l /dev/sda:

        Model: ATA WDC WD30EFRX-68A (scsi)
        Disk /dev/sda: 3001GB
        Sector size (logical/physical): 512B/4096B
        Partition Table: gpt

        Number  Start   End     Size    File system  Name  Flags
         1      1049kB  2097kB  1049kB                     bios_grub
         2      2097kB  258MB   256MB   ext2
         3      258MB   3001GB  3000GB                     lvm

        Model: ATA WDC WD30EFRX-68A (scsi)
        Disk /dev/sdb: 3001GB
        Sector size (logical/physical): 512B/4096B
        Partition Table: msdos

        Number  Start  End  Size  Type  File system  Flags

        Model: Linux device-mapper (linear) (dm)
        Disk /dev/mapper/DeathStar--vg-swap_1: 4022MB
        Sector size (logical/physical): 512B/4096B
        Partition Table: loop

        Number  Start  End     Size    File system     Flags
         1      0.00B  4022MB  4022MB  linux-swap(v1)

        Model: Linux device-mapper (linear) (dm)
        Disk /dev/mapper/DeathStar--vg-root: 2969GB
        Sector size (logical/physical): 512B/4096B
        Partition Table: loop

        Number  Start  End     Size    File system  Flags
         1      0.00B  2969GB  2969GB  ext4
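
    Reading the outputs above: lvdisplay and vgdisplay show the root logical volume has already been grown to 2.70 TiB, but df shows the ext4 filesystem on it is still roughly 913GB. In other words, the LV was extended but the filesystem inside it never was. Assuming the device paths from the lvdisplay output, the usual fix is an online resize (ext4 supports growing while mounted, but take a backup first):

        # Optionally claim the last ~25GB of free extents as well:
        sudo lvextend -l +100%FREE /dev/DeathStar-vg/root

        # Grow the ext4 filesystem to fill the logical volume:
        sudo resize2fs /dev/mapper/DeathStar--vg-root

    After resize2fs completes, df should report the full capacity.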

    Read the article

  • How to Use the New Task Manager in Windows 8

    - by Chris Hoffman
    The Task Manager in Windows 8 has been completely overhauled. It's easier to use, slicker, and more feature-packed than ever. Windows 8 may be all about Metro, but the Task Manager and Windows Explorer are better than ever. The Task Manager now manages startup programs, shows your IP address, and displays slick resource usage graphs. The new color-coding highlights the processes using the most system resources, so you can see them at a glance.

    Read the article

  • Now Available: Oracle Utilities Customer Self Service Version 2.1

    - by Roxana Babiciu
    The Oracle Utilities Global Business Unit is pleased to announce the general availability of Oracle Utilities Customer Self Service 2.1. It is ready for customers and partners to download and install via the Oracle Software Delivery Cloud. Key Features & Benefits: Oracle Utilities Customer Self Service 2.1 includes several new capabilities and enhancements, including significantly improved Commercial Account Management and Advanced Notification Management using a new Oracle Utilities Notification Center module (licensed separately). These include the following:
    • Advanced Notification Management
    • Online Issues and Forms Management
    • Budget Management and Billing for Billed Budgets
    • Prepaid User Dashboard
    • Enhanced Usage Details Web Presentment
    • Start/Stop/Transfer Service Automation
    • Payment Arrangement Automation
    • Account Sets Management for Large Commercial Customers
    • Multiple Account Usage Data Aggregation, Comparison, and Data Download
    • Multiple Account Financial History
    • Mobile Outage Maps
    More information can be found on OPN.

    Read the article

  • Should we consider code language upon design?

    - by Codex73
    Summary: This question asks whether an application's intended usage should be a consideration when deciding on a development language, and what factors, if any, should be taken into account when making that choice. Application type: web. Question: Of the following popular languages, when should we use one or the other? Languages: PHP, Ruby, Python. My initial thought is that the language shouldn't be considered as much as the framework. Things to consider in a framework are scalability, usage, load, portability, modularity and many more. Things to consider in code writing may be cost, framework stability, community, etc.

    Read the article

  • Why should I declare a class as an abstract class?

    - by Pied Piper
    I know the syntax and rules that apply to abstract classes, and I want to understand the usage of an abstract class. An abstract class cannot be instantiated directly but can be extended by another class. What is the advantage of doing so? How is it different from an interface? I know that one class can implement multiple interfaces but can only extend one abstract class. Is that the only difference between an interface and an abstract class? I am aware of the usage of an interface; I learned that from the event delegation model of AWT in Java. In which situations should I declare a class as abstract? What are the benefits of that?

    Read the article

  • Azure

    - by Grant Fritchey
    I've been tasked to learn SQL Azure, as well as test all the Red Gate products on it. My one, BIG, fear has been that I'll receive some mongo bill in the mail because I've exceeded the MSDN testing limit. I know people that have had that problem. I've been trying to keep an eye on my usage, but, let's face it, it's not something I think about every day. But now I don't have to. Red Gate has been working with Azure since long before I showed up. They have already released a little piece of software that I just found out about: it's called CloudTally. It gathers your usage and sends you a daily email so you can know if you're starting to approach that limit. Check it out, it's free.

    Read the article

  • Store VOD wmi data in a database directly or use CQRS?

    - by JD01
    I need to collect video-on-demand bandwidth usage every few minutes (or maybe every few seconds) and store it in a database so users can produce graphs of bandwidth usage over a period of time (a few hours, days, weeks or even possibly months). The sort of data that will be stored is the number of users watching videos, current server bandwidth (Mb/s), multicast bit rate, etc. I am wondering whether using CQRS with event sourcing would be a good approach, as I could then rebuild my objects to create different projections (i.e. different graphs/reports, etc.), but then again it seems like I am introducing complexity that might not be needed. Or would it be best to just put the data directly into a database (currently Postgres) and query off that? Having thought about it, my table is a form of audit log anyway, so I don't think I need event sourcing at all. Any thoughts?
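
    For what it's worth, if the plain-database route wins, the table really is just an append-only log that projections can be queried straight out of. A sketch of what that might look like (database, table and column names are invented for illustration; Postgres via psql):

        psql -d vod -c "
        CREATE TABLE bandwidth_sample (
            sampled_at     timestamptz NOT NULL DEFAULT now(),
            viewers        integer     NOT NULL,
            server_mbps    numeric     NOT NULL,
            multicast_kbps numeric     NOT NULL
        );"

    Because rows are never updated, this keeps the main benefit of event sourcing (an immutable history to re-aggregate later) without the extra machinery.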

    Read the article

  • Navigate Quickly with JustCode and Ctrl+Click

    Ctrl+Click is a widely used shortcut for Go To Definition in many development environments, but not in Visual Studio. We, the JustCode team, find it really useful, so we added it to Visual Studio. But we didn't stop there: we improved it even further. Read on for the details. With JustCode you get an enhanced Go To Definition. By default you can execute it in the Visual Studio editor using one of the following shortcuts: Middle Click, Ctrl+Left Click, F12, Ctrl+Enter, Ctrl+B. The first usage of this feature is not much different from the default Visual Studio Go To Definition command: use it where a member, type, method, property, etc. is used, to navigate to the definition of that item. For example, if you have this method:

        public void Start()
        {
            lion = new Lion();
            lion.Roar();
        }

    If you hold Ctrl and click on the usage of lion, you will go to the lion member definition. If you hold Ctrl and click on Lion, you will go to the Lion class definition. What we added is the ability to easily find all the usages of the item you just navigated to. For example:

        public class Lion
        {
            public void Roar()
            {
                Console.WriteLine("Rhaaaar");
            }
        }

    If you hold Ctrl and click on the Lion definition, you will see all the usages of the Lion type; if you click on the Roar method definition, you will see all the usages of the Roar method. And if there is only one usage, you will be taken automatically to that usage. In the examples I use C#, but it also works in VB.NET, JavaScript, ASP.NET and XAML. Why do we like this feature? Let me first start with how Ctrl+Click (or the Go To Definition command) is used. We noticed that developers use it especially in what we call "code browsing sessions". In simple words, this is when you browse around the code looking for a bug, just reading the code, or searching for something. Sounds familiar? In our experience, when you go to the definition of some item you often want to know more about it, and the first thing you need is to find its usages. With JustCode this is just one click away. Why Ctrl+Click/Middle Click over F12/Ctrl+Enter/Ctrl+B? Actually, you can use all of them. But during these "code browsing sessions" we noticed that most developers use the mouse. So the mouse is already in use, and pressing Ctrl+Click (or the Middle Click) is very natural. During heavy coding sessions, or if you are a keyboard type of developer, F12 (or any of the other keyboard shortcuts) is the key. We use this feature heavily, not only in our team but in the whole company. It saves us a bit of time many times a day. And it adds up. We hope you will like it too. Your feedback is more than welcome. P.S. If you don't want JustCode to capture the Ctrl+Click and the Middle Click in the editor, you can change that in JustCode->Options->General in the Navigation group. Keyboard shortcuts can be reassigned using the Visual Studio keyboard shortcuts editor.

    Read the article

  • What causes the iOS OpenGLES driver to allocate extra memory?

    - by Martin Linklater
    I'm trying to optimize the memory usage of our iOS game and I'm puzzled about when/why the iOS GLES driver allocates extra memory at runtime... When I run our game through Instruments with the OpenGL ES Driver instrument, the gartUsedBytes value can fluctuate quite wildly. We preload all our textures and build the buffer objects up front, so it's not the game engine requesting extra memory from GL. Currently we are manually requesting around 50MB of GL memory, yet the gartUsedBytes value sits at around 90MB most of the time, peaking at 125MB from time to time. It seems to be linked to what you are rendering that frame: our PVS only submits VBOs for visible meshes. Can anyone shed some light on what the driver is doing in the background? Like I said earlier, all our game engine allocations are done at level load, so in theory there shouldn't be any fluctuation in GL memory usage while the level is running. Thanks.

    Read the article

  • Using "prevent execution of method" flags

    - by tpaksu
    First of all, I want to illustrate my concern with some pseudocode (I think you'll understand better). Assume you have a global debug flag, or a class variable named "debug":

        class a :
            var debug = FALSE

    and you use it to enable debug methods. There are two types of usage, as far as I know. First, checking the flag in the calling method:

        method a :
            if debug then call method b;
        method b :

    Second, checking it in the method itself:

        method a :
            call method b;
        method b :
            if not debug exit

    And I want to know: is there any file I/O or stack-wise difference between these two approaches? Which usage is better, safer, and why?

    Read the article

  • Web api authentication techniques

    - by Steve
    We have an ASP.NET MVC web service framework for serving out XML/JSON for people's GET requests, but we are struggling to figure out the best way (fast, easy, trivial for users coding with JavaScript or OO languages) to authenticate users. It's not that our data is sensitive or anything; we just want users to register so we can have their email address to notify them of changes and track usage. In our previous attempt we had the username in the URI and would just make sure that username existed and increment db tables with usage. This was super basic, but we'd notice people using "demo" as a username, etc., so we need it to be a little more sophisticated. What authentication techniques are available? What do the major players use/do?
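
    One common lightweight step up from a bare username is a per-user API key issued at registration and sent as a request header, which keeps URIs clean and is trivial from JavaScript or any HTTP client. A sketch of the client side (the host, path and header name here are invented for illustration):

        # The key identifies the registered user; the server maps it to their
        # email address and increments the same usage tables as before:
        curl -H "X-Api-Key: 7f9c2ba4e88f827d616045507605853e" \
             "https://api.example.com/routes?format=json"

    Unlike a guessable username, a random key can't be borrowed casually, and revoking one key doesn't disturb anything else. The heavier option some major players (e.g. AWS) use is signing each request with a shared secret (HMAC), which avoids sending the secret itself over the wire.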

    Read the article

  • RPG Item processing

    - by f00b4r
    I started working on an item system for my (first) game, and I'm having a problem conceptualizing how it should work, since items can produce a bunch of potentially non-standard actions (reviving a character vs. increasing some stat) or have use restrictions (revive can only be used if a character is dead). For obvious reasons, I don't want to create a new Item class for every item type. What is the best way to handle this? Should I make a handful of item types (field modifiers, status modifiers, ...)? Is it normal to script item usage? Could (should?) this be combined with the above-mentioned solution (have a couple of different item subtypes, and make special-case item usage scripted)? Thanks.

    Read the article

  • Get system info from C program?

    - by Hamid
    I'm writing a little program in C that I want to use to output some system stats to my HD44780 16x2 character display. The system I'll be working with is a Debian ARM system and, although irrelevant, the display is on the GPIO header (the system is a Raspberry Pi). As an initial (somewhat unambitious) attempt, I'd like to start with something simple like RAM and CPU usage (I'm new to C). I understand that if I make external command calls I need to fork() and execve() (or some equivalent that will let me return the results). What I would like to know is how I go about getting the information I want in a nice clean format that I can use. Surely I will not have to call, for example, free -h and then use awk or similar to chop out the piece I want? There must be a cleaner way? The question should be seen as more of a generic one: what is best practice for getting info about the system in C? (The RAM/CPU usage are just an initial example.)
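
    A hedged pointer rather than a full answer: on Linux, tools like free and top get their numbers from procfs, so a C program can simply fopen() and parse these text files, with no fork or awk needed. What the relevant files contain (standard paths on Debian-family systems):

        # Memory stats, the same source free uses:
        grep -E '^(MemTotal|MemFree|Buffers|Cached):' /proc/meminfo

        # Aggregate CPU counters in jiffies; sample twice and diff
        # the deltas to compute a usage percentage:
        head -1 /proc/stat

        # Load averages, if those are enough:
        cat /proc/loadavg

    In C, a loop of fgets() plus sscanf() over /proc/meminfo gives the values directly; the same pattern works for /proc/stat.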

    Read the article

  • Ubuntu gets slower by the day

    - by Doug
    I've noticed that Ubuntu has been getting slower and slower to boot, launch programs, etc. I installed 12.04 about 4 months ago, now 12.10, running on a quad-core Q8300 Intel, 4GB RAM, and an 80GB WD IDE drive. For some reason (ever since 11.04), I've noticed that right after installation the speed is good, but the longer I have the OS installed, every bootup gets slower and slower, launching programs gets slower, and frame rates change radically (the onboard GF9400 gets anywhere from 60fps down to 12 in the worst cases). I would think maybe the HD is the issue; however, I installed 11.10 on a 160GB SATA drive, and the same thing occurred. Looking at system resources, I'm holding steady at 1GB memory usage (I have 4GB, but it's actually showing 3.6GB, dunno why), no swap usage, and using right around 4% CPU currently. HD capacity is only 28% used. Has anyone else run into this issue? I love Ubuntu to death, but using distros other than Ubuntu, I don't have this problem.
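
    A few low-risk checks that sometimes explain gradual slowdown; these are assumptions to test, not a diagnosis:

        # Drive health: a slowly failing disk shows up as creeping latency
        # (smartmontools package):
        sudo smartctl -H /dev/sda

        # Anything that has accumulated in the login session over time:
        ls ~/.config/autostart /etc/xdg/autostart

        # Whether the CPU is actually stuck waiting on I/O (sysstat package):
        iostat -x 2 5

    As for the missing 0.4GB of RAM: seeing about 3.6GB of 4GB usually means a 32-bit kernel without PAE, where address space reserved for hardware eats the difference.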

    Read the article
