Search Results

Search found 2327 results on 94 pages for 'quad monitors'.


  • FreeNAS running on ESXi - sometimes gets very sluggish.

    - by Luma
    Hello everyone. I have an ESXi server (dual quad-core, 8GB of DDR3 RAM, 6x 1TB WD Blacks running in RAID 5 on the PERC 6/i controller). I have a 64-bit FreeNAS VM running, and on this VM I keep about 200GB of stuff that my Windows machines access. Every now and then the throughput of this VM just dies; for example, right now it can't even handle streaming a song, and when I try to transfer a folder the speed bounces between 10 and 400KB/s. I should add at this point that the ESXi box has dual gigabit network cards plugged into a good, solid gigabit switch, and the other Linux and Windows VMs are just fine - I have frequently seen speeds over 90MB/s. The server still has RAM left over (plenty, actually) and CPU usage is very low (500-1000MHz). Any ideas what could cause this? Thanks. Luc

    Read the article

  • How do you enable multi-core virtualization in Windows 8 Pro?

    - by Greg B
    I've just got a new Dell Vostro 470 with a quad-core (8 threads) i7 3770 and I'm trying to run virtual machines on it, which works fine, except that I can't assign multiple cores to a VM. I've checked the BIOS, which states Intel Virtualization Technology [Enabled], but both Hyper-V and VirtualBox will only allow me to assign a single core. If I run the Intel Processor Identification Utility on the host OS, it tells me that Intel Virtualization Technology isn't supported by the processor, but according to the Intel website, it is. So what's going on? Have Dell clipped the i7's wings? Is there some config in Windows I need to change?

    Read the article

  • How can I judge the suitability of modern processors for systems with specific CPU requirements?

    - by Iszi
    Inspired by this question: How do I calculate clock speed in multi-core processors? The answers in the above question do a fair job of explaining why a lower-speed multi-core processor won't necessarily perform at the same level as a higher-speed single-core processor. Example: 4*2=8, but a quad-core 2 GHz processor isn't necessarily as fast as a single-core 8 GHz processor. However, I'm having a hard time putting the information in those answers to practical use in my mind. Particularly, I want to know how it should be used to judge whether a given CPU is appropriate for an application with specific requirements. Example scenarios: An application has a minimum CPU requirement of 2.4 GHz dual-core. Another application has a minimum CPU requirement of 1.8 GHz single-core. For either of the above scenarios: Would a higher-speed processor with fewer cores, or a lower-speed processor with more cores, be equally sufficient? If so, how can we determine the appropriate processor speeds required for a given number of cores?

    Read the article

  • Metrics - A little knowledge can be a dangerous thing (or 'Why you're not clever enough to interpret metrics data')

    - by Jason Crease
    At RedGate Software, I work on a .NET obfuscator called SmartAssembly. Various features of it use a database to store various things (exception reports, name-mappings, etc.). The user is given the option of using either a SQL Server database (which requires them to have Microsoft SQL Server) or a Microsoft Access MDB file (which requires nothing). MDB is the default option, but power-users soon switch to a SQL Server database because it offers better performance and data-sharing.

    In the fashionable spirit of optimization and metrics, an obvious product-management question is: which is the most popular, SQL Server or MDB? We've collected data about this, using our 'Feature-Usage-Reporting' technology (available as part of SmartAssembly) and, more recently, our 'Application Metrics' technology:

        Parameter     Number of users   % of total users   Number of sessions   Number of usages
        SQL Server    28                19.0               8115                 8115
        MDB           114               77.6               1449                 1449

    (As a disclaimer, please note that SmartAssembly has far more than 132 users. This data is just a selection from one build.)

    So, it would appear that SQL Server is used by fewer users, but more often. Great. But here's why these numbers are useless to me:

    Only the original developers understand the data

    What does a single 'usage' of 'MDB' mean? Does this happen once per run? Once per option change? On clicking the 'Obfuscate Now' button? When running the command-line version or just from the UI version? Each question could skew the data 10-fold either way, and the answers are only known by the developer who instrumented the application in the first place. In other words, only the original developer can interpret the data - product-managers cannot interpret the data unaided.

    Most of the data is from uninterested users

    About half of the people who download and run a free trial from the internet quit it almost immediately. Only a small fraction use it sufficiently to make informed choices. Since the MDB option is the default one, we don't know how many of those 114 were people CHOOSING to use the MDB, or how many were JUST HAPPENING to use the MDB default for their 20-second trial. This is a problem we see across all our metrics: are people using X because it's the default, or because they want to use X? We need to segment the data further, asking what percentage of each percentage meets our criteria for an 'established user' or 'informed user'. You end up spending hours writing sophisticated and dubious SQL queries to segment the data further. Not fun.

    You can't find out why they used this feature

    Metrics can answer the when and what, but not the why. Why did people use feature X? If you're anything like me, you often click on random buttons in unfamiliar applications just to explore the feature-set. If we listened uncritically to metrics at RedGate, we would eliminate the most important and more complex features which people actually buy the software for, leaving just big buttons on the main page and the About box.

    "Ah, that's interesting!" rather than "Ah, that's actionable!"

    People do love data. Did you know you eat 1201 chickens in a lifetime? But just 4 cows? Interesting, but useless. Often metrics give you a nice number: '5.8% of users have 3 or more monitors'. But unless the statistic is both SURPRISING and ACTIONABLE, it's useless. Most metrics are collected, reviewed with lots of cooing, and then forgotten. Unless a piece of data could change things, it's useless to collect it.

    People get obsessed with significance levels

    The first thing that lots of people do with this data is run a t-test to get a significance level ("Hey! We know with 99.64% confidence that people prefer SQL Server to MDBs!"). Believe me: other causes of error and misinterpretation in your data are FAR more significant than your t-test could ever comprehend.

    Confirmation bias prevents objectivity

    If the data appears to match our instinct, we feel satisfied and move on. If it doesn't, we suspect the data and dig deeper, plummeting down a rabbit-hole of segmentation and filtering until we give up and move on. Data is only useful if it can change our preconceptions. Do you trust this dodgy data more than your own understanding, knowledge and intelligence? I don't.

    There are always multiple plausible ways to interpret/action any data

    Let's say we segment the above data, and get this:

    Post-trial users (i.e. those using a paid version after the 14-day free trial is over):

        Parameter     Number of users   % of total users   Number of sessions   Number of usages
        SQL Server    13                9.0                1115                 1115
        MDB           5                 4.2                449                  449

    Trial users:

        Parameter     Number of users   % of total users   Number of sessions   Number of usages
        SQL Server    15                10.0               7000                 7000
        MDB           114               77.6               1000                 1000

    How do you interpret this data? It's one of:
    - Mostly SQL Server users buy our software. People who can't afford SQL Server tend to be unable to afford, or unwilling to buy, our software. Therefore, ditch MDB support.
    - Our MDB support is so poor and buggy that our massive MDB user-base doesn't buy it. Therefore, spend loads of money improving it, and think about ditching SQL Server support.
    - People 'graduate' naturally from MDB to SQL Server as they use the software more. Things are fine the way they are.
    - We're marketing the tool wrong. The large number of MDB users represents uninformed downloaders. Tell marketing to aggressively target SQL Server users.
    To choose an interpretation you need to segment again. And again. And again, and again.

    Opting out is correlated with feature usage

    Metrics tend to be opt-in. This skews the data even further. Between 5% and 30% of people choose to opt in to metrics (often called a 'customer improvement program' or something like that). Casual trial users who are uninterested in your product or company are less likely to opt in. This group is probably also likely to be MDB users. How much does this skew your data by? Who knows?

    It's not all doom and gloom. There are some things metrics can answer well:
    - Environment facts. How many people have 3 monitors? Have Windows 7? Have .NET 4 installed? Have Japanese Windows?
    - Minor optimizations. Is the text-box big enough for average user input?
    - Performance data. How long does our app take to start? How many databases does the average user have on their server?
    As you can see, questions about who-the-user-is rather than what-the-user-does are easier to answer and action.

    Conclusion

    Use SmartAssembly. If not for the metrics (called 'Feature-Usage-Reporting'), then at least for the obfuscation/error-reporting. Data raises more questions than it answers. Questions about environment are the easiest to answer.
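    As a footnote on segmentation: the 'established user' cut described above boils down to something like the sketch below. The record fields and the 10-session threshold are invented for illustration - this is not our real schema or our real query, just the shape of the idea:

        from collections import defaultdict

        # Invented example records - real Feature-Usage-Reporting data looks nothing
        # like this, but the segmentation idea is the same.
        records = [
            {"user": "a", "option": "SQL Server", "trial": False, "sessions": 120},
            {"user": "b", "option": "MDB",        "trial": True,  "sessions": 1},
            {"user": "c", "option": "MDB",        "trial": False, "sessions": 45},
        ]

        def is_established(r, min_sessions=10):
            # One possible (and debatable) definition of an "established user".
            return not r["trial"] and r["sessions"] >= min_sessions

        counts = defaultdict(int)
        for r in records:
            segment = "established" if is_established(r) else "casual/trial"
            counts[(segment, r["option"])] += 1

        for (segment, option), n in sorted(counts.items()):
            print(segment, option, n)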

    Read the article

  • Wireless drops on HP ENVY dv6 with RT3290 wireless, worked without problem prior to upgrading to Ubuntu 13.10, can it be fixed?

    - by Tim
    I have an HP ENVY dv6 Notebook PC with an AMD A10 quad-core and RT3290 wireless. Since I upgraded from Ubuntu 13.04 to 13.10, the wireless connects but then drops after a few minutes or longer, whether or not I am running openconnect to get through a VPN. If I attempt to run a remote X client (e.g. a remote xterm) it drops. If I don't run an X client, it disconnects after a while, requiring a reload of the driver and a reconnect. Wireless info:

        $ sudo lshw -c network
        *-network
             description: Wireless interface
             product: RT3290 Wireless 802.11n 1T/1R PCIe
             vendor: Ralink corp.
             physical id: 0
             bus info: pci@0000:02:00.0
             logical name: wlan0
             version: 00
             serial: 68:94:23:a7:09:cb
             width: 32 bits
             clock: 33MHz
             capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless
             configuration: broadcast=yes driver=rt2800pci driverversion=3.11.0-12-generic firmware=0.37 ip=192.168.1.115 latency=0 link=yes multicast=yes wireless=IEEE 802.11bgn
             resources: irq:55 memory:f0210000-f021ffff

    I have successfully built and installed the MediaTek driver, with no luck on connecting; the system then hangs on reboot and I have to recover/undo the changes to boot successfully.

    Read the article

  • Ubuntu 12.04 64 bit "unable to find medium with live filesystem" AFTER normal install

    - by user88710
    So, I got a new computer (64-bit quad-core, yada yada).
    - Pulled my Ubuntu SSD drive from the old machine and installed it into the new machine (my intention here is to have Ubuntu installed on the 120GB SSD, Win7 on the main drive).
    - Downloaded 64-bit Ubuntu and burned it to a disk.
    - Rebooted with the Live CD and installed Ubuntu to the SSD drive with no problems.
    - Rebooted again, got the GRUB menu, selected Ubuntu, and after a minute got this: "unable to find medium with live filesystem".
    Booting into Windows, Explorer doesn't even see the SSD. Device Manager sees it, though; I assume this is because it's formatted with ext4. So, the Live CD saw the SSD just fine and the install went fine, but when I try to boot Ubuntu I get the error above. Help!
    UPDATE: Small update. Windows did a software update that apparently wiped out my GRUB, so I guess GRUB was installed on the main drive. I reinstalled Ubuntu (again) on the SSD drive, but still no joy with booting from it - same error message as above.

    Read the article

  • OpenGL sprites and point size limitation

    - by Srdan
    I'm developing a simple particle system that should be able to perform well on mobile devices (iOS, Android). My plan was to use the GL_POINT_SPRITE/GL_PROGRAM_POINT_SIZE method because of its efficiency (GL_POINTS are enough), but after some experimenting I found myself in trouble: sprite size is limited (usually to 64 pixels). I'm calculating size using the formula

        gl_PointSize = in_point_size * some_factor / distance_to_camera

    to make particle sizes proportional to the distance to the camera. But at some point, when the camera is close enough, the size limitation kicks in and the whole system starts looking unrealistic. Is there a way to avoid this problem? If not, what's the alternative? I was thinking of manually generating a billboard quad for each particle. Now, I have some questions about that approach. I guess the minimum geometry data would be four vertices per particle and an index array to make quads from these vertices (with GL_TRIANGLE_STRIP). Additionally, for each vertex I need a color and a texture coordinate. I would put all that in an interleaved vertex array. But as you can see, there is much redundancy: all vertices of the same particle share the same color value, and the four texture coordinates are the same for all particles. Because of how glDrawArrays/Elements works, I see no way to optimise this. Do you know of a better approach to organising per-particle data? Should I use buffers or vertex arrays, or is there no difference because each time I have to update all particles' data? About the particle simulation... where should I do it? On the CPU or on the vertex processors? Something tells me that a mobile CPU would do it faster than its vertex unit (at least today, in 2012 :). So, any advice on how to make a simple and efficient particle system without the particle size limitation, for mobile devices, would be appreciated. (Animation of the camera passing through particles should look realistic.)
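    To be concrete about the layout I have in mind, here is a rough sketch (in Python, just to show the CPU-side data layout; the 9-float interleaved format, the camera right/up vectors and the use of indexed triangles rather than strips are all assumptions, not a recommendation):

        import numpy as np

        # Corner offsets in the camera's right/up plane, and per-corner UVs
        # shared by every particle's quad.
        CORNERS   = [(-0.5, -0.5), (0.5, -0.5), (0.5, 0.5), (-0.5, 0.5)]
        TEXCOORDS = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]

        def build_quads(particles, cam_right, cam_up):
            """particles: list of (center_xyz, size, rgba). Returns (vertices, indices)."""
            cam_right = np.asarray(cam_right, dtype=np.float32)
            cam_up    = np.asarray(cam_up, dtype=np.float32)
            verts = np.zeros((len(particles) * 4, 9), dtype=np.float32)  # x,y,z,r,g,b,a,u,v
            index = np.zeros(len(particles) * 6, dtype=np.uint32)
            for i, (center, size, rgba) in enumerate(particles):
                for j, ((cx, cy), uv) in enumerate(zip(CORNERS, TEXCOORDS)):
                    v = i * 4 + j
                    verts[v, 0:3] = np.asarray(center, dtype=np.float32) + size * (cx * cam_right + cy * cam_up)
                    verts[v, 3:7] = rgba   # same colour repeated for all 4 corners
                    verts[v, 7:9] = uv     # same UVs for every particle
                b = i * 4
                index[i * 6:(i + 1) * 6] = [b, b + 1, b + 2, b, b + 2, b + 3]
            return verts.ravel(), index

        # e.g. build_quads([((0, 0, -5), 1.0, (1, 1, 1, 0.5))], (1, 0, 0), (0, 1, 0))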

    Read the article

  • UEFI - Linux Mint Boot from USB Doesn't work

    - by Joe Bennett
    I'm running Linux Mint (the only OS, other than ones in VirtualBox) and want to remove it. I've created a live USB of Windows 8 using the Windows 7 USB/DVD Download Tool - yes, I know it says Windows 7, but from the research I've done it reportedly works for Windows 8 as well. The ISO was loaded on just fine (AFAIK).
    - The computer came with Windows 8 pre-installed.
    - I have Safe Boot and Fast Boot disabled in the BIOS settings.
    - I have USB as my first boot option.
    - I have tried both the USB 2.0 and 3.0 ports.
    Yet Mint is all that will boot up. Anybody have a similar issue? If it helps, the computer is a Toshiba Satellite S855D laptop with an AMD APU quad-core processor (3 CPU, 1 GPU).

    Read the article

  • Sprite Sheets in PyGame?

    - by Eamonn
    So, I've been doing some googling and haven't found a good solution to my problem. I'm using PyGame and I want to use a sprite sheet for my player - specifically, a sprite sheet strip. Basically, I have a strip of 32x32 'frames': the frames are all in one image, alongside each other, so I have 3 frames in 1 image. I'd like to be able to use them as my sprite sheet without having to crop them up into separate files. I have used an awesome, popular and easy-to-use game framework for Lua called LÖVE. LÖVE has these things called "Quads" (similar to texture regions in libGDX, if you know what those are). Basically, quads let you use parts of an image: you define how large a quad is and which region of the image it covers. I would like to do something similar in PyGame, using a "for" loop to go through the entire image width and height, mark each 32x32 area (or whatever the user defines as their desired frame width and height), and store those in a list for use later on. I'd define an animation speed and such, but that's for later. I've been looking around on the web and I can't find anything that will do this. I found one script on the PyGame website, but it crashed PyGame when I tried to run it, and I spent hours trying to fix it with no luck. So, is there a way to do this? Is there a way to get regions of an image? Am I going about this the wrong way? Is there a simpler way to do this? Thanks! :-)
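    For what it's worth, here is roughly what I have in mind - a rough sketch, where the file name and the 32x32 frame size are just examples. Is this the right approach, or is there something built in?

        import pygame

        def load_strip(path, frame_w, frame_h):
            """Slice a sprite-sheet strip into frames, much like LOVE's Quads."""
            sheet = pygame.image.load(path).convert_alpha()
            frames = []
            for y in range(0, sheet.get_height(), frame_h):
                for x in range(0, sheet.get_width(), frame_w):
                    # subsurface shares pixels with the sheet - no cropping into files
                    frames.append(sheet.subsurface(pygame.Rect(x, y, frame_w, frame_h)))
            return frames

        # e.g. (after pygame.display.set_mode(), which convert_alpha() needs):
        # frames = load_strip("player_strip.png", 32, 32)
        # screen.blit(frames[frame_index], (player_x, player_y))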

    Read the article

  • Dell Studio XPS 16 Runs Hot

    - by dtbarne
    Specs:
    - 1920x1080 display
    - i7 1.6 GHz quad-core
    - 6GB RAM
    - 1GB ATI Radeon HD 6570M/5700 Series
    - 500GB 7200rpm hard drive
    I love this laptop for many reasons, but it constantly runs hot (the CPU sits in the low 70s with basic tasks, and 80+ is not uncommon) and I'm finding it too much to deal with. The laptop feels very hot (almost too hot for a lap) and often gets so hot that the OS slows down or freezes altogether. I've tried cleaning it out and even replacing the thermal paste. I often use an external cooler, but it only helps 3-5 degrees and it's a pain to have to use. I've come to the conclusion that it just runs hot. I have two questions: What is to blame - the i7 processor, the graphics card, or just poor cooling in this laptop? And does the Dell XPS 15 run cooler? I'm looking at replacing my current laptop, but I don't want to run into the same problem.

    Read the article

  • How do you get past the Analysis to Paralysis when working on a new project?

    - by Cape Cod Gunny
    I've been struggling with how to get my project going. I've got an old software package that is in desperate need of a rewrite. I haven't compiled the source code since 2004. It still sells and it's stable, but it does require "Run this program in compatibility mode for:" on a lot of the newer Windows systems. It's also one of those hard-coded 640x480 screen resolution programs. Yuck! I can't seem to get started with this rewrite. I'm constantly fiddling around with different things. I'll play around with different fluid layouts for a while. Then I start looking at how the main menu should work and look. I quickly find out that there's this thing called "Cool Bars" and I'll spend hours playing with that. Then I start thinking about stuff like "Oh, I need to make sure that the screen sizes are preserved, so when the application gets relaunched it remembers how the screens were positioned." Which leads to: what happens if they have two monitors? Which leads to: what happens if they have a quad screen? Yikes, it's got to stop. I have always been a slow starter. I think about stuff long and hard up front. This has always plagued me. Once I get my mind made up then bam... I'm off and running. I'm looking for advice from other one-person software companies that can help someone like me get off to a quicker start.

    Read the article

  • Configuration Tuning for PostgreSQL 9.1 PostGIS 1.5 Ubuntu 12.04 Server

    - by Martin
    My server performance is poor. At times SSH, top, and other commands are very slow to respond, taking several seconds or more. A query that normally takes 5 minutes can sometimes take 30 minutes. The database is mostly being used to do a spatial query (grid and summarize) on approximately 500GB of stored data spread between 4 tables. Restarting the server works as a temporary fix, but cannot be used as a long-term solution. Any suggestions for how to diagnose and solve my performance issues? Hardware and configuration:
    - 3.3 GHz Intel quad-core i5
    - 16 GB DDR3 RAM
    - 6 TB software RAID 10 (6 x 2 TB drives)
    - Ubuntu 12.04 64-bit
    - Postgres 9.1
    - PostGIS 1.5
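    For reference, here is roughly where my postgresql.conf is heading, based on the generic "25% of RAM"-style advice I've read. These are starting guesses for a 16 GB box, not tuned values, so please correct anything that looks wrong:

        shared_buffers = 4GB            # ~25% of RAM is the usual starting point
        effective_cache_size = 12GB     # roughly what the OS page cache can hold
        work_mem = 64MB                 # per sort/hash; big spatial queries may need more
        maintenance_work_mem = 1GB      # speeds up VACUUM and index builds
        checkpoint_segments = 32        # spread out checkpoint I/O (9.1-era setting)
        wal_buffers = 16MB
        random_page_cost = 3.0          # spinning-disk RAID 10; lower if data is mostly cached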

    Read the article

  • Computer won't start unless power is removed for ~5 minutes

    - by Paul Tarjan
    I have a fairly standard two-year-old desktop computer (quad-core Intel, single hard drive, decent video card, 300W power supply) which recently started acting up. I'm not sure what the cause is, so hopefully you can help. Sometimes (once a week-ish) I press the power button and nothing happens: no blinking, no sounds, no nothing. If I remove the power cord (or flip the switch on the power supply) I hear a capacitor discharge. If I leave it in the "no power at all" state for about 5 minutes, then I can plug it back in and the computer works perfectly. What is the issue? What do you think I have to replace?

    Read the article

  • Issues connecting to WPA2 with User Authentication Mavericks?

    - by heinst
    I was on all the builds of the Mavericks beta and connecting to my university's network was fine. Then I upgraded to the public release and now I can't seem to connect to the internet. I can connect to other networks, but not my school's. It's a WPA2 network with user authentication. My MacBook is a 2011(?) 2.2 GHz first-generation i7 quad-core with 8 GB of RAM. Does anyone else have the same issue? Any tips on how to fix it? Thanks! heinst

    Read the article

  • AMD Processors and the Windows Phone 8 Emulator

    - by Aj Patel
    I would madly appreciate it if anyone in this community would help me with my question. The background is that I want to develop Windows Phone 8 applications, but neither of my current computers' processors has the Hardware Virtualization and Second Level Address Translation needed to run the emulator. I have my eye on an AMD-based computer, the g7-2243us (I like it because it has a 1600x900 screen resolution). I looked up this link, which shows that this computer's AMD processor (Next Gen AMD Quad-Core A8-4500M Accelerated, 1.9GHz up to 2.8GHz, 4MB L2 cache) supports AMD-V Hardware Virtualization. So, will this computer be able to run the emulator? Thank you so much for your answers. I'm pretty sure it will run the emulator, but I just want to make sure before spending $400. Thank you all so much.

    Read the article

  • Win 2003 STD network adapter always shows DHCP when in a static IP configuration, and it loses the DNS settings

    - by Darragh
    Hi, I have a server that was on DHCP after its first configuration. I have since joined it to our domain and given it a static IP; however, after a few moments it returns to DHCP, with only some of the IPv4 settings staying the same - it loses DNS, for example. I'm not sure what is causing the problem; all I know is that this started to happen after I added it to the domain. Would it be a domain policy? Or the NIC drivers? Spec:
    - Dell M605 blade server
    - Windows 2003 STD SP1
    - Intel Xeon quad-core
    - NIC: dual embedded Broadcom NetXtreme II 5708 Gigabit Ethernet NIC w/ TOE

    Read the article

  • Javascript Canvas Drawing Efficiency

    - by jujumbura
    I have just recently started some experiments with game development in JavaScript/HTML5, and so far it has been going pretty well. I have a simple test scene running with some basic input handling and a hundred-ish drawImage() calls with a few transforms. This all runs great on Chrome, but unfortunately it already chugs on Firefox. I am using a very large canvas (1920 x 1080), but it doesn't seem like I should be hitting my limit already. So on that note, I was hoping to ask a few questions:
    1) What exactly is done on the CPU vs. the GPU in terms of canvas and drawImage()? I'm afraid the answer is probably "it depends on the browser", but can anybody give me some rules of thumb? I naively imagined that each drawImage call results in a textured quad on the GPU, with the canvas effectively being a render target, but I'm wondering if I'm pretty far off base there...
    2) I have seen posts here and there with people saying not to use the translate(), rotate(), scale() functions when drawing on the canvas. Am I adding a lot of overhead just by adding a translate() call, as opposed to passing the x,y to drawImage()? Some people suggest using "translate3d", etc., which are CSS properties, but I'm not sure how to use them within a scene. Can they be used for animated sprites within a single canvas?
    3) I have also seen a lot of posts with people mentioning that pre-building canvases and then re-using them is a lot faster than issuing all the individual draw calls again. I am guessing that my background should definitely be pre-built into a canvas, but how far should I take this? Should I maintain an individual canvas for each sprite, to cache all static image data when not animating?
    Thank you much for your advice!

    Read the article

  • PC reboots spontaneously: debugging tips

    - by aaron
    I swapped my Core 2 Duo for a quad-core recently, and generally things run fine, but every now and then my computer just restarts. I don't even get a blue screen (Vista 32). Core temperature isn't a problem. My thinking is that my power supply is inadequate, but I haven't been able to test that (one idea was to underclock the CPU to see if that helped, but going up in speed was the only simple thing to do in the BIOS). Two cases where I semi-consistently get problems:
    - Borderlands, windowed, after some period of time (and some other games, but Borderlands does it pretty regularly)
    - watching a video (e.g. QuickTime/VLC) while having another video running
    Another thought is non-CPU heat? Maybe the graphics card? Any thoughts on how to track this down are appreciated. Thanks!

    Read the article

  • Freescale One Box Unboxing (then installing Java SE Embedded technology)

    - by hinkmond
    So, I get a FedEx delivery the other day... "What cool device could be inside this FedEx Overnight Express Large Box?" I was wondering... Could it be a new Linux/ARM target device board, faster than a Raspberry Pi and better than a BeagleBone Black??? Why, yes! Yes, it was a Linux/ARM target device board, faster than anything around! It was a Freescale i.MX6 Sabre Smart Device Board (SDB)! Cool... Quad Core ARM Cortex A9 1GHz with 1GB of RAM. So, cool... I installed the Freescale One Box OpenWRT Linux image onto its SD card and booted it up into Linux. But, wait! One thing was missing... What was it? What could be missing? Why, it had no Java SE Embedded installed on it yet, of course! So, I went to the JDK 7u45 download link. Clicked on "Accept License Agreement", and clicked on "jdk-7u45-linux-arm-vfp-sflt.tar.gz", installed the bad boy, and all was good. Java SE Embedded 7u45 on a Freescale One Box. Nice... Hinkmond

    Read the article

  • 11.10 liveCD black screen

    - by Shaun Killingbeck
    I'm attempting to install/try Ubuntu 11.10 on my new laptop, using a live CD (I also tried USB). I get the purple screen (with the man/keyboard at the bottom) and after that the screen flashes bright white before going black. Ubuntu continues to load in the background, with the login sound etc., but the screen is off. I have tried as many different solutions as I could find, including:
    - Using nomodeset, xforcevesa, and i915.modeset=0 in the boot options (separately): varying consequences, but either I end up at a blinking cursor with no prompt, a command line (startx fails: no screen found), or the original blank screen again.
    - Booting from VirtualBox: it crashes at the same place the screen would go blank when using a CD/USB.
    - Trying 11.04: I don't have this problem, BUT when trying to install I get a ubi-partman error 141 (possibly down to the three partitions that came on my laptop... not sure why HP needed their own separate partition for HP Tools...).
    Model: HP Pavilion DV6 6B08SA. Processor: AMD quad-core A6-3410MX APU with Radeon HD 6545G2 dual graphics (1.6 GHz, 4 MB L2 cache). Chipset: AMD RS880M. Any help would be greatly appreciated. I just want to be able to partition the drive and install Ubuntu. I'm assuming the issue is graphics-card related, although I have no confirmation of that. I have caught a glimpse of some output to do with pulseaudio and [fail], but I can't imagine why that would cause a screen problem, and the sound definitely works anyway.

    Read the article

  • Is there any Mac Pro hardware parts that can ONLY be purchased thru Apple?

    - by bigp
    I'd like to know whether there are any hardware parts I should include in a brand-new Mac Pro purchase, instead of trying to hunt them down from third-party vendors (or whitelisted vendors/hardware suppliers). The main components I'm interested in for upgradeability are:
    - Processors (if starting with two 2.4GHz quad-core "Westmere" CPUs)
    - RAM (if starting with the least possible, which seems to be 6 x 1GB)
    - Video cards (if starting with one ATI Radeon HD 5770, can a second one be purchased elsewhere?)
    - Hard drives (since these are mounted in specialized trays, if I'm not mistaken, are they also sold elsewhere? And can they be bought as SSDs?)
    - Power supply (do I need to be concerned about this at all, or does it auto-adjust depending on the component upgrades?)
    I just want to be sure that, by choosing a Mac Pro with lower component specifications, I can in fact purchase upgrade parts cheaper elsewhere. Thanks!

    Read the article

  • USB Ports In Wrong Mode, How To Use usb_modeswitch?

    - by user86872
    I haven't had access to my USB ports as media devices for a couple of days now. I've been reading and researching everything I can find, but I can't find a guide for usb_modeswitch (or usbms) that I can decipher. The USB ports are fine for power, but they won't support my Android phone as a media device, which is killing me because I use adb every day, and they no longer support my plug-and-play mouse. I'm not sure what caused the switch, though I think it may be related to the suspend issue I've read about, but the solutions in those threads also didn't work. Below is my system information and details.
    - System: Ubuntu 12.04, 64-bit, dedicated machine
    - Machine: HP Pavilion g6 notebook, AMD A6 quad-core processor
    - USBs used for: cooling dock, Android Debug Bridge, wireless mouse
    - Attempted: modprobe, udev restart; unable to attempt lsusb due to my own lack of knowledge. :)
    Last attempt readout:

        ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ sudo modprobe -r usbhid && sleep 5 && sudo modprobe usbhid
        ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ sudo modprobe -r usb-storage
        ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ sudo modprobe usb-storage
        ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ sudo restart udev
        udev start/running, process 2624
        ncandiano@ncandiano-HP-Pavilion-g6-Notebook-PC:~$ lsusb
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 002 Device 002: ID 0461:4de7 Primax Electronics, Ltd webcam

    Any help would be greatly appreciated!

    Read the article

  • How to improve Minecraft-esque voxel world performance?

    - by SomeXnaChump
    After playing Minecraft I marveled a bit at its large worlds, but at the same time I found them extremely slow to navigate, even with a quad-core and a meaty graphics card. Now I assume Minecraft is fairly slow because: A) it's written in Java, and as most of the spatial partitioning and memory management activities happen there, it would naturally be slower than a native C++ version; and B) it doesn't partition its world very well. I could be wrong on both assumptions; however, it got me thinking about the best way to manage large voxel worlds. As it is a true 3D world, where a block can exist in any part of the world, it is basically a big 3D array [x][y][z], where each block in the world has a type (i.e. BlockType.Empty = 0, BlockType.Dirt = 1, etc.). Now, I am assuming that to make this sort of world perform well you would need to:
    A) Use a tree of some variety (oct/kd/bsp) to split all the cubes out; an oct/kd tree seems like the better option, as you can partition on a per-cube level rather than a per-triangle level.
    B) Use some algorithm to work out which blocks can currently be seen, as blocks closer to the user can occlude the blocks behind them, making it pointless to render those.
    C) Keep the block objects themselves lightweight, so it is quick to add and remove them from the trees.
    I guess there is no right answer to this, but I would be interested to see people's opinions on the subject. How would you improve performance in a large voxel-based world?
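    For example, I picture the per-chunk part of (B) working roughly like the sketch below: a block face only needs drawing if the neighbouring cell is empty. The 16x16x16 chunk size and block types are arbitrary, and out-of-chunk neighbours are simply treated as empty here (a real engine would consult the neighbouring chunk):

        # Rough sketch of "only draw exposed faces" for one chunk.
        EMPTY, DIRT = 0, 1
        CHUNK = 16

        # blocks[x][y][z] -> block type for one chunk
        blocks = [[[EMPTY] * CHUNK for _ in range(CHUNK)] for _ in range(CHUNK)]

        NEIGHBOURS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

        def visible_faces(blocks):
            """Yield (x, y, z, direction) for every solid face exposed to air."""
            for x in range(CHUNK):
                for y in range(CHUNK):
                    for z in range(CHUNK):
                        if blocks[x][y][z] == EMPTY:
                            continue
                        for dx, dy, dz in NEIGHBOURS:
                            nx, ny, nz = x + dx, y + dy, z + dz
                            # Cells outside the chunk count as empty in this sketch.
                            outside = not (0 <= nx < CHUNK and 0 <= ny < CHUNK and 0 <= nz < CHUNK)
                            if outside or blocks[nx][ny][nz] == EMPTY:
                                yield x, y, z, (dx, dy, dz)

        # A single dirt block in an empty chunk exposes all 6 faces.
        blocks[8][8][8] = DIRT
        assert len(list(visible_faces(blocks))) == 6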

    Read the article

  • Do I really need Microsoft Updates?

    - by Tony Wong
    When I install a fresh copy of Windows XP Home (I bought it from the store, not a copy), my PC runs at lightning speed. But when I start installing all the updates and patches (less the .NET 4.0 Client, as that seems to bring the machine to a crawl), the PC starts to slow down, as if there are more resources to watch or something is happening in the background. So could I not get away with an awesome virus protector and an awesome firewall setup and avoid all the patches? The machine I have is a quad-core with 4 GB RAM and a 2.3 GHz processor. Tons of room, and the machine can run several applications at one time... but when the updates happen, it's s-l-o-w!

    Read the article
