Search Results

Search found 7756 results on 311 pages for 'hardware acceleration'.


  • Smart card driven membership and door entry system

    - by Rob G
    I'm looking at putting in a smart-card-driven system at my local sports club (which doesn't have oodles of money). Since they're willing to pay for hardware, and I'm willing to do the technical setup, I was wondering if anyone had any experience setting something like this up. Writing any software needed is not the problem; I've pretty much got that covered with various open source projects out there and custom code I'll write. It's more the hardware side I'm not too sure about, and I'm looking for advice from people out there. I'm sure there are numerous complications, but on the surface it looks fairly simple. I'd basically like to enable members to swipe/touch a smart card at the door to gain entry to the club, then walk up to a touch screen PC and swipe/touch a card reader there to "login" to the system I create, which will allow them to book club facilities etc. I may even want that same card to then activate things like lights or music when they enter the room they've booked. Pretty utopian, I know, but still, we'd like to get as close as we can. As I said, the software shouldn't be a problem, and on the hardware side, so far I'm looking at:
    - An all-in-one touch screen PC running Windows 7 or Ubuntu
    - A USB card reader (not sure which one to buy)
    - Smart cards (again, never bought these before)
    - Door/lighting hardware that could be triggered (not sure here either)
    If anyone has any advice on implementing something like this - especially the items I'm not sure about above, and of course anything I've missed out that's crucial - I'd be most grateful. Recommended hardware that you've used for something like this would be fantastic!
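
    On the reader side, most USB smart card readers speak PC/SC, so the "login" step can be as small as reading the card's UID and looking it up in a membership database. A minimal sketch, assuming the pyscard library and a PC/SC-compatible contactless reader (the APDU shown is the common PC/SC pseudo-command for fetching a UID, not something tied to any particular product):

        # Minimal sketch: read a contactless card's UID with pyscard (pip install pyscard).
        # Assumes a PC/SC-compatible USB reader; FF CA 00 00 00 is the standard
        # PC/SC pseudo-APDU that most contactless readers answer with the card UID.
        from smartcard.System import readers
        from smartcard.util import toHexString

        GET_UID = [0xFF, 0xCA, 0x00, 0x00, 0x00]

        def read_card_uid():
            reader_list = readers()
            if not reader_list:
                raise RuntimeError("No PC/SC reader found")
            connection = reader_list[0].createConnection()
            connection.connect()                    # raises if no card is on the reader
            data, sw1, sw2 = connection.transmit(GET_UID)
            if (sw1, sw2) != (0x90, 0x00):          # 90 00 = success in ISO 7816
                raise RuntimeError("Reader returned %02X %02X" % (sw1, sw2))
            return toHexString(data)                # use as the membership lookup key

        if __name__ == "__main__":
            print("Card UID:", read_card_uid())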

    Read the article

  • Moonlight 4 beta closes the gap with Silverlight 4: the open-source implementation adds hardware acceleration and H.264 support

    The Moonlight 4 beta closes the gap with Silverlight 4. The open-source implementation now offers hardware acceleration and H.264 support. Moonlight 4 has just been released in beta. The open-source implementation of Silverlight now provides hardware acceleration (GPU handling of video and 3D) as well as support for the H.264 codec. With this development version, Moonlight picks up several Silverlight 4 features, notably support for the Silverlight 3 and 4 APIs. It also makes it possible to build and run applications "out of the browser". However, this beta does not offer all the functi...

    Read the article

  • Who makes laptops for Ubuntu?

    - by Tim Lytle
    I'm looking for a laptop and would like to avoid the whole 'is this [specific configuration of hardware] compatible with Ubuntu?' process by finding a laptop manufactured with Ubuntu in mind. I know of System76, but are there any other manufacturers making laptops built to run a standard build of Ubuntu? I'm not counting Dell, as, from my experience, their 'Ubuntu' laptops/netbooks require Dell's own build, and because of that have their own set of compatibility issues. UPDATE: As mentioned in the comments, Dell is no longer selling systems with Ubuntu to consumers.

    Read the article

  • Wireless problems on HP

    - by Sat93
    I'm not able to enable wireless using the hardware switch on my HP ProBook 4430s. Because of this, the Enable Wireless option is greyed out and I cannot enable it. The greyed-out option can be seen in the screenshot below. The results of iwconfig for my system are as follows:
    lo      no wireless extensions.
    wlan0   IEEE 802.11bgn  ESSID:off/any  Mode:Managed  Access Point: Not-Associated  Tx-Power=off  Retry long limit:7  RTS thr:off  Fragment thr:off  Power Management:off
    eth0    no wireless extensions.
    I also tried sudo ifconfig wlan0 up, but I got an error: SIOCSIFFLAGS: Operation not possible due to RF-kill. The result of sudo rfkill list all for my system is as follows:
    0: phy0: Wireless LAN - Soft blocked: no, Hard blocked: yes
    1: hp-wifi: Wireless LAN - Soft blocked: no, Hard blocked: no
    2: hp-bluetooth: Bluetooth - Soft blocked: no, Hard blocked: no
    3: hci0: Bluetooth - Soft blocked: no, Hard blocked: no
    How do I fix this problem? Thank you!
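
    The rfkill output is the key detail: phy0 is hard blocked, which means the radio is being cut at the firmware/platform level (the hp-wifi switch), not by the OS, so no amount of ifconfig will bring it up. As a diagnostic aid, here is a minimal sketch that reads the same state rfkill list reports, assuming only the standard Linux /sys/class/rfkill interface:

        # Minimal sketch: print soft/hard block state for every rfkill switch by
        # reading the standard Linux sysfs interface (/sys/class/rfkill).
        # A hard block can only be cleared by the hardware switch or firmware.
        import os

        RFKILL = "/sys/class/rfkill"

        def read(path):
            with open(path) as f:
                return f.read().strip()

        for switch in sorted(os.listdir(RFKILL)):
            base = os.path.join(RFKILL, switch)
            name = read(os.path.join(base, "name"))   # e.g. phy0, hp-wifi
            soft = read(os.path.join(base, "soft"))   # "1" = soft blocked
            hard = read(os.path.join(base, "hard"))   # "1" = hard blocked
            print("%s: %s soft=%s hard=%s" % (switch, name, soft, hard))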

    Read the article

  • CSS3: GPU hardware acceleration for filters on Chromium; the W3C gives the green light to dropping vendor prefixes

    CSS3: GPU hardware acceleration for filters on Chromium. The W3C gives the green light to dropping vendor prefixes. Do two or three CSS3 filters freeze the screen on the latest stable version of your ultra-modern browser? And do they spin up your computer's fans as soon as they are combined with a pinch of transitions? Are you telling yourself that Flash still has some good days ahead of it? That is starting to change! At least for users of Google Chrome and its Webkit rendering engine. The official blog of Chromium (the free software project from which Chrome is directly derived) announces the integration of GPU hardware accelerati...

    Read the article

  • Ideal laptop specs for a Computer Science Masters student? [closed]

    - by Ayush
    I have an HP Pavilion with a 2 GHz Core 2 Duo and 4 GB RAM, and it is painful to use this machine for any kind of coding. Eclipse (especially Juno) literally takes 5 minutes to load, and even after that, everything is laggy. Apart from school stuff, I also use my computer as a television. I watch Hulu, Netflix, YouTube etc. in 720p, and this laptop gets hot as hell and the fans are loud enough to wake somebody up from deep sleep. I DON'T use my laptop for gaming or video/photo editing. I'm looking to buy a new laptop (on which most widely used IDEs would work smoothly and playing hi-def videos wouldn't be too much for the machine to handle). Any suggestions (on hardware specs) would be greatly appreciated. Thanks

    Read the article

  • What is the history of why bytes are eight bits?

    - by DarenW
    What were the historical forces at work, and the tradeoffs to be made, in deciding to use groups of eight bits as the fundamental unit? There were machines, once upon a time, using other word sizes, but today for non-eight-bitness you must look to museum pieces, specialized chips for embedded applications, and DSPs. How did the byte evolve out of the chaos and creativity of the early days of computer design? I can imagine that fewer bits would be ineffective for handling enough data to make computing feasible, while too many would have led to expensive hardware. Were other influences in play? Why did these forces balance out to eight bits? (BTW, if I could time travel, I'd go back to when the "byte" was declared to be 8 bits, and convince everyone to make it 12 bits, bribing them with some early 21st Century trinkets.)

    Read the article

  • Can't start Ubuntu 11.10, stops at login screen!

    - by Martinpizza
    I have been trying to dual boot Ubuntu with Windows 7 via Wubi on my custom-built PC, but without success. When I start the computer I can choose Windows or Ubuntu; I choose Ubuntu, and when I should get to the login screen the screen just stays purple/pink. I tried safe mode but can't get in there either. I have tried reinstalling, but that did not work either. :( The second time I installed Ubuntu, the screen was purple/pink and the image was cut off so the left side was on the right side and the right side on the left (hard to describe). The third time installing (the last time), it is just like the first time. Please help! I can't get anywhere without help; I am kinda new at Ubuntu and its cryptic commands. I had Ubuntu on my old computer. I think the problem is my hardware, so here are my computer specs:
    - AMD FX-6100
    - AMD HIS Radeon HD 6950 IceQ X
    - Asus Sabertooth 990FX
    - 1TB hard drive (I have no idea of its name)
    Please help me! Sorry for my bad English! :D

    Read the article

  • How many textures can I usually bind at once?

    - by Avi
    I'm developing a game engine, and it's only going to work on modern (Shader model 4+) hardware. I figure that, by the time I'm done with it, that won't be such an unreasonable requirement. My question is: how many textures can I bind at once on a modern graphics card? 16 would be sufficient. Can I expect most modern graphics cards to support that amount? My GTX 460 appears to support 32, but I have no idea if that's representative of most modern video cards.
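
    Rather than relying on one card's behavior, the limit can be queried at startup and used to size the engine's texture-binding code. A minimal sketch, assuming PyOpenGL plus the glfw bindings purely to obtain a context (any context-creation library would do); Shader Model 4-class hardware (OpenGL 3.x) is required to report at least 16 units per shader stage:

        # Minimal sketch: query per-stage texture unit limits instead of hard-coding.
        # Assumes PyOpenGL and pyGLFW (pip install PyOpenGL glfw); a current GL
        # context is required before glGetIntegerv may legally be called.
        import glfw
        from OpenGL.GL import (glGetIntegerv,
                               GL_MAX_TEXTURE_IMAGE_UNITS,
                               GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS)

        if not glfw.init():
            raise RuntimeError("glfw init failed")
        glfw.window_hint(glfw.VISIBLE, glfw.FALSE)   # hidden window, context only
        window = glfw.create_window(1, 1, "probe", None, None)
        glfw.make_context_current(window)

        # OpenGL 3.x guarantees a minimum of 16 for the fragment stage.
        print("fragment stage:", glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS))
        print("all stages combined:", glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS))
        glfw.terminate()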

    Read the article

  • How do you ensure consistent experience across multiple graphics cards (or even driver versions)?

    - by Grigory Javadyan
    So I was writing a simple 2D game with OpenGL and SDL and had this problem where there was awful tearing when running in windowed mode (even though I explicitly asked SDL_SetVideoMode to use double buffering). I didn't worry about it all too much because most of the time the game grabs the entire screen; windowed mode is just for debugging. Anyway, yesterday I updated my nVidia drivers and the tearing disappeared; the game runs smooth and looks nice in windowed mode too. I can see how the problem may be in the graphics driver, but this leads to a question. Obviously, professional game developers have to deal with a lot of different hardware/software configurations. What are the techniques they use to make sure the game looks roughly the same on different graphics cards, or even on the same model of graphics card but with different driver versions?

    Read the article

  • HP mini 210 touchpad issues

    - by user4041
    I need help getting the right mouse click to work on an HP mini 210-1015TU when using the touchpad. If I plug in a USB mouse, both left click and right click function as normal. Using the touchpad, however, I can only get the left click to work; attempting to right click gives the result expected from a left click. As per some comments on a forum, I added a file 11-touchpad.conf to /usr/share/X11/xorg.conf.d (I can provide further details if required). This made touchpad operation noticeably smoother, but the problem with the right mouse click remains. It is not a hardware problem, as right click worked with 10.04 and still works with Windows 7 Starter. 10.10 was installed using Wubi.
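
    For reference, a guess at the kind of 11-touchpad.conf the forum posts describe; the option names come from the X.Org synaptics driver, but the values here are assumptions for illustration, not a known-good configuration for this model:

        # /usr/share/X11/xorg.conf.d/11-touchpad.conf (hypothetical sketch)
        Section "InputClass"
            Identifier "touchpad"
            MatchIsTouchpad "on"
            Driver "synaptics"
            Option "TapButton1" "1"   # one-finger tap = left click
            Option "TapButton2" "3"   # two-finger tap = right click
            Option "TapButton3" "2"   # three-finger tap = middle click
        EndSection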

    Read the article

  • Binding hotkey for toggling wireless card state - System76 Bonp3

    - by user109076
    Recently, the wireless card shut off on my laptop (System76 Bonp3 running Ubuntu 12.04 LTS). A while back, my keyboard had a meeting with a cup of coffee, and my F11 key no longer works. The key binding for turning my card back on happens to be Fn + F11, so I cannot turn my wifi on. The only solution I can think of is to somehow bind this to another key. I'm looking for the script that handles this particular hardware switch so I can bind it elsewhere.
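
    The switch state lives in the kernel's rfkill subsystem rather than in a user-editable script, so the practical workaround is to bind a new custom shortcut (System Settings > Keyboard > Shortcuts) to a command that toggles the block. A minimal sketch, assuming the rfkill command-line tool that ships with stock Ubuntu; note this can only clear a soft block, not a hard block asserted by firmware:

        # Minimal sketch: toggle the wifi soft block via the rfkill CLI, suitable
        # for attaching to a custom keyboard shortcut. Only soft blocks can be
        # cleared this way; a firmware hard block needs the real hardware switch.
        import subprocess

        def wifi_soft_blocked():
            out = subprocess.check_output(["rfkill", "list", "wifi"],
                                          universal_newlines=True)
            return "Soft blocked: yes" in out

        if wifi_soft_blocked():
            subprocess.call(["rfkill", "unblock", "wifi"])
        else:
            subprocess.call(["rfkill", "block", "wifi"])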

    Read the article

  • Hyper-V Server 2012 with Zambezi AMD FX-Series - Hardware assisted virtualization not present

    - by Vazgen
    I'm trying to set up VDI across Windows Server 2012 VMs running on Hyper-V Server 2012. The wizard's compatibility check for the Virtualization Host server failed with "Hardware-assisted virtualization is not present on the server". I'm running an FX-8120 CPU and have the ASUS M5A97 motherboard. I know I'm supposed to enable No-Execute (see "Hyper-V Hardware Considerations"), but I cannot find that or any synonym of it in my motherboard's UEFI BIOS (NX, XD, EVP, XN... nothing). I found the "PAE/NX/SSE2 Support Requirement Guide for Windows 8", which in short says "Windows 8 and Windows Server 2012 requires that systems must have processors that support NX, and NX must be turned on for important security safeguards to function effectively and avoid potential security vulnerabilities." This leads me to believe NX is on by default, given that I was able to get this far and install Hyper-V 2012 and Windows Server 2012. I also tried to disable AVX in cmd with "bcdedit /set xsavedisable 1"; that did not resolve it. My processor is the Zambezi FX-8120, which also supports RVI/SLAT (see the Newegg listing for the FX-8120, and "AMD Processors with Rapid Virtualization Indexing Required to Run Hyper-V in Windows 8" as support proof). What's going on here? I bought this CPU specifically after I had the same problems with an older AMD Athlon II, and made sure to buy one with AMD-V and RVI. Thank you
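
    Before digging further into the BIOS, it can help to let Windows report what it actually sees. Below is a minimal sketch that dumps the "Hyper-V Requirements" section of systeminfo, which Windows 8/Server 2012 print only while the Hyper-V role is not yet installed; Sysinternals' Coreinfo (coreinfo -v) gives a second opinion on SVM and NPT/SLAT:

        # Minimal sketch: print the "Hyper-V Requirements" block from systeminfo.
        # Windows only shows this section when the Hyper-V role is not installed;
        # it reports VM monitor extensions, SLAT, and DEP/NX as Windows sees them.
        import subprocess

        out = subprocess.check_output(["systeminfo"], universal_newlines=True)
        in_section = False
        for line in out.splitlines():
            if "Hyper-V Requirements" in line:
                in_section = True
            if in_section:
                print(line.rstrip())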

    Read the article

  • Can a Cisco ASA act as a Hardware Security Module?

    - by Derek
    Hello, we have a partner that is requiring us to get an HSM for a web application that we host for them. This is something new for us; we've always installed our SSL certificates on our web servers and never needed a hardware device. We currently have 2 Cisco ASA 5510 firewalls in an active/standby configuration. Both ASAs have an ASA-SSM-10 security module installed in them. The web application is a standard HTTPS web page with no authentication required. I was wondering if we could use our Cisco ASAs to meet this requirement, or if we'll have to buy another device. I was doing some searching and read about Cisco's clientless WebVPN feature. It sounds like it might work, but I'm not sure. We basically want the ASA to handle the SSL and proxy the connection to our web servers. We do not want to prompt for a username or password to connect or show any portals, just display the web page. If the ASA cannot do this, does anyone have any recommendations for network-attached hardware security modules? We are using VMware vCenter, so we'd rather have an external device attached to the network than buy HSM cards for every ESXi host. Thanks, Derek

    Read the article

  • Virtualization and best hardware sharing scenario for me

    - by azera
    Hello, following this thread on Super User, I now want to start installing all my VMs on the hardware. As a reminder, I have a (powerful enough) server on which I want to install 3 OSes: a Debian (general dev testbed purposes), an IPCop (network control/firewall) and a FreeNAS (local network file sharing). I'm wondering which scenario would be the best for me and whether I will be able to share the hardware to do what I want. Either:
    a - install a hypervisor like the free VMware ESX and run all three VMs in it, or
    b - install Debian, with the other two running inside it under VirtualBox
    My needs are:
    - the IPCop should handle all network traffic to the internet, meaning all traffic from my main computer but also all traffic from the other two VMs
    - the FreeNAS shares should be accessible from the other two VMs and from my main computer too
    - I don't really care about the Debian access; I only need to reach it from my main computer, not from the other VMs
    Will I need to install additional network cards for each VM, or can they all share the same one happily? (Right now I have two: one linking the server to my router, which only IPCop is going to use, and one linking it to my switch, which I would like all three to use.) As for hard drives, I was going to use 1 hard drive cut into 3 partitions to install all three OSes, then add the FreeNAS drives to that. Does that sound right? Thanks a lot to anyone who can help me; this is kind of a vast area and I'm not sure which way to go at all.
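
    For scenario (b), the NIC-sharing question mostly comes down to attaching each guest's virtual NIC in bridged mode, so all three VMs appear as separate machines on the switch without extra cards. A minimal sketch of the VBoxManage calls involved; the VM names and interface names are assumptions for illustration, and the IPCop guest gets a second NIC bridged to the router-facing card so it can sit between the LAN and the internet:

        # Minimal sketch: bridge VirtualBox guests onto the host's NICs.
        # VM names, "eth0" (router-facing) and "eth1" (switch-facing) are
        # illustrative assumptions, not values from the original post.
        import subprocess

        LAN_IF, WAN_IF = "eth1", "eth0"

        for vm in ["debian-dev", "freenas", "ipcop"]:   # hypothetical VM names
            subprocess.call(["VBoxManage", "modifyvm", vm,
                             "--nic1", "bridged", "--bridgeadapter1", LAN_IF])

        # Only the firewall VM also faces the router.
        subprocess.call(["VBoxManage", "modifyvm", "ipcop",
                         "--nic2", "bridged", "--bridgeadapter2", WAN_IF])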

    Read the article

  • Linksys wireless router will not hardware reset.

    - by Jack M.
    Hello, all. I'm unable to make my router perform a hardware reset, and I cannot understand why. All was working well, except that my iPhone could not connect to the wireless. I found that the router was only allowing AES encryption in WPA2 Personal mode, so I upgraded the firmware to Ver.1.06.1, and everything went screwy. The router is no longer showing up in the WiFi list (as Linksys, or under its previous network name). Wiring into the router gives me an IP address from my ISP (24.121.121.XXX). I am attempting to do a hardware reset, but the power light never starts flashing and the router does not seem to reboot; my wired machine stays online with no interruption in WoW. Pulling the power cord to force a reset returns it to the same state. I even went so far as to pull up my previous IP address (from DynDNS) and try to connect to that, but it won't even ping. What I'm trying to find out is: did the new firmware fry the thing, or is there some way to fix this? Thanks in advance for any help.

    Read the article

  • Hardware needed to route between two networks over wireless

    - by AptDweller
    I recently rented an apartment about 100 yards from my brother's house. I have line of sight to his house and can pick up his home AP signal with one of my two laptops if I go out on my balcony (facing his house) or put the laptop by the window. The other laptop will sometimes see the SSID broadcast, but fails to connect, drops, etc. We would like to set up a persistent wireless connection between our homes. We would prefer that each network be logically segmented as an independent network, but he will share his internet connection. I've got a bunch of TV shows saved to a NAS by my TiVo that I'd like to make available to him across the wireless link. My brother strongly prefers not to mess with his WAP at all; his network is running fine and he is afraid to mess it up. I guess you could say he is "technologically declined". If we can get a reliable 11Mbps connection we will be satisfied. What hardware do I need to make this work? I was thinking a router with two wireless interfaces (external antennas), a wired interface, and a directional antenna mounted on my balcony facing his house. Can anyone recommend hardware to make this happen? Cheaper is better; I'll only be living in the area a year or two. I do have an old satellite TV antenna, if that can be used to direct the signal.

    Read the article

  • Software/hardware to build video streaming server?

    - by Sasha Yanovets
    I am looking for a video streaming server solution, something like an online TV server, with the ability to make live broadcasts on the internet. What software could you recommend for that? What kind of hardware should it run on, and does anything about it need to be special? I am looking for a solution that could be scaled up to at least 1000 simultaneous users online with good video resolution. I think it is good to have a general answer on what direction to choose, but here are more details on my specific case: I'm just looking for a solution almost from scratch. We have some video content that we've produced, but it is not delivered over the internet yet. We are not tied to any particular vendor for now. We want to stream 24 hours a day: three 8-hour blocks, with the content changing every day. We want the ability to make regular live broadcasts. I guess we will need several streaming quality options (low: ~56 kb/s, mid: ~273 kb/s). Some terms are just foreign to me (like play-truncation rate); if you could point out what parameters we should be aware of, it would be great. The uplink to the internet is to be determined; we plan to start from something and scale up along the way. If you already have some kind of media streaming server, just describe its configuration here (hardware, OS, software) and the peak number of concurrent users it serves. I think it could help people approaching this task.
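
    On the software side, a common open-source building block is ffmpeg encoding the source once per quality level and pushing RTMP to a media server (nginx-rtmp, Wowza, and similar) that fans the stream out to viewers. A minimal sketch matching the two bitrates mentioned above; the ingest URL and file name are placeholders, and it assumes an ffmpeg build with libx264 and AAC support:

        # Minimal sketch: push one source file to an RTMP ingest at two bitrates
        # (~56 kb/s and ~273 kb/s video). URL and file name are placeholders.
        import subprocess

        SOURCE = "program_block.mp4"                  # hypothetical 8-hour block
        INGEST = "rtmp://streaming.example.com/live"  # placeholder ingest URL

        for name, vbitrate in [("low", "56k"), ("mid", "273k")]:
            subprocess.Popen([
                "ffmpeg", "-re", "-i", SOURCE,        # -re: read at native rate
                "-c:v", "libx264", "-b:v", vbitrate,
                "-c:a", "aac", "-b:a", "32k",
                "-f", "flv", "%s/%s" % (INGEST, name),
            ])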

    Read the article

  • Upgrading SQL Server 2008 hardware

    - by John
    Forgive me if I'm not able to be totally clear here. It is not intentional; I'm a senior-level developer in a very small company, having to act like a manager at the moment. Anyway, the story is that we have 2 older Dell servers with SQL Server 2008 Standard in a "cluster". I put that in quotes because I'm still not 100% clear on what that means. We have 2 brand new blade servers and want to move the existing databases to the new hardware. OK, so here is the gotcha: we need to do this with little or no downtime. I'm being told that we can evict the passive node, then pull in one of the new servers. But I'm also being told that this is a dangerous step, because something could go wrong that would cause the cluster to fail, and then we would be left with nothing because the active server would not be able to come back up. Does anyone have any thoughts on how to handle this? I'm being told that the only way to ensure success is to have at least a day of downtime where we bring up a new cluster on the new hardware and then migrate the databases one by one.

    Read the article

  • Developer hardware autonomy in a managed desktop environment [closed]

    - by Troy Hunt
    I'm looking for some feedback on how developer PCs are managed within environments that have a strict managed desktop policy (normally large corporations). For example, many corporate environments control the installation of software and the deployment of patches and virus updates through a centralised channel. This usually means also dictating the OS version and architecture (32-bit versus 64-bit), which will likely also mean standardised hardware configurations. I'm particularly interested in feedback from developers who work in this sort of environment but have a high degree of autonomy over their machines. This might mean choosing your own hardware vendor, OS type and version, and perhaps how the machines are built and maintained. I have several specific questions:
    - How do you satisfy the needs of security, governance etc. whilst maintaining your autonomy? For example, how do you address concerns about keeping virus definitions and OS patches up to date?
    - Do you have a process for gaining exemption from standard desktop builds and, if so, what do you need to demonstrate in order to get this?
    - How have you justified this need to the decision makers? Essentially, what is the benefit to your role as a developer of having this degree of autonomy?
    Thanks very much everyone. Update: There's a great post from Jean-Paul Boodhoo which addresses the developer tool component of the question here: http://blog.jpboodhoo.com/TheFallacyOfTheStandardizedDeveloperMachineimage.aspx

    Read the article

  • Half of installed RAM is hardware reserved

    - by user968270
    After a rather arduous and convoluted series of problems that left me without a desktop for ~80 days, I've finally got the thing up and running, having replaced the power supply, motherboard, graphics card and CPU. Now, however, I'm experiencing the 'hardware reserved RAM' issue. Perhaps this is the exhaustion talking, but looking at the question that tends to get pointed to when this kind of topic gets locked as a duplicate hasn't helped. I have 16 GB of RAM installed in an MSi 970A-G46, which is spec'd for up to 32 GB of RAM. The BIOS recognizes that I have 16 GB installed, and the Resource Monitor also shows the whole 16 GB, only it shows 8 GB as hardware reserved. I've seen suggestions that it's an OS issue, but the particular installation of Windows 7 (64-bit) on my boot drive is the same one that could actually access the 16 GB on my previous motherboard (MSi 870A-G54). I've updated my BIOS using the MSi Live Update tool and restarted the machine with no effect, and I cannot seem to locate any 'Memory Remapping' option as I've seen mentioned. I've physically swapped the RAM between the slots to no effect. I've unchecked the Maximum Memory box in the msconfig Boot tab's advanced options, also to no effect. These are my system's basic specifications:
    - OS: Windows 7 Home Premium (64-bit)
    - Motherboard: MSi 970A-G46
    - CPU: AMD FX-8150
    - Graphics card: XFX Radeon HD 6870
    - Boot drive: OCZ Agility 3
    - Storage drive: Samsung Spinpoint F3 ST1000DM005/HD103SJ 1TB
    - PSU: Thermaltake TR-2 TR600 600W ATX12V v2.3
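
    For anyone wanting to see the gap in numbers rather than in Resource Monitor's graph, here is a minimal sketch (Windows only, using two documented kernel32 calls) that compares physically installed RAM against what the OS can actually address; the difference is what gets reported as hardware reserved:

        # Minimal sketch: installed RAM vs. RAM visible to Windows, via two
        # documented kernel32 calls. The difference is "hardware reserved".
        import ctypes
        from ctypes import wintypes

        kernel32 = ctypes.windll.kernel32

        installed_kb = ctypes.c_ulonglong()
        kernel32.GetPhysicallyInstalledSystemMemory(ctypes.byref(installed_kb))

        class MEMORYSTATUSEX(ctypes.Structure):
            _fields_ = [("dwLength", wintypes.DWORD),
                        ("dwMemoryLoad", wintypes.DWORD),
                        ("ullTotalPhys", ctypes.c_ulonglong),
                        ("ullAvailPhys", ctypes.c_ulonglong),
                        ("ullTotalPageFile", ctypes.c_ulonglong),
                        ("ullAvailPageFile", ctypes.c_ulonglong),
                        ("ullTotalVirtual", ctypes.c_ulonglong),
                        ("ullAvailVirtual", ctypes.c_ulonglong),
                        ("ullAvailExtendedVirtual", ctypes.c_ulonglong)]

        status = MEMORYSTATUSEX()
        status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
        kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

        installed_mb = installed_kb.value // 1024
        visible_mb = status.ullTotalPhys // (1024 * 1024)
        print("installed: %d MB, visible: %d MB, hardware reserved: %d MB"
              % (installed_mb, visible_mb, installed_mb - visible_mb))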

    Read the article

  • Are Chromebooks the New Netbooks, and What Does That Mean?

    - by Chris Hoffman
    Netbooks — small, cheap, slow laptops — were once very popular. They fell out of favor — people bought them because they seemed cheap and portable, but the actual experience was lackluster. Most netbooks now sit unused. Windows netbooks have vanished from stores today, but there’s a new super-cheap laptop — the Chromebook. Chromebook sales numbers are impressive, but their usage statistics tell a different story. Are Chromebooks just the new netbook?

    The Problem With Netbooks
    Netbooks seemed appealing, especially in an age before tablets and lightweight ultrabooks. You could buy a netbook for $200 or so and have a portable device that let you get on the Internet. The name “netbook” spelled that out — it was a portable device for getting on the ‘net. They weren’t really that great. The original netbook was a lightweight Asus Eee PC that ran Linux alone and had a small amount of fast flash storage. Netbooks eventually ran heavier Windows XP operating systems — Windows Vista was out, but it was just too bloated to run on netbooks. Manufacturers added slow magnetic hard drives, bloatware, and even DVD drives! They couldn’t run most Windows software very well. The build quality was poor and their keyboards were tiny and cramped. People liked the idea of a lightweight device that let them get on the Internet and loved the cheap price, but the actual experience wasn’t great.

    Chromebook Sales
    Chromebook sales numbers seem surprisingly high. NPD reported that Chromebooks were 21% of all notebooks sold in the US in 2013. If you combine laptop and tablet sales into a single statistic, Chromebooks were 9.6% of all those devices sold. That’s 2/3 as many Chromebooks sold as iPads in the US! Of Amazon’s best-selling laptop computers, two of the top three are Chromebooks. These definitely look like successful products. Unlike netbooks, Chromebooks are taking off in a big way in the education market. Many schools are buying Chromebooks for their students instead of more expensive Windows laptops. They’re easier to manage and lock down than Windows laptops, but — more importantly for cash-strapped schools — they’re very cheap. Netbooks never had this sort of momentum in schools.

    Chromebook Usage Statistics
    Here’s where the rosy picture of Chromebooks starts to become more realistic. StatCounter’s browser usage statistics show how widely used different operating systems are. For example, Windows 7 has the highest share with 35.71% of web activity in April, 2014. The chart doesn’t even show Chrome OS at all, although there is an “Other” number near the bottom. Click the Download Data link to download a CSV file and we can view more detailed information. Chrome OS only accounted for 0.38% of web usage in April, 2014. Desktop Linux, which people often shrug at, accounted for 1.52% in the same month. To its credit, Chrome OS usage has increased. Chromebooks were widely mocked back in November, 2013 when the sales numbers came out. After all, they only accounted for 0.11% of web usage globally in November, 2013! But Chrome OS numbers have been improving:
    - Nov, 2013: 0.11%
    - Dec, 2013: 0.22%
    - Jan, 2014: 0.31%
    - Feb, 2014: 0.35%
    - Mar, 2014: 0.36%
    - Apr, 2014: 0.38%
    Chrome OS is climbing, but it’s definitely still in the “Other” category. It isn’t as high as we’d expect to see it with those types of sales numbers.

    Chromebooks vs. Netbooks
    Chromebooks are more limited devices than traditional PCs. You can do quite a few things, but you have to do it all using Chrome or Chrome apps. Most people won’t be enabling developer mode and installing a Linux desktop. You don’t have access to the powerful desktop software available for Windows and even Mac OS X. On the other hand, these Chromebooks are less compromised than netbooks in many ways. They come with a lightweight operating system designed for portable, mobile devices. They don’t come packed with any bloatware, like the bloatware you’ll find on competing Windows PCs and the original netbooks. They’re cheaper because the manufacturer doesn’t have to pay for a Windows license. There’s no need for antivirus software weighing the operating system down. They’re larger than the original netbooks, with many of them being 11.6-inches instead of the original 8-inch bodies many older netbooks came with. They have larger, more comfortable keyboards and fast solid-state storage. Really, Chromebooks are what netbooks wanted to be. People didn’t buy netbooks to use typical Windows software — they just wanted a lightweight PC. Of course, for many people, the real successor to netbooks is tablets. If all you want is a portable device to throw in a bag so you can get online, maybe a tablet is better.

    Where Does This Leave Chromebooks?
    So, are Chromebooks the new netbooks? It’s a bit early to answer that question. Chromebooks are definitely not out of the competition — their sales look good and their usage share is increasing. On the other hand, Chrome OS is still pretty far behind. They’re not catching fire like tablets did. Maybe netbooks were just before their time and Chromebooks were what they were always meant to be. Just as Microsoft’s Windows XP tablets failed, Windows XP netbooks also failed. Tablets took off with a more refined operating system on better hardware years later. “Netbooks” — or Chromebooks — are now taking off with a more purpose-built operating system on better hardware, too. It’s hard to count Chromebooks out because they provide a much better experience than netbooks ever did. If you’re one of the people who wants to use old Windows desktop apps on your portable laptop, you may think netbooks were better — but most people don’t want that. But maybe people either want a full desktop PC experience or a full mobile tablet experience. Is there a place for a laptop with a keyboard that can only view websites? We’ll have to wait and see.
    Image Credit: Kevin Jarret on Flickr, Clive Darra on Flickr, Sean Freese on Flickr

    Read the article

  • Why Is Vertical Monitor Resolution so Often a Multiple of 360?

    - by Jason Fitzpatrick
    Stare at a list of monitor resolutions long enough and you might notice a pattern: many of the vertical resolutions, especially those of gaming or multimedia displays, are multiples of 360 (720, 1080, 1440, etc.) But why exactly is this the case? Is it arbitrary or is there something more at work? Today’s Question & Answer session comes to us courtesy of SuperUser — a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    The Question
    SuperUser reader Trojandestroy recently noticed something about his display interface and needs answers: YouTube recently added 1440p functionality, and for the first time I realized that all (most?) vertical resolutions are multiples of 360. Is this just because the smallest common resolution is 480×360, and it’s convenient to use multiples? (Not doubting that multiples are convenient.) And/or was that the first viewable/conveniently sized resolution, so hardware (TVs, monitors, etc.) grew with 360 in mind? Taking it further, why not have a square resolution? Or something else unusual? (Assuming it’s usual enough that it’s viewable). Is it merely a pleasing-the-eye situation? So why have the display be a multiple of 360?

    The Answer
    SuperUser contributor User26129 offers us not just an answer as to why the numerical pattern exists but a history of screen design in the process:

    Alright, there are a couple of questions and a lot of factors here. Resolutions are a really interesting field of psychooptics meeting marketing.

    First of all, why are the vertical resolutions on youtube multiples of 360. This is of course just arbitrary, there is no real reason this is the case. The reason is that resolution here is not the limiting factor for Youtube videos – bandwidth is. Youtube has to re-encode every video that is uploaded a couple of times, and tries to use as little re-encoding formats/bitrates/resolutions as possible to cover all the different use cases. For low-res mobile devices they have 360×240, for higher res mobile there’s 480p, and for the computer crowd there is 360p for 2xISDN/multiuser landlines, 720p for DSL and 1080p for higher speed internet. For a while there were some other codecs than h.264, but these are slowly being phased out with h.264 having essentially ‘won’ the format war and all computers being outfitted with hardware codecs for this.

    Now, there is some interesting psychooptics going on as well. As I said: resolution isn’t everything. 720p with really strong compression can and will look worse than 240p at a very high bitrate. But on the other side of the spectrum: throwing more bits at a certain resolution doesn’t magically make it better beyond some point. There is an optimum here, which of course depends on both resolution and codec. In general: the optimal bitrate is actually proportional to the resolution.

    So the next question is: what kind of resolution steps make sense? Apparently, people need about a 2x increase in resolution to really see (and prefer) a marked difference. Anything less than that and many people will simply not bother with the higher bitrates, they’d rather use their bandwidth for other stuff. This has been researched quite a long time ago and is the big reason why we went from 720×576 (415kpix) to 1280×720 (922kpix), and then again from 1280×720 to 1920×1080 (2MP). Stuff in between is not a viable optimization target. And again, 1440p is about 3.7MP, another ~2x increase over HD. You will see a difference there. 4K is the next step after that.

    Next up is that magical number of 360 vertical pixels. Actually, the magic number is 120 or 128. All resolutions are some kind of multiple of 120 pixels nowadays, back in the day they used to be multiples of 128. This is something that just grew out of LCD panel industry. LCD panels use what are called line drivers, little chips that sit on the sides of your LCD screen that control how bright each subpixel is. Because historically, for reasons I don’t really know for sure, probably memory constraints, these multiple-of-128 or multiple-of-120 resolutions already existed, the industry standard line drivers became drivers with 360 line outputs (1 per subpixel). If you would tear down your 1920×1080 screen, I would be putting money on there being 16 line drivers on the top/bottom and 9 on one of the sides. Oh hey, that’s 16:9. Guess how obvious that resolution choice was back when 16:9 was ‘invented’.

    Then there’s the issue of aspect ratio. This is really a completely different field of psychology, but it boils down to: historically, people have believed and measured that we have a sort of wide-screen view of the world. Naturally, people believed that the most natural representation of data on a screen would be in a wide-screen view, and this is where the great anamorphic revolution of the ’60s came from when films were shot in ever wider aspect ratios. Since then, this kind of knowledge has been refined and mostly debunked. Yes, we do have a wide-angle view, but the area where we can actually see sharply – the center of our vision – is fairly round. Slightly elliptical and squashed, but not really more than about 4:3 or 3:2. So for detailed viewing, for instance for reading text on a screen, you can utilize most of your detail vision by employing an almost-square screen, a bit like the screens up to the mid-2000s.

    However, again this is not how marketing took it. Computers in ye olden days were used mostly for productivity and detailed work, but as they commoditized and as the computer as media consumption device evolved, people didn’t necessarily use their computer for work most of the time. They used it to watch media content: movies, television series and photos. And for that kind of viewing, you get the most ‘immersion factor’ if the screen fills as much of your vision (including your peripheral vision) as possible. Which means widescreen.

    But there’s more marketing still. When detail work was still an important factor, people cared about resolution. As many pixels as possible on the screen. SGI was selling almost-4K CRTs! The most optimal way to get the maximum amount of pixels out of a glass substrate is to cut it as square as possible. 1:1 or 4:3 screens have the most pixels per diagonal inch. But with displays becoming more consumery, inch-size became more important, not amount of pixels. And this is a completely different optimization target. To get the most diagonal inches out of a substrate, you want to make the screen as wide as possible. First we got 16:10, then 16:9 and there have been moderately successful panel manufacturers making 22:9 and 2:1 screens (like Philips). Even though pixel density and absolute resolution went down for a couple of years, inch-sizes went up and that’s what sold. Why buy a 19″ 1280×1024 when you can buy a 21″ 1366×768? Eh…

    I think that about covers all the major aspects here. There’s more of course; bandwidth limits of HDMI, DVI, DP and of course VGA played a role, and if you go back to the pre-2000s, graphics memory, in-computer bandwidth and simply the limits of commercially available RAMDACs played an important role. But for today’s considerations, this is about all you need to know.

    Read the article
