Search Results

Search found 14216 results on 569 pages for 'nvidia settings'.

Page 19/569

  • If Nvidia Shield can stream a game via WiFi (~150-300Mbps), where is the 1-10Gbps wired streaming?

    - by Enigma
    Facts: It is surprising and uncharacteristic that a wireless game streaming solution is the *first to hit the market, when a 1000Mbps+ Ethernet connection would accomplish the same feat with roughly 6x the available bandwidth. 150-300Mbps WiFi is in no way superior to a 1000Mbps+ LAN connection aside from, well, wireless mobility. Historically (since the internet was created), wired services have **always come first, yet in this particular case the opposite seems to be true. We had wired internet, wired audio streaming, and wired video streaming all before their wireless counterparts. Why? Largely because wireless bandwidth was, and still is, inferior. Even today, despite being significantly better and capable of a lot more, it is still inferior to a wired connection.

    Situation: Chief among these is that NVIDIA’s Shield handheld game console will be getting a microconsole-like mode, dubbed “Shield Console Mode”, that will allow the handheld to be converted into a more traditional TV-connected console. In console mode Shield can be controlled with a Bluetooth controller, and in accordance with the higher resolution of TVs will accept 1080p game streaming from a suitably equipped PC, versus 720p in handheld mode. With that said 1080p streaming will require additional bandwidth, and while 720p can be done over WiFi NVIDIA will be requiring a hardline GigE connection for 1080p streaming (note that Shield doesn’t have Ethernet, so this is presumably being done over USB). Streaming aside, in console mode Shield will also support its traditional local gaming/application functionality. - http://www.anandtech.com/show/7435/nvidia-consolidates-game-streaming-tech-under-gamestream-brand-announces-shield-console-mode

    ^ This is not acceptable to me for a number of reasons, not to mention the ridiculousness of having a little screen+controller unit sitting there while using a secondary controller and screen instead. That kind of redundant absurdity exemplifies how wrong of a solution it is. They need a second product for this purpose, without the screen or controller, for it to make sense... at which point you're just buying a little computer that does what most other, larger computers do better. While this console mode will provide a wired connection, it still shouldn't be necessary to purchase a Shield to get that benefit. Not only that, but Intel's WiDi claims game streaming support as well - wirelessly. Where is the wired streaming?

    All that is required, by my understanding, is the ability to decode H.264 video and transmit control/feedback, so by any logical comparison, someone (Nvidia especially) should have no difficulty creating an application for PCs (win32/64 environment) that does the exact same thing their Android app does. I have 2 video cards capable of streaming (encoding) H.264, so by rights they must be capable of decoding it, I would think. I should be able to stream to my second desktop or my laptop, both of which by hardware comparison are superior to the Shield. I haven't found anything stating plans to allow non-Shield owners to do this. Can a third party create this software, or does it hinge on some limitation that only Nvidia can overcome?

    Reiteration of questions: Is there a technical (non-marketing) reason why Nvidia opted to bottleneck the streaming service with a wireless connection, limiting the resolution to 720p and introducing intermittent video choppiness, when on a wired connection one could presumably achieve 1080p with significantly less or zero choppiness? Is there anything limiting developers from creating a PC/desktop application emulating the same H.264 decoding functionality, which would circumvent the need to get an Nvidia Shield altogether? (It is not a matter of being too cheap to support Nvidia - I have many Nvidia cards that aren't being used. One should not have to purchase specialty hardware when equivalent hardware already exists.) The same questions go for Intel WiDi as well. I am just utterly perplexed that there are wireless live streaming solutions and yet no wired one. How on earth can wireless be the go-to transmission medium? Is there another solution that takes advantage of H.264 video compression to allow live streaming over a wired connection?

    (*) - Perhaps this isn't the first, but afaik it is the first complete package.
    (**) - I can't back that up with hard evidence/links, but someone probably could.

    Edit: Maybe this will be the solution I am looking for, but I still find it hard to believe that they would be the first, and only after wireless solutions already exist. "In-home Streaming: You can play all your Windows and Mac games on your SteamOS machine, too. Just turn on your existing computer and run Steam as you always have - then your SteamOS machine can stream those games over your home network straight to your TV!" - http://store.steampowered.com/livingroom/SteamOS/
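    For what it's worth, the raw building blocks for a do-it-yourself wired stream have been around for a while; the sketch below uses ffmpeg to push an H.264 desktop capture to another PC over the LAN (the address, port and x11grab capture source are illustrative assumptions - on Windows the capture input would be gdigrab - and this only covers the video leg, not GameStream's input/feedback channel or its latency optimizations):

      # On the gaming PC: capture the display, encode H.264, send it over the wire
      ffmpeg -f x11grab -framerate 60 -video_size 1920x1080 -i :0.0 \
             -c:v libx264 -preset ultrafast -tune zerolatency \
             -f mpegts "udp://192.168.1.50:5000?pkt_size=1316"

      # On the receiving PC: listen, decode and display the stream
      ffplay -fflags nobuffer udp://0.0.0.0:5000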

    Read the article

  • How can I add settings in the settings application for the iPhone

    - by Eytan
    I have my user preferences set to display in the system Settings application, but I want to give users the option of adding additional settings. Specifically, the user has a group 'favorites' in the settings bundle, and they can then fill in the details of their three favorite contacts. However, I want to give the user the option of adding more. I know how to do this in-app, but can it be done through the Settings application?

    Read the article

  • ubuntu mouse settings

    - by user1
    Is there a way to import Windows mouse settings into Ubuntu 10.04? I really love Ubuntu, but my laptop touchpad is driving me crazy. For example, I can't seem to drag a window frame properly. It is not a driver issue - I know this for sure. I think I am just accustomed to the Windows mouse settings. If you have a configuration that works really well, please share it.
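    If the real annoyance is the touchpad rather than the pointer settings themselves, the synaptics driver can be tuned live from a terminal, which is often closer to the Windows feel than anything in the mouse preferences dialog. A sketch below - the property names come from the synaptics driver, the values are only illustrative starting points (not imported Windows settings), and they last until logout unless the same lines go into a startup script:

      synclient -l                      # list every current touchpad setting
      synclient TapButton1=1            # one-finger tap acts as a left click
      synclient VertEdgeScroll=1        # enable edge scrolling
      synclient PalmDetect=1            # ignore accidental palm contact while typing
      synclient MaxTapTime=180          # tighten the tap window for a snappier feel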

    Read the article

  • Resolution stuck after playing OpenGL game

    - by kit.yang
    I used to start the game Frozen Throne (using Wine) with the "-opengl" option. When I entered the game the resolution would change, and it would be restored after exiting the game. But this time a problem happened: the resolution cannot be restored, even though I have restarted my computer several times. Both the Ubuntu panel and the login window are affected. nvidia-settings also detects the resolution as "1024 x 768", but using that tool has no effect. Screenshot - NVIDIA X Server Settings.

    The result of xrandr:

      Screen 0: minimum 320 x 240, current 1024 x 768, maximum 1024 x 768
      default connected 1024x768+0+0 0mm x 0mm
         1024x768       50.0*
         800x600        51.0   52.0   53.0
         680x384        54.0   55.0
         640x480        56.0
         576x432        57.0
         512x384        58.0
         400x300        59.0   60.0   61.0
         320x240        62.0

    The contents of /etc/X11/xorg.conf:

      # nvidia-settings: X configuration file generated by nvidia-settings
      # nvidia-settings: version 1.0 (buildd@yellow) Fri Apr 9 11:51:21 UTC 2010

      Section "ServerLayout"
          Identifier     "Layout0"
          Screen      0  "Screen0" 0 0
          InputDevice    "Keyboard0" "CoreKeyboard"
          InputDevice    "Mouse0" "CorePointer"
          Option         "Xinerama" "0"
      EndSection

      Section "Files"
      EndSection

      Section "InputDevice"
          # generated from default
          Identifier     "Mouse0"
          Driver         "mouse"
          Option         "Protocol" "auto"
          Option         "Device" "/dev/psaux"
          Option         "Emulate3Buttons" "no"
          Option         "ZAxisMapping" "4 5"
      EndSection

      Section "InputDevice"
          # generated from default
          Identifier     "Keyboard0"
          Driver         "kbd"
      EndSection

      Section "Monitor"
          # HorizSync source: builtin, VertRefresh source: builtin
          Identifier     "Monitor0"
          VendorName     "Unknown"
          ModelName      "CRT-0"
          HorizSync       28.0 - 55.0
          VertRefresh     43.0 - 72.0
          Option         "DPMS"
      EndSection

      Section "Device"
          Identifier     "Device0"
          Driver         "nvidia"
          VendorName     "NVIDIA Corporation"
          BoardName      "Entry Graphics"
      EndSection

      Section "Screen"
          # Removed Option "metamodes" "1024x768 +0+0"
          Identifier     "Screen0"
          Device         "Device0"
          Monitor        "Monitor0"
          DefaultDepth    24
          Option         "TwinView" "0"
          Option         "TwinViewXineramaInfoOrder" "CRT-0"
          Option         "metamodes" "1024x768_60 +0+0"
          SubSection     "Display"
              Depth       24
          EndSubSection
      EndSection
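    For what it's worth, an xrandr output showing only a bare "default" output capped at 1024x768 usually means X has fallen back to an unaccelerated driver rather than the NVIDIA one, in which case xrandr alone cannot bring the higher modes back. A hedged sketch of the usual recovery steps (package and display-manager names assumed for this Ubuntu release):

      lsmod | grep nvidia                               # check whether the nvidia module is loaded at all
      sudo apt-get install --reinstall nvidia-current   # reinstall the packaged driver
      sudo nvidia-xconfig                               # regenerate a clean xorg.conf for the driver
      sudo service gdm restart                          # restart X (or simply reboot)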

    Read the article

  • Hybrid Graphics on Windows 7/Ubuntu 12.04 Dual Boot

    - by Noob.
    Alright, so here's the situation: I am using an ASUS UL80VT with two graphics cards: integrated Intel graphics and an NVIDIA G210M. I was running an Ubuntu 12.04 - Windows 7 dual boot (on separate partitions). The machine worked perfectly (including the display drivers) without me needing to install anything special or change any settings. However, my hard drive was corrupted and I lost all my data yesterday, so after it was replaced, I installed Ubuntu 12.04 64-bit again after installing Windows 7. I booted up Ubuntu after installation and noticed it was using Unity 2D by default... Gnome 3.4 wasn't working properly either, so I guessed that the NVIDIA G210M driver wasn't installed/working and the OS was instead using the integrated graphics. I checked the "Additional Drivers" tool, but there were no proprietary drivers listed there, so I went to the NVIDIA website, downloaded the driver directly and installed it. I restarted, but there was no change. After this, I read somewhere that I should change the SATA mode in the BIOS to "Compatible" rather than "Enhanced". This worked and fixed the problem (both Unity and Gnome were working perfectly), but then when I tried booting Windows 7, I received a BSOD. So I changed it back to Enhanced, and once again the NVIDIA G210M graphics isn't working on Ubuntu, while on Windows 7 it is. I do not want to keep switching between Enhanced and Compatible every time I reboot into Ubuntu, and neither do I want to simply use just one OS. Note that both the NVIDIA 210M and the integrated graphics work perfectly on Windows 7. Also, I don't care about switching between them; I just want to be able to use the NVIDIA one. What can I do so that both Windows 7 and Ubuntu work and the NVIDIA G210M works on Ubuntu?

    Read the article

  • How to make Unity 3D work with Bumblebee using the Intel chipset

    - by EboMike
    I have a Sony VAIO S laptop with the dreaded Optimus and finally managed to get Bumblebee to work fully on Ubuntu 12.04, so that I can utilize both the hardware acceleration of the Intel chipset and the Nvidia one via optirun and/or bumble-app-settings. However, the desktop effects don't work. But they should; I vaguely remember that they worked for a while before I had Bumblebee installed. This is what I get with the support test:

      :~$ /usr/lib/nux/unity_support_test -p
      Xlib:  extension "NV-GLX" missing on display ":0".
      OpenGL vendor string:      Tungsten Graphics, Inc
      OpenGL renderer string:    Mesa DRI Intel(R) Ivybridge Mobile
      OpenGL version string:     1.4 (2.1 Mesa 8.0.2)

      Not software rendered:     yes
      Not blacklisted:           yes
      GLX fbconfig:              yes
      GLX texture from pixmap:   yes
      GL npot or rect textures:  yes
      GL vertex program:         yes
      GL fragment program:       yes
      GL vertex buffer object:   no
      GL framebuffer object:     yes
      GL version is 1.4+:        yes
      Unity 3D supported:        no

    First of all, I kind of doubt that the chipset doesn't support VBOs (essentially a standard feature in GL). Neither Xorg.0.log nor Xorg.8.log shows any particular errors. As for the Nvidia drivers: in order to get them to work, I had to install the 304.22 drivers (older ones wouldn't work). They clobbered libglx.so, so I reinstated the xserver-xorg-core libglx.so in its original place, moved Nvidia's libglx.so to an nvidia-specific folder and specified that folder in the bumblebee.config. That seems to work and shouldn't cause the problem I see here. For fun, I tried to use the Nvidia chipset for Unity, but that didn't fly either:

      ~$ optirun /usr/lib/nux/unity_support_test -p
      OpenGL vendor string:      NVIDIA Corporation
      OpenGL renderer string:    GeForce GT 640M LE/PCIe/SSE2
      OpenGL version string:     4.2.0 NVIDIA 304.22

      Not software rendered:     yes
      Not blacklisted:           yes
      GLX fbconfig:              yes
      GLX texture from pixmap:   no
      GL npot or rect textures:  yes
      GL vertex program:         yes
      GL fragment program:       yes
      GL vertex buffer object:   yes
      GL framebuffer object:     yes
      GL version is 1.4+:        yes
      Unity 3D supported:        no
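    One way to narrow this down is to confirm which GL stack each X server is really loading - the Intel session should be using Mesa's libglx and report a Mesa/Intel renderer, while Bumblebee's secondary server should load the relocated NVIDIA libglx. A sketch of the checks (glxinfo comes from the mesa-utils package):

      glxinfo | grep -E "OpenGL vendor|OpenGL renderer|OpenGL version"
      grep -i libglx /var/log/Xorg.0.log      # which libglx.so the Intel X server loaded
      grep -i libglx /var/log/Xorg.8.log      # same check for Bumblebee's secondary server
      optirun glxinfo | grep -E "OpenGL vendor|OpenGL renderer"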

    Read the article

  • Dual Monitor in Ubuntu 11.10 is resetting the theme

    - by Mengu
    I'm experiencing a strange problem with dual monitors. When I set up the dual monitors via nvidia-settings and save the configuration to xorg.conf, the default Unity theme and icons revert to the GTK default. I also get an error telling me "could not apply the stored configuration for monitors". Here is my xorg.conf:

      # nvidia-settings: X configuration file generated by nvidia-settings
      # nvidia-settings: version 280.13 (buildd@yellow) Fri Aug 5 12:31:28 UTC 2011

      Section "ServerLayout"
          Identifier     "Layout0"
          Screen      0  "Screen0" 0 0
          InputDevice    "Keyboard0" "CoreKeyboard"
          InputDevice    "Mouse0" "CorePointer"
          Option         "Xinerama" "0"
      EndSection

      Section "Files"
      EndSection

      Section "InputDevice"
          # generated from default
          Identifier     "Mouse0"
          Driver         "mouse"
          Option         "Protocol" "auto"
          Option         "Device" "/dev/psaux"
          Option         "Emulate3Buttons" "no"
          Option         "ZAxisMapping" "4 5"
      EndSection

      Section "InputDevice"
          # generated from default
          Identifier     "Keyboard0"
          Driver         "kbd"
      EndSection

      Section "Monitor"
          # HorizSync source: edid, VertRefresh source: edid
          Identifier     "Monitor0"
          VendorName     "Unknown"
          ModelName      "Chi Mei Optoelectronics corp."
          HorizSync       30.0 - 75.0
          VertRefresh     60.0
          Option         "DPMS"
      EndSection

      Section "Device"
          Identifier     "Device0"
          Driver         "nvidia"
          VendorName     "NVIDIA Corporation"
          BoardName      "GeForce GT 540M"
      EndSection

      Section "Screen"
          Identifier     "Screen0"
          Device         "Device0"
          Monitor        "Monitor0"
          DefaultDepth    24
          Option         "TwinView" "1"
          Option         "TwinViewXineramaInfoOrder" "DFP-0"
          Option         "metamodes" "DFP: nvidia-auto-select +0+0, CRT: 1680x1050 +1920+0"
          SubSection     "Display"
              Depth       24
          EndSubSection
      EndSection

    Here is an example of what I'm talking about: http://i.stack.imgur.com/vrlW1.png How can I fix this? Thanks in advance.
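    The "could not apply the stored configuration for monitors" error usually means GNOME's own per-user monitor configuration is fighting the TwinView layout that nvidia-settings wrote to xorg.conf, and a session that fails at that stage can also come up with the fallback theme. A commonly suggested workaround, sketched below (path assumed for 11.10; back the file up first), is to move that stored configuration out of the way and log in again:

      mv ~/.config/monitors.xml ~/.config/monitors.xml.bak
      # log out and back in so the session re-reads the layout provided by the NVIDIA driver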

    Read the article

  • New Nvidia Video Driver for Linux Supports X Server 1.8

    <b>Softpedia:</b> "Nvidia announced a few days ago, on its forum, a new version of its proprietary driver for the Nvidia graphics cards. Nvidia 195.36.24 adds support for new GPUs, and fixes a few issues. But the most important thing is that Nvidia 195.36.24 has support for X Server 1.8."

    Read the article

  • What to store at application Settings, numeric / string representations or objects?

    - by SoMoS
    Hello, I've been thinking for a while about what to store in the Project Settings - objects, or numeric/string representations of those objects - so that I can settle on a rule and avoid thinking about this in the future. I want to take the best approach. On one side, storing object representations guarantees that what is stored is valid and saves you from doing conversions each time you access them; you only need objects with the attribute. On the other side, storing the numeric/string representation of an object makes editing the setting easier, because in the end the user will be entering numeric or string information. How do you handle this issue?

    Read the article

  • Storing user settings in table - how?

    - by Mdillion
    I have about 200 settings for the user; these include notice settings and tracing settings for user activities on objects. The problem is how to store them in the DB. Should each setting be a row or a column? If a column, the table will have 200 columns. If a row, then about 3 columns, but 200 rows per user x even 10 million users = not good. So how else can I store all these settings? NOTE: these settings are a mix of text entries and FK lookups to other tables. Thanks.

    Read the article

  • blacklist VGA compatible controller: Intel Corporation 82845G/GL[Brookdale-G]/GE

    - by Thomas Labensi
    I have an HP Pavilion a310n. I have installed an Nvidia PCI GeForce card, and I want to blacklist the onboard Intel Corporation 82845G/GL[Brookdale-G]/GE Chipset Integrated Graphics Device (rev 03). What do I need to do?

      tom@tom-DM167A-ABA-a310n:~$ lspci | grep VGA
      00:02.0 VGA compatible controller: Intel Corporation 82845G/GL[Brookdale-G]/GE Chipset Integrated Graphics Device (rev 03)
      02:09.0 VGA compatible controller: NVIDIA Corporation NV11 [GeForce2 MX/MX 400] (rev b2)
      tom@tom-DM167A-ABA-a310n:~$

    I'm using the Nvidia card via nouveau, and I want to make sure I'm really using the Nvidia card.
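    One hedged alternative to blacklisting the Intel kernel module is to tell X explicitly which card to drive by its bus ID (02:09.0 in the lspci output above); many BIOSes of that era also have a "primary video adapter" option, which is the cleaner fix when available. A minimal sketch of an /etc/X11/xorg.conf entry, assuming the nouveau driver:

      Section "Device"
          Identifier "GeForce2MX"
          Driver     "nouveau"
          BusID      "PCI:2:9:0"
      EndSection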

    Read the article

  • Xrandr settings don't stick; bash script fails; Nvidia; disper doesn't work

    - by bcmcfc
    I have an Nvidia graphics card. Yeah, I know. This might have to change soon. I am trying to get the resolution to set correctly on my second monitor, which is a VGA monitor with a native resolution of 1440x900. I have set up this bash script, which doesn't work:

      xrandr --newmode "1440" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
      xrandr --addmode -display VGA-1 1440
      xrandr --output VGA-1 --mode 1440

    It outputs: xrandr: cannot find mode 1440. It has worked previously when running the commands individually, but after a restart it fails. I've also just installed disper as per this question but have no idea how to use it - the help suggests the command disper -d VGA-1 -r "1440x900" should work, but it does not. It just outputs the help text, suggesting something is wrong but not saying what. Is there any way to get these horrific Nvidia drivers working properly with a resolution that is not detected by the tools?
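    For reference, a hedged sketch of a corrected version (the modeline is the same cvt-style 1440x900@60 timing already in the script; the main fixes are dropping the misplaced -display flag from --addmode and using an explicit mode name). Note that modes added with xrandr do not survive a restart, which would explain why the commands worked once and then failed; re-running the script at login, or adding the modeline to xorg.conf, makes the setting stick:

      cvt 1440 900 60                     # prints the modeline used below
      xrandr --newmode "1440x900_60.00" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
      xrandr --addmode VGA-1 "1440x900_60.00"
      xrandr --output VGA-1 --mode "1440x900_60.00"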

    Read the article

  • Which driver should I use with a 9600 GT Mobile?

    - by lisalisa
    I checked the Additional Drivers offered for my card and I saw four options:

      - NVIDIA accelerated graphics drivers (version 173)
      - NVIDIA accelerated graphics drivers (post-release updates) (version 173-updates)
      - NVIDIA accelerated graphics drivers (version current) (Recommended)
      - NVIDIA accelerated graphics drivers (post-release updates) (version current-updates)

    I'm not sure which one to pick. I want a driver that's stable but also takes advantage of my card's hardware.
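    In case it helps, the "(version current) (Recommended)" entry is normally the right pick for a 9600M GT: the 173 series is the legacy branch for much older GPUs, while "current" is the newest stable driver and the "-updates" variants simply track post-release fixes. Activating it from the Additional Drivers dialog is equivalent to the hedged sketch below (package name assumed for this generation of Ubuntu):

      sudo apt-get update
      sudo apt-get install nvidia-current    # the "Recommended" driver
      # reboot afterwards so the new kernel module and X driver are picked up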

    Read the article

  • What To Do w Driver Which is "Activated But Not Currently in Use"

    - by John
    1. Ubuntu 12.04
    2. Graphics card: GeForce4 MX 420
    3. From the Nvidia website, downloaded NVIDIA-Linux-x86-96.43.18.pkg1.run
    4. Moved it from the Downloads folder to an "Nvidia" folder
    5. Searched for "Additional Drivers"
    6. Got a response that Nvidia-(Current) was "Activated But Not Currently in Use"
       6a. Did I download it incorrectly?
       6b. Did I move, open or install it incorrectly?
       6c. I want to update the driver to improve rendering.
       6d. Not sure what to do next.
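    A couple of points that may help before reinstalling anything: downloading the .run installer and moving it into a folder does not install it - it has to be executed from a console - and it is a separate driver from the packaged "Nvidia-(Current)" that Additional Drivers reports on. The quickest way to see what is actually driving the card is a few checks like the sketch below (glxinfo comes from the mesa-utils package):

      lspci -k | grep -A 3 VGA                     # kernel driver bound to the GeForce4 MX
      lsmod | grep -E "nvidia|nouveau"             # which graphics module is loaded
      glxinfo | grep "OpenGL renderer"             # what the running X session renders with
      grep -iE "nvidia|nouveau|vesa" /var/log/Xorg.0.log | head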

    Read the article

  • C# Custom user settings class not saving

    - by Zenox
    I have the following class:

      [Serializable]
      [XmlRoot ( ElementName = "TextData", IsNullable = false)]
      public class TextData
      {
          private System.Drawing.Font fontColor;

          [XmlAttribute ( AttributeName = "Font" )]
          public System.Drawing.Font Font { get; set; }

          [XmlAttribute ( AttributeName = "FontColor" )]
          public System.Drawing.Color FontColor { get; set; }

          [XmlAttribute ( AttributeName = "Text" )]
          public string Text { get; set; }

          public TextData ( )
          {
          } // End of TextData
      } // End of TextData

    And I'm attempting to save it with the following code:

      // Create our font dialog
      FontDialog fontDialog = new FontDialog ( );
      fontDialog.ShowColor = true;

      // Display the dialog and check for an ok
      if ( DialogResult.OK == fontDialog.ShowDialog ( ) )
      {
          // Save our changes for the font settings
          if ( null == Properties.Settings.Default.MainHeadlineTextData )
          {
              Properties.Settings.Default.MainHeadlineTextData = new TextData ( );
          }

          Properties.Settings.Default.MainHeadlineTextData.Font = fontDialog.Font;
          Properties.Settings.Default.MainHeadlineTextData.FontColor = fontDialog.Color;
          Properties.Settings.Default.Save ( );
      }

    Every time I load the application, Properties.Settings.Default.MainHeadlineTextData is still null. Saving does not seem to take effect. I read on another post that the class must be public, and it is. Any ideas why this would not be working properly?

    Read the article

  • How To Make NVIDIA’s Optimus Work on Linux

    - by Chris Hoffman
    Many new laptops come with NVIDIA’s Optimus technology – the laptop includes both a discrete NVIDIA GPU for gaming power and an onboard Intel GPU for power savings. The notebook switches between the two when necessary. However, this isn’t yet well-supported on Linux. Linus Torvalds had some choice words for NVIDIA regarding Optimus not working on Linux, and NVIDIA is now currently working on official support. However, if you have a laptop with Optimus support, you don’t have to wait for NVIDIA — you can use the Bumblebee project’s solution to enable Optimus on Linux today. Image Credit: Jemimus on Flickr
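    For readers who want to try it, the Bumblebee project's own instructions for Ubuntu 12.04 at the time boiled down to the sketch below (PPA and package names as published by the project; verify against its current documentation before use):

      sudo add-apt-repository ppa:bumblebee/stable
      sudo apt-get update
      sudo apt-get install bumblebee bumblebee-nvidia
      # after a reboot, run a program on the NVIDIA GPU explicitly:
      optirun glxgears          # glxgears comes from the mesa-utils package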

    Read the article

  • Using the “Settings.settings” functionalities in VB.NET can be tricky…

    - by Vincent Grondin
    Sometimes you’re searching for something forever and when you find it, you realize it was right under your nose. Maybe you were distracted by other things around… or maybe that thing right under your nose was so well hidden that it deserves a blog post…

    That happened to me a few days ago while using the “Settings.settings” functionalities in my VB.NET application… I thought it was a cool feature and I decided to use it… So there I am adding new settings with “USER” scope and StringCollection as the data type, testing my application, and everything works perfectly fine... That was before I decided to modify the “Value” of one of my settings… After changing the value of one of my settings, I start my application again and, to my surprise, my new values aren’t showing! Hmmm… That’s odd… My setting was a pretty long list of strings, so I was rather angry at myself for not saving my work after I was done… So I open up the Settings.settings in the designer and click the ellipsis symbol to enter my string collection again, but to my great pleasure (and disbelief) my strings are there!!! Alright, you rock, VB.NET! You’ve just saved me a bunch of typing time, and I’m thinking it’s just a simple Visual Studio glitch… I hit “Save” then “Save All” (just in case), and finally I rebuild everything and fire up my app once again. Huh? Where are my darn strings????????

    Ok, there’s a bug there… I open up the app.config and my new strings are there!!! Alright, let’s recap… My new strings are in the app.config, they show correctly in the Settings.settings designer UI, but they aren’t showing at runtime… Hmmmm? Let’s try something else… Let’s start the application, but outside Visual Studio this time… I fire up the exe and BAM! My strings were there! I “alt-tab” and hit “F5” and BOOM, no strings! So it’s a bug in the Visual Studio environment… or could it be a FEATURE? I must admit that I’m a little confused over what’s a bug and what’s a feature in Visual Studio… lol!

    Finally I found out there’s a “cache” for your Visual Studio located here:  C:\Users\<your username>\AppData\Local\Microsoft\<your app name and a very weird temp ID>\<your app version>\user.config

    When using the “Settings.settings” with a setting of scope “user”, this file is out of sync with your app.config until you manually decide to update it… The button is right there… under your nose… at the top left corner of your screen in the settings designer… See the big “Synchronize” button there? Yep… Now that’s user friendly, isn’t it? Oh, and wait until you see what it does when you click it… It prompts you and basically says: “Would you like your settings to start working inside Visual Studio now that you found out that I exist?” and of course the right answer is yes… or rather “OK”… Unfortunately, you have to do this every time you edit a value… On the other hand, adding and removing settings seems to work flawlessly without having to click this magical button… go figure! Oh, and I almost forgot… this great “feature” is only available for VB.NET… A project in C# using Settings.settings will work perfectly EVEN when editing values…

    Here’s a screenshot that shows this important button.

    Using other data types appears to work perfectly well… Maybe it’s simply related to the StringCollection data type? If you are a VB.NET programmer, you should pay attention to this when you plan on using the settings functionalities and your scope is “user” and your data type is StringCollection… Happy coding all!

    Read the article

  • Did "sudo dconf reset -f /org/compiz". Now ccsm settings ignored

    - by bshanks
    I executed:

      sudo dconf reset -f /org/compiz

    Now changing settings in CompizConfig Settings Manager (ccsm) has no effect. For example, changing the number of desktops does nothing. I tried purging and reinstalling ccsm, but it didn't help. Incidentally, where is the documentation for this sort of thing (which documentation specifies where Unity and Compiz store config settings)? And where does dconf store things? And where is dconf's documentation? http://dag.wieers.com/home-made/dconf/dconf.1.html says nothing about a 'reset' command, and it says it stores things in /var/log/dconf, but nothing was there. Are there two things named 'dconf'? I would actually like to just put things back the way they were before I executed sudo dconf reset. I have a backup of my hard drive available; I just need to know which files to roll back. I tried rolling back the .config, .gconf, and .cache directories, to no avail. I'm using 12.04.
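    To answer the "where does dconf store things" part: the GNOME dconf (not the unrelated tool documented at dag.wieers.com) keeps all of its settings in a single binary database under ~/.config/dconf/, which is the file the reset rewrote. A hedged sketch of ways to inspect and recover it (restoring the file from backup is best done from a text console while logged out of the graphical session, otherwise the running session may overwrite it again):

      ls -l ~/.config/dconf/user                        # dconf's binary settings database
      dconf dump /org/compiz/ > compiz-settings.txt     # snapshot the Compiz subtree as text
      dconf load /org/compiz/ < compiz-settings.txt     # reload such a snapshot later
      unity --reset                                     # last resort on 12.04: reset Unity/Compiz plugin settings to defaults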

    Read the article

  • Is it safe to have NVidia graphics always on on a Linux laptop, or do I risk overheating?

    - by codeape
    I'm getting a Lenovo T520 with two graphics cards:

      - Integrated Intel HD 3000
      - Discrete NVidia NVS 4200M

    In the BIOS, I can adjust which card(s) to use:

      - Integrated only
      - Discrete only
      - Both (NVidia Optimus)

    Since Optimus is not well supported under Linux, I wonder if it is OK to set up the system to use the NVidia card all the time. I have read somewhere that a laptop risks overheating if it uses a discrete graphics card all the time. Is this true? Does anyone have any experience to share?

    Read the article

  • If Nvidia Shield can stream a game via wifi, why can I not do the same via ethernet to any other PC?

    - by Enigma
    I think it is absurd that a wireless game streaming solution is the *first to hit the market, when a 1000Mbps+ Ethernet connection would accomplish the same feat with roughly 6x the available bandwidth. I can only assume that there must be some reason behind this, or a limitation preventing it, but what? 150Mbps WiFi is in no way superior to a 1000Mbps LAN connection aside from, well, wireless mobility. Not only that, but I have a secondary laptop and desktop which should, by hardware comparison, completely outperform anything the Tegra in the Nvidia Shield can do. Is this all just a marketing scheme to force people to buy the Shield for the streaming benefit?

    Chief among these is that NVIDIA’s Shield handheld game console will be getting a microconsole-like mode, dubbed “Shield Console Mode”, that will allow the handheld to be converted into a more traditional TV-connected console. In console mode Shield can be controlled with a Bluetooth controller, and in accordance with the higher resolution of TVs will accept 1080p game streaming from a suitably equipped PC, versus 720p in handheld mode. With that said 1080p streaming will require additional bandwidth, and while 720p can be done over WiFi NVIDIA will be requiring a hardline GigE connection for 1080p streaming (note that Shield doesn’t have Ethernet, so this is presumably being done over USB). Streaming aside, in console mode Shield will also support its traditional local gaming/application functionality. - http://www.anandtech.com/show/7435/nvidia-consolidates-game-streaming-tech-under-gamestream-brand-announces-shield-console-mode

    ^ This is not acceptable for me for a number of reasons, not to mention the ridiculousness of having a little screen+controller unit sitting there while using a secondary controller and screen instead. That kind of redundant absurdity exemplifies how wrong of a solution it is. They need a second product for this purpose, without the screen or controller, for it to make sense... at which point you're just buying a little computer that does what most other, larger computers do better.

    All that is required, by my understanding, is the ability to decode H.264 video compression and transmit control/feedback, so by any logical comparison, someone (Nvidia especially) should have no difficulty creating an application for PCs (win32/64 environment) that does the exact same thing their Android app does. I have 2 video cards capable of streaming (encoding) H.264, so by rights they must be capable of decoding it, I would think. I haven't found anything stating plans to allow non-Shield owners to do this. Can a third party create this software, or does it hinge on some limitation that only Nvidia can overcome?

    (*) - Perhaps this isn't the first, but afaik it is the first complete package.

    Read the article

  • Settings.bundle Not Appearing on Application Updates

    - by coneybeare
    I have an app that does not currently use a Settings.bundle to display settings in the iPhone Settings app. I am releasing an update that does. On a fresh install, the settings are added to the Settings app as expected, but upon updating from the old install, the bundle is not added. Is there some sort of trick to get the Settings app to show my Settings.bundle?

    Read the article
