Search Results

Search found 3327 results on 134 pages for 'thrift protocol'.

Page 77/134 | < Previous Page | 73 74 75 76 77 78 79 80 81 82 83 84  | Next Page >

  • Microeconomic simulation: coordination/planning between self-interested trading agents

    - by Milton Manfried
    In a typical perfect-information strategy game like chess, an agent can calculate its best move by searching the state tree for the best possible move, while assuming that the opponent will also make the best possible move (i.e. minimax). I would like to use this approach in a "game" modeling economic activity, where the possible "moves" would be to buy or sell for a given price, and the goal, rather than a specific class of states (e.g. checkmate), would be to maximize some function F of the agent's state (e.g. F(money, widget) = 10*money + widget).

    How do I handle buy/sell actions that require coordination between both parties, at the very least agreement upon a price? The cheap way out would be to set the price beforehand, maybe based upon the current supply -- but the idea of this simulation is to examine how prices emerge when freely determined by "perfectly rational" agents. A great example of what I do not want is the trading algorithm in SugarScape -- paraphrasing from Growing Artificial Societies, pp. 101-102: when a pair of agents interact to trade, they each compute their internal valuations of the goods, then a bargaining process is conducted and a price is agreed to; if this price makes both agents better off, they complete the transaction. The protocol itself is beautiful, but what it cannot capture (as far as I can tell) is the ability of an agent to pay more than it otherwise would for a good, because it knows that it can sell it for even more at a later date -- what appears to be called "strategic thinking" in this paper at Google Books: Multi-Agent-Based Simulation III: 4th International Workshop, MABS 2003. To get realistic behavior like that, it seems one would either (1) have to build an outrageously complex internal valuation system, which could at best only cover situations that were planned for at compile time, or otherwise (2) have some mechanism to search the state tree... which would require some way of planning future trades.

    Note: the chess analogy only works as far as the state-space search goes; the simulation isn't intended to be "zero sum", so a literal minimax search wouldn't be appropriate -- and ideally, it should work with more than two agents.
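    For the non-zero-sum, n-agent case, the usual generalization of minimax is a "max-n" search, where each node returns a payoff vector and the agent to move picks the child that maximizes its own entry. The sketch below is purely illustrative: two agents, arbitrary candidate prices, and per-agent objectives that deliberately differ from the single F in the question so that a trade can benefit both sides. Note that it lets the mover impose a trade; restricting moves to offers the counterparty would itself accept is exactly the coordination problem the question raises.

        // Depth-limited max-n search: each agent maximizes its own objective,
        // assuming every other agent does the same. All names are illustrative.
        var PRICES = [1, 5, 9];  // arbitrary candidate prices (assumption)

        // Per-agent objectives: agent 0 values widgets, agent 1 values money.
        function F(s, i) {
          return i === 0 ? s.money[0] + 10 * s.widgets[0]
                         : 10 * s.money[1] + s.widgets[1];
        }

        // Moves for the agent to move: pass, or trade one widget at price p.
        function moves() {
          var ms = [null];  // null = pass
          PRICES.forEach(function (p) {
            ms.push({ seller: 0, buyer: 1, price: p });
            ms.push({ seller: 1, buyer: 0, price: p });
          });
          return ms;
        }

        function apply(s, m) {
          if (!m) return s;  // pass leaves the state alone
          if (s.widgets[m.seller] < 1 || s.money[m.buyer] < m.price) return null;
          var n = { money: s.money.slice(), widgets: s.widgets.slice() };
          n.widgets[m.seller] -= 1; n.widgets[m.buyer] += 1;
          n.money[m.seller] += m.price; n.money[m.buyer] -= m.price;
          return n;
        }

        // Returns the payoff vector [F for agent 0, F for agent 1] of best play.
        function search(s, turn, depth) {
          if (depth === 0) return [F(s, 0), F(s, 1)];
          var best = null;
          moves().forEach(function (m) {
            var n = apply(s, m);
            if (!n) return;  // illegal move, skip it
            var v = search(n, 1 - turn, depth - 1);
            if (!best || v[turn] > best[turn]) best = v;
          });
          return best;
        }

        console.log(search({ money: [20, 0], widgets: [0, 4] }, 0, 4));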

    Read the article

  • Best peer-to-peer game architecture

    - by Dejw
    Consider a setup where game clients:

    1. have quite small computing resources (mobile devices, smartphones)
    2. are all connected to a common router (LAN, hotspot, etc.)

    The users want to play a multiplayer game, without an external server. One solution is to host an authoritative server on one phone, which in this case would also be a client. Considering point 1, this solution is not acceptable, since the phone's computing resources are not sufficient. So, I want to design a peer-to-peer architecture that will distribute the game's simulation load among the clients. Because of point 2, the system needn't be complex with regard to optimization; the latency will be very low. Each client can be an authoritative source of data about himself and his immediate environment (for example, bullets). What would be the best approach to designing such an architecture? Are there any known examples of such a LAN-level peer-to-peer protocol?

    Notes: Some of the problems are addressed here, but the concepts listed there are too high-level for me.

    Security: I know that not having one authoritative server is a security issue, but it is not relevant in this case, as I'm willing to trust the clients.

    Edit: I forgot to mention it will be a rather fast-paced game (a shooter). Also, I have already read about networking architectures at Gaffer on Games.
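    A common shape for this kind of LAN shooter is to make each peer fully authoritative for its own avatar and the bullets it fires, exchanging small unreliable state packets directly between peers, so that no single device simulates the whole world. A minimal Node.js sketch of that exchange, as an illustration only -- port, tick rate, packet shape and the pre-discovered peer list are all assumptions:

        var dgram = require('dgram');
        var PORT = 47000;                              // arbitrary game port (assumption)
        var peers = ['192.168.0.11', '192.168.0.12'];  // found earlier, e.g. via broadcast

        var sock = dgram.createSocket('udp4');
        sock.bind(PORT);

        // This client owns its local player: it simulates him and his bullets
        // and announces the result; no other peer may write this state.
        function announce(localState) {
          var buf = new Buffer(JSON.stringify(localState));
          peers.forEach(function (ip) {
            sock.send(buf, 0, buf.length, PORT, ip);   // unreliable is fine on a LAN
          });
        }

        // Remote entities are replicas: apply whatever their owner last sent.
        var replicas = {};
        sock.on('message', function (msg, rinfo) {
          replicas[rinfo.address] = JSON.parse(msg.toString());
        });

        setInterval(function () {
          announce({ x: 10, y: 20, bullets: [] });     // dummy local state
        }, 50);                                        // ~20 updates per second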

    Read the article

  • That Tool is cURLy

    When you just use IE, Firefox or Chrome, it can be easy to forget that HTTP is about more than just going to check the latest tech news at Engadget. It is a full and rich protocol, and a great way to experience that richness is the powerful command line utility cURL. cURL has a lot of options, but the syntax starts out simple. You can retrieve the contents of a web page with a simple curl http://blogs.claritycon.com/. The results should be the full text of the web page, tags and all. From there, you can use -X to specify the HTTP verb to use (POST, PUT, DELETE, PATCH, etc.) and -d to specify the payload of a POST or PUT. I have found cURL to be incredibly useful for two scenarios. First, as a good way to test basic web services. Second, while working a bit with CouchDB and another document-based database, cURL has helped me learn more about RESTful APIs, including different verbs and response codes. cURL is a mainstay in our environments and programming languages precisely because it is simple, powerful and discoverable. I encourage more .NET developers to take a look, bask on the command line for a while and enjoy the plain text of the web.

    -- Relevant Links --

    It's not always the case with manuals, but the manual for cURL is quite useful: http://curl.haxx.se/docs/manual.html

    To make your command line look a little nicer (and more powerful) on Windows, check out Console and add some transparency effects: http://sourceforge.net/projects/console/
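    To make the -X and -d flags mentioned above concrete, here are a few illustrative invocations (the CouchDB URLs, database name and document ID are placeholders):

        # GET is the default verb
        curl http://blogs.claritycon.com/

        # POST a JSON payload, e.g. creating a document in a local CouchDB
        curl -X POST -d '{"title": "hello"}' \
             -H 'Content-Type: application/json' http://localhost:5984/mydb

        # DELETE a resource; -i prints the response status line and headers
        curl -i -X DELETE 'http://localhost:5984/mydb/some_doc_id?rev=1-abc'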

    Read the article

  • nvidia: switching dvi socket

    - by lurscher
    I have Ubuntu 10.10 x86_64 with an Nvidia 9800 GT and Nvidia driver version 270.41.06. My video card has two DVI sockets, but I only use the single monitor configuration. Now, I think the main DVI socket might be busted, so I want to try to enable the other as the main one; however, I don't know how to achieve that. I tried just plugging the monitor into that socket, but it won't auto-detect it (it would have been way too easy to just work). This is my xorg.conf:

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: builtin, VertRefresh source: builtin
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "AOC"
            HorizSync 31.5 - 84.7
            VertRefresh 60.0 - 78.0
            ModeLine "1080p" 172.8 1920 2040 2248 2576 1080 1081 1084 1118 -hsync +vsync
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce 9800 GT"
        EndSection

        Section "Screen"
            # Removed Option "metamodes" "1024x768 +0+0"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "CustomEDID" "CRT-0:/home/charlesq/lg.bin"
            Option "TVStandard" "HD1080p"
            Option "TwinView" "0"
            Option "TwinViewXineramaInfoOrder" "CRT-0"
            Option "metamodes" "1080p +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    Read the article

  • How to keep background requests in sequence

    - by Jason Lewis
    I'm faced with implementing interfaces for some rather archaic systems for handling online deposits to stored-value accounts (think campus card accounts for students). Here's my dilemma: step one of the process involves passing the user off to a third-party site for the credit card transaction, like old-school PayPal. Step two involves using a proprietary protocol to communicate with a legacy system to conduct the actual deposit. Step two requires that each transaction have a unique sequence number, and that the requests' seqnums arrive in order. Since we're logging each transaction in Postgres, my first thought was to take a number from a sequence in the DB, guaranteeing uniqueness. But since we're dealing with web requests that might come in near-simultaneously, and since latency in the return from the off-site payment processor is beyond our control, there's always the chance of a race condition in the order of requests passed back to the proprietary system, and if the seqnums are out of order, the request fails silently (brilliant, right?). I thought about enqueuing the requests in Redis and using Resque workers to process them (single worker, single process, so they are processed in order), but we need to be able to give the user feedback as to whether the transaction was processed successfully, so this seems less feasible to me. I've tried to make this application handle concurrency well (as much as possible for a Ruby on Rails app), but now we're in a situation where we have to interact with a system that is designed to be single-process, single-threaded, and sequential. If it at least gave an "out of order" error, I could just increment (or take the next value off the sequence), but it's designed to fail silently in the event of ANY error. We are handling timeouts in a way that blocks on I/O, but since the application uses multiple workers (Unicorn), that's no guarantee. Any ideas/suggestions would be appreciated.
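    Whatever the stack, the essential trick is to make "allocate the next seqnum" and "transmit to the legacy system" one serialized step, while the web request simply waits on the outcome so the user still gets feedback. A sketch of that pattern, in JavaScript for brevity (in a Rails app a Mutex, a DB advisory lock, or a single-process proxy daemon around the same two steps plays the same role); every name below is made up:

        // Serialize "take next seqnum, then send" so numbers reach the legacy
        // system in order; callers still get a result they can show the user.
        var nextSeq = 1;                // in production this would come from the DB
        var chain = Promise.resolve();  // tail of the critical-section queue

        function deposit(amount) {
          var result = chain.then(function () {
            var seq = nextSeq++;                     // number assigned...
            return sendToLegacySystem(seq, amount);  // ...and sent before anyone else runs
          });
          chain = result.catch(function () {});      // keep the queue alive on failure
          return result;                             // the web request awaits this promise
        }

        // Stand-in for the proprietary protocol call (assumption).
        function sendToLegacySystem(seq, amount) {
          return new Promise(function (resolve) {
            setTimeout(function () { resolve({ seq: seq, ok: true }); }, 10);
          });
        }

        deposit(100).then(function (r) { console.log('processed', r); });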

    Read the article

  • Compressing/compacting messages over websocket on Node.js

    - by icelava
    We have a websocket implementation (Node.js/Sock.js) that exchanges data as JSON strings. As our use cases grow, so has the size of the data transmitted across the wire. The websocket protocol does not natively offer any compression feature, so in order to reduce the size of our messages we'd have to do something about the serialisation ourselves. There appear to be a variety of LZW implementations in Javascript, some of which confuse me as to their compatibility for in-browser use only versus transmission across the wire, due to my lack of understanding of low-level encodings. More importantly, all of them seem to impose a noticeable performance drag when Javascript is the engine doing the compression/decompression work, which is not desirable for mobile devices. Looking instead at other forms of compact serialisation: MessagePack does not appear to have any active support in Javascript itself; BSON does not have any Javascript implementation; and an alternative BISON project that I tested does not deserialise everything back to the original values (large numbers), and it does not look like any further development will happen either. What are some other options others have explored for Node.js?
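    One option worth noting: if the peer is another Node process (or compression is acceptable on the server side only), Node's built-in zlib module avoids doing LZW in JavaScript at all. A minimal sketch, assuming a SockJS-style connection object with a write method; the deflated bytes travel as base64 because SockJS frames are text:

        var zlib = require('zlib');

        // Compress an outgoing message before it crosses the wire.
        function sendCompressed(conn, message) {
          var json = JSON.stringify(message);
          zlib.deflate(new Buffer(json, 'utf8'), function (err, buf) {
            if (err) return conn.close();
            conn.write(buf.toString('base64'));  // SockJS frames are text-only
          });
        }

        // Inflate an incoming message from the same encoding.
        function receiveCompressed(data, cb) {
          zlib.inflate(new Buffer(data, 'base64'), function (err, buf) {
            if (err) return cb(err);
            cb(null, JSON.parse(buf.toString('utf8')));
          });
        }

    Base64 re-inflates the payload by about a third, so this only pays off once messages are fairly large, and a browser peer would still need a JavaScript inflate, which runs into the very performance concern raised above.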

    Read the article

  • Facebook Like javascript related to a "Time Spent Downloading a Page" increase in GWT?

    - by donaldthe
    Hi, I installed the Facebook Like button (Javascript version) on my website on December 15th. Take a look at this report from Google Webmaster Central (crawl stats: Googlebot activity in the last 90 days). The crawl stats are from Googlebot, which as far as I know doesn't execute Javascript. Could the Facebook Like Javascript code, "the XFBML version", be related to the large spike in time spent downloading a page? (By the way, the huge spike in November was caused by a mistake where every image request was getting a 301.) I'm not sure what caused the spike to go down by half somewhere in December; it may have been related to a faulty setting in web.config. I'm at a loss as to what I can do about this, or even how to tell if this is my problem or Googlebot's crawl problem. Here is the Facebook code I am using to create the like button. It is right after the opening body tag:

        <div id="fb-root"></div>
        <script>
          window.fbAsyncInit = function() {
            FB.init({appId: 'xxxxx', status: true, cookie: true, xfbml: true});
          };
          (function() {
            var e = document.createElement('script');
            e.async = true;
            e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';
            document.getElementById('fb-root').appendChild(e);
          }());
        </script>

    and this creates the like box:

        <fb:like show_faces="false"></fb:like>

    If the Javascript can't be the problem, any ideas on where to start looking would be appreciated.

    Read the article

  • How do you run XBMC on nvidia dual screen and stop it from taking over the keyboard and mouse?

    - by Paul Swartout
    I have set up dual screen under Ubuntu 12.04. I have a GeForce 8500 GT and have used the nVidia control panel to set up dual screen in "Separate screen mode". Here's the resulting xorg.conf:

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 295.33 (buildd@zirconium) Fri Mar 30 13:38:49 UTC 2012

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            Screen 1 "Screen1" RightOf "Screen0"
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "Maxdata/Belinea B1925S1W"
            HorizSync 31.0 - 83.0
            VertRefresh 56.0 - 75.0
            Option "DPMS"
        EndSection

        Section "Monitor"
            # HorizSync source: builtin, VertRefresh source: builtin
            Identifier "Monitor1"
            VendorName "Unknown"
            ModelName "CRT-1"
            HorizSync 28.0 - 55.0
            VertRefresh 43.0 - 72.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce 8500 GT"
            BusID "PCI:1:0:0"
            Screen 0
        EndSection

        Section "Device"
            Identifier "Device1"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce 8500 GT"
            BusID "PCI:1:0:0"
            Screen 1
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinView" "0"
            Option "metamodes" "CRT-0: nvidia-auto-select +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

        Section "Screen"
            # Removed Option "metamodes" "CRT-1: 1280x768 +0+0"
            Identifier "Screen1"
            Device "Device1"
            Monitor "Monitor1"
            DefaultDepth 24
            Option "TwinView" "0"
            Option "metamodes" "CRT-1: 1360x768_60 +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    All well and good, and I have a nice blank X window displayed on my TV (the 2nd monitor). I then fire up XBMC from a terminal on the PC monitor using:

        DISPLAY=:0.1 xbmc

    XBMC fires up quite nicely on the TV; however, I can no longer use the main PC monitor / mouse / keyboard, as XBMC on the TV screen seems to have focus. I was hoping to have XBMC running on the TV and let the kids use the MCE remote whilst I get on with my work on the PC monitor. Does anyone have any idea how to overcome this? I'm presuming there's some xorg.conf fun and games needed, but I've no idea where to start, to be honest.

    Read the article

  • Solution for lightweight LAN peer discovery?

    - by DevilWithin
    I built a library for purely cross-platform programming; my games made with it run fine on Android, PC, Linux, Mac, etc. The networking capabilities are provided by the ENET library, therefore all communication between my apps is not plain TCP- or UDP-compatible, but only in the custom protocol, even though it's ultimately based on UDP. I don't think it's possible to do what I want with ENET; that's why I'm asking here for help! Let's say I have the same game running on my Android phone, my laptop and my PC. They are all on the same wifi network, and therefore on a LAN, whether it's a wifi hotspot or the household router. I need each of those 3 peers to discover the other two on the network. This is meant only to find the IPs of live apps on the LAN, to be able to host multiplayer games between them. I can only think of one effective way to do this: UDP broadcast, then wait for responses. But if that is the solution, I need something small, since it's the only purpose of the implementation. The other way would be to try to connect to every IP in the LAN address subrange, but I don't think the OS would be with me on that one :p Sorry for the long question!
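    For reference, the UDP-broadcast approach really is small. Here is a sketch of the idea in Node.js (the port and packet contents are arbitrary); the same few calls map directly onto a raw UDP socket opened alongside ENET:

        var dgram = require('dgram');
        var PORT = 41234;  // arbitrary discovery port (assumption)
        var sock = dgram.createSocket('udp4');

        sock.on('listening', function () {
          sock.setBroadcast(true);
          // Announce ourselves to the whole subnet once a second.
          setInterval(function () {
            var msg = new Buffer(JSON.stringify({ game: 'mygame', tcpPort: 9000 }));
            sock.send(msg, 0, msg.length, PORT, '255.255.255.255');
          }, 1000);
        });

        // Every peer binds the same port, so each one hears the others'
        // announcements (and its own, which can be filtered out by address).
        sock.on('message', function (msg, rinfo) {
          console.log('peer alive at', rinfo.address, '->', msg.toString());
        });

        sock.bind(PORT);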

    Read the article

  • Laptop Display Not Working

    - by etrask
    Hello, I just purchased this laptop. It is working fine but I want to use Xubuntu on it. I managed to get 10.04 installed and running but it did not recognize/use the wireless card. So I want to try 10.10. However, every time I try, the screen goes black and never comes back on. I have tried the Live CD and Alternate install CDs, for both Ubuntu and Xubuntu 10.10. After I select "Install Ubuntu" or "Try Ubuntu", the screen blacks out and never displays anything. So I deleted "quiet" and "splash" off the command line to see what messages would come up. A bunch of text flies by but it ends at:

        [time] TCP established hash table entries: 524288 (order: 11, 8388608 bytes)
        [time] TCP bind hash table entries: 65536 (order: 8, 1048576 bytes)
        [time] TCP: Hash tables configured (established 524288 bind 65536)
        [time] TCP reno registered
        [time] UDP hash table entries: 2048 (order: 4, 65536 bytes)
        [time] UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes)
        [time] NET: Registered protocol family 1

    And freezes here forever. I have tried nomodeset, vga=771, and xforcevesa with no progress. Is my video card simply not going to work with this distro? It seems strange that 10.04 would work fine but not 10.10. Any help is appreciated, thank you.

    Read the article

  • Tracking Outgoing Links With Google Analytics Events

    - by the_archer
    I've been trying to track clicks on external links on my website using the events tracking method. So I've got my Google Analytics code set up before body ends, as shown below:

        <script type='text/javascript'>
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-XXXXXXX-X']);
          _gaq.push(['_trackPageview']);
          (function() {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
          })();
        </script>

    Now I wanted to track a link on the addthis.com follow widget. So there is a link of the type below, to which, following instructions from here, I added the onClick event:

        <a addthis:url='http://feeds.feedburner.com/myfeedburnerlurl'
           onClick="_gaq.push(['_trackEvent', 'Subscription Clicks', 'RSS']);"
           class='addthis_button_rss_follow'/>

    I clicked on it a couple of times and left it for over a day now, but nothing shows up in Google Analytics events; it just says zero events. Here's a screenshot of the events page on GA. Could anybody help me? Am I doing anything wrong?

    Read the article

  • How to Set Up a Hadoop Cluster Using Oracle Solaris (Hands-On Lab)

    - by Orgad Kimchi
    Oracle Technology Network (OTN) published the "How to Set Up a Hadoop Cluster Using Oracle Solaris" OOW 2013 Hands-On Lab. This hands-on lab presents exercises that demonstrate how to set up an Apache Hadoop cluster using Oracle Solaris 11 technologies such as Oracle Solaris Zones, ZFS, and network virtualization. Key topics include the Hadoop Distributed File System (HDFS) and the Hadoop MapReduce programming model. We will also cover the Hadoop installation process and the cluster building blocks: NameNode, a secondary NameNode, and DataNodes. In addition, you will see how you can combine the Oracle Solaris 11 technologies for better scalability and data security, and you will learn how to load data into the Hadoop cluster and run a MapReduce job.

    Summary of Lab Exercises

    This hands-on lab consists of 13 exercises covering various Oracle Solaris and Apache Hadoop technologies:

    1. Install Hadoop.
    2. Edit the Hadoop configuration files.
    3. Configure the Network Time Protocol.
    4. Create the virtual network interfaces (VNICs).
    5. Create the NameNode and the secondary NameNode zones.
    6. Set up the DataNode zones.
    7. Configure the NameNode.
    8. Set up SSH.
    9. Format HDFS from the NameNode.
    10. Start the Hadoop cluster.
    11. Run a MapReduce job.
    12. Secure data at rest using ZFS encryption.
    13. Use Oracle Solaris DTrace for performance monitoring.

    Read it now

    Read the article

  • USB mouse pointer only moving horizontally on macbook 6.2 with 12.04

    - by Glyn Normington
    After installing Ubuntu 12.04 on a macbook pro 6.2, the touchpad and external USB mouse worked perfectly. After rebooting I can't get either the touchpad or the external USB mouse to work. Sometimes no mouse pointer is visible, but more often I can only move the mouse pointer horizontally five sixths of the way across the display (from the top left). I have uninstalled mouseemu. xinput list shows the USB mouse. xinput query-state for the USB mouse shows the following:

        ButtonClass
          button[1]=up
          ...
          button[16]=up
        ValuatorClass Mode=Relative Proximity=In
          valuator[0]=480
          valuator[1]=2400
          valuator[2]=0
          valuator[3]=3

    and re-issuing this command with the pointer at its right-hand extreme displays the same except for:

        valuator[0]=1679

    So valuator[0] seems to be the x-coordinate of the pointer, and the range of motion 480-1679 is indeed about five sixths of the display width (1440). valuator[1] is suspiciously large given that the display height is 900. Perhaps this is a side effect of having previously been using a dual monitor (although booting with that monitor connected does not help). There are other entries listed under xinput list: Virtual core XTEST pointer, which seems stuck at position (840,1050), and bcm5974, which seems stuck at position (837,6700). Removing the bcm5974 module using rmmod disables the touchpad as expected but does not fix the USB mouse problem. After adding the module back, it is stuck at position (840,1050) instead of (837,6700). /etc/X11/xorg.conf was generated by nvidia-settings and contains:

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "ZAxisMapping" "4 5"
        EndSection

    although I don't know how plausible these settings are. Any suggestions?

    Read the article

  • Is there an elegant way to track multiple domains under separate accounts with google analytics?

    - by J_M_J
    I have a situation where a content management system uses the same template for multiple websites with different domain names, and I can't make a separate template for each. However, each website needs to be tracked with Google Analytics. Would it be appropriate to track each domain as shown below, by putting in some conditional code? And would this be robust enough not to break? Is there a more elegant way to do this?

        <script type="text/javascript">
          var _gaq = _gaq || [];
          switch (location.hostname) {
            case 'www.aaa.com':
              _gaq.push(['_setAccount', 'UA-xxxxxxx-1']);
              break;
            case 'www.bbb.com':
              _gaq.push(['_setAccount', 'UA-xxxxxxx-2']);
              break;
            case 'www.ccc.com':
              _gaq.push(['_setAccount', 'UA-xxxxxxx-3']);
              break;
          }
          _gaq.push(['_trackPageview']);
          (function() {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            (document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(ga);
          })();
        </script>

    Just to be clear, each website is a separate domain name and must be tracked separately, NOT different domains with the same pages on one analytics profile.

    Read the article

  • Is there a way to make the Catalyst driver work in Trusty for the Radeon HD 4330?

    - by Laurent BERNABE
    Though the official Catalyst software 13.1 is suitable for the ATI Radeon HD 4330, it can't be installed on Ubuntu 14.04, as it doesn't support Xorg >= 7.6. As I need proprietary drivers for Trusty, I would like to know if there is a way to bypass this limitation (for example, by fetching the driver sources)? Here are some results from the terminal:

        $ Xorg -version
        X.Org X Server 1.15.1
        Release Date: 2014-04-13
        X Protocol Version 11, Revision 0
        Build Operating System: Linux 3.2.0-37-generic x86_64 Ubuntu
        Current Operating System: Linux bordeaux80 3.13.0-27-generic #50-Ubuntu SMP Thu May 15 18:06:16 UTC 2014 x86_64
        Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.13.0-27-generic root=UUID=4015e6f7-d11a-45fd-ac9b-5b6c7ab9eaa0 ro quiet splash vt.handoff=7
        Build Date: 16 April 2014 01:36:29PM
        xorg-server 2:1.15.1-0ubuntu2 (For technical support please see http://www.ubuntu.com/support)
        Current version of pixman: 0.30.2
        Before reporting problems, check http://wiki.x.org to make sure that you have the latest version.

        $ xrandr
        Screen 0: minimum 320 x 200, current 1366 x 768, maximum 8192 x 8192
        LVDS connected primary 1366x768+0+0 (normal left inverted right x axis y axis) 353mm x 198mm
           1366x768       60.0*+
           1280x720       59.9
           1152x768       59.8
           1024x768       59.9
           800x600        59.9
           848x480        59.7
           720x480        59.7
           640x480        59.4
        VGA-0 disconnected (normal left inverted right x axis y axis)
        HDMI-0 disconnected (normal left inverted right x axis y axis)

        $ uname -rp
        3.13.0-27-generic x86_64

        $ glxinfo | grep OpenGL
        OpenGL vendor string: X.Org
        OpenGL renderer string: Gallium 0.4 on AMD RV710
        OpenGL core profile version string: 3.1 (Core Profile) Mesa 10.1.0
        OpenGL core profile shading language version string: 1.40
        OpenGL core profile context flags: (none)
        OpenGL core profile extensions:
        OpenGL version string: 3.0 Mesa 10.1.0
        OpenGL shading language version string: 1.30
        OpenGL context flags: (none)
        OpenGL extensions:

    Regards

    Read the article

  • OAuth2 vs Public API

    - by Adam Tannon
    My understanding of OAuth (2.0) is that it's a software stack and protocol that allow two or more web apps to share information about a single end user. User A is a member of Site B and Site C; Site B wants to fetch some data from Site C about User A, and this is where OAuth steps in. So first off, if this assessment is incorrect, please begin by clarifying this for me and correcting me! Assuming I'm on the right track, then I guess I'm not seeing the need for OAuth to begin with (!). I'm sure I'm just not seeing the forest for the trees here, but the way I see it, couldn't Site C just expose a public API that Site B could use to fetch the same data (sans OAuth)? If Site C required user credentials to access the data, couldn't this public API just use HTTPS for secure transport and require username/password as a part of each API call? Again, I'm sure I'm missing something, but I'm just not understanding why I would need OAuth when a secure, public API written and exposed by Site C seems more than capable of delivering what Site B needs regarding User A. In general, I'm looking for a set of guidelines to go by when deciding between using OAuth for my web apps and just writing my own web service (exposing a public API). Thanks in advance!

    Read the article

  • How to set up dual monitors in xorg.conf

    - by MrMonty
    # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 295.33 (buildd@allspice) Fri Mar 30 15:25:24 UTC 2012

        Section "ServerLayout"
            # Removed Option "Xinerama" "1"
            # Removed Option "Xinerama" "0"
            # Removed Option "Xinerama" "1"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor1"
            VendorName "Unknown"
            ModelName "Ancor Communications Inc VE247"
            HorizSync 30.0 - 83.0
            VertRefresh 50.0 - 76.0
            Option "DPMS"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "Ancor Communications Inc VE247"
            HorizSync 30.0 - 83.0
            VertRefresh 50.0 - 76.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device1"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "Quadro FX 1500"
            BusID "PCI:1:0:0"
            Screen 1
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "Quadro FX 1500"
        EndSection

        Section "Screen"
            Identifier "Screen1"
            Device "Device1"
            Monitor "Monitor1"
            DefaultDepth 24
            Option "TwinView" "0"
            Option "TwinViewXineramaInfoOrder" "DFP-1"
            Option "metamodes" "DFP-1: 1280x1024 +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

        Section "Screen"
            # Removed Option "TwinView" "0"
            # Removed Option "metamodes" "DFP-0: 1280x1024 +0+0"
            # Removed Option "TwinView" "1"
            # Removed Option "metamodes" "DFP-0: 1280x1024 +0+0, DFP-1: 1280x1024 +1280+0"
            # Removed Option "TwinView" "0"
            # Removed Option "metamodes" "DFP-0: 1280x1024 +0+0"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinView" "1"
            Option "TwinViewXineramaInfoOrder" "DFP-0"
            Option "metamodes" "DFP-0: 1280x1024 +0+0, DFP-1: 1280x1024 +1280+0; DFP-1: 1280x1024_60 +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

        Section "Extensions"
            Option "Composite" "Disable"
        EndSection

    That's my file!

    Read the article

  • Doubt related to PHP Cookies

    - by Richa
    Hey guys! I have a doubt; I will appreciate it if you can clear it up.

    COOKIES

    What are cookies? When described as entities, which is how cookies are often referenced in conversation, you can be easily misled. Cookies are actually just an extension of the HTTP protocol. Specifically, there are two additional HTTP headers: Set-Cookie and Cookie. The operation of these cookies is best described by the following series of events:

    1. Client sends an HTTP request to server.
    2. Server sends an HTTP response with Set-Cookie: foo=bar to client.
    3. Client sends an HTTP request with Cookie: foo=bar to server.
    4. Server sends an HTTP response to client.

    Thus, the typical scenario involves two complete HTTP transactions. In step 2, the server is asking the client to return a particular cookie in future requests. In step 3, if the user's preferences are set to allow cookies, and if the cookie is valid for this particular request, the browser requests the resource again but includes the cookie.

    Now my question is: why can you not determine whether a user's preferences are set to allow cookies during the first request?
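    The four-step exchange above is the whole answer: the server only learns whether cookies are enabled when, and if, a cookie comes back on a later request, so the first request by itself can never tell. The standard trick is to set a test cookie and redirect, then check the second request. A minimal sketch, in Node.js for brevity (PHP's setcookie() followed by a header('Location: ...') redirect works the same way):

        var http = require('http');

        http.createServer(function (req, res) {
          if (req.url === '/') {
            // Transaction 1: ask the client to keep a cookie, then bounce it
            // to a second URL so we can see whether the cookie comes back.
            res.writeHead(302, { 'Set-Cookie': 'probe=1', 'Location': '/check' });
            res.end();
          } else if (req.url === '/check') {
            // Transaction 2: only now can the server tell.
            var hasCookie = /probe=1/.test(req.headers.cookie || '');
            res.end(hasCookie ? 'cookies enabled' : 'cookies disabled');
          }
        }).listen(8080);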

    Read the article

  • Not able to track traffic on subdomain using Google Analytics

    - by Steven
    I'm trying to track traffic for my sub-domain, but it's not happening. This is how it's set up: my partner has a domain called sub1.partner.com, which points to partner1.mydomain.com. The idea is that users think they are browsing my partner's website, when they are in fact browsing pages on my server. My tracking code looks like this:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']);
        _gaq.push(['_setDomainName', '.mysite.com']);
        _gaq.push(['_trackPageview']);
        (function() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();

    In Google Analytics I've created a new account under my main account and called it partner1.mysite.com. On this account I have created a filter:

        Filter type: include
        Filter field: Host name
        Filter pattern: partner1.mysite.no
        Case sensitive: No

    What more can I try to track traffic on my subdomain?

    UPDATE

    Question 1: Is this line correct?

        _gaq.push(['_setDomainName', '.mysite.com']);

    Question 2: Is it correct that I have to add \ before any punctuation, like so: \. in filters?

    Read the article

  • x server startx error?

    - by Chris
    root@thewhitewox:~# startx

        X.Org X Server 1.7.6
        Release Date: 2010-03-17
        X Protocol Version 11, Revision 0
        Build Operating System: Linux 2.6.24-29-server i686 Ubuntu
        Current Operating System: Linux thewhitewox 2.6.18-238.9.1.el5.028stab089.1 #1 SMP Thu Apr 14 14:06:01 MSD 2011 i686
        Kernel command line: quiet
        Build Date: 20 October 2011 03:05:54PM
        xorg-server 2:1.7.6-2ubuntu7.10 (For technical support please see http://www.ubuntu.com/support)
        Current version of pixman: 0.16.4
        Before reporting problems, check http://wiki.x.org to make sure that you have the latest version.
        Markers: (--) probed, (**) from config file, (==) default setting, (++) from command line, (!!) notice, (II) informational, (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
        (==) Log file: "/var/log/Xorg.0.log", Time: Tue Nov 15 00:50:32 2011
        (==) Using config directory: "/usr/lib/X11/xorg.conf.d"

        Fatal server error:
        xf86OpenConsole: Cannot open /dev/tty0 (No such file or directory)

        Please consult the The X.Org Foundation support at http://wiki.x.org for help.
        Please also check the log file at "/var/log/Xorg.0.log" for additional information.

        ddxSigGiveUp: Closing log

    What does this mean and how can I fix it?

    Read the article

  • Over-scan Issues when using HDTV through VGA

    - by RPG Master
    Right now all we can do is set the TV to 1280x768 instead of its native resolution of 1360x768. Setting it to its native resolution gives you a screen with a large portion of the left side of the screen cut off. We've tried everything with the TV, so now we're turning to the innards of Ubuntu in hopes of fixing this. The computer is using an NVIDIA GeForce GT240. This is its current xorg.conf:

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 1.0 (buildd@palmer) Fri Apr 9 10:35:18 UTC 2010

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: builtin, VertRefresh source: builtin
            # HorizSync 28.0 - 55.0
            # VertRefresh 43.0 - 72.0
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "CRT-0"
            HorizSync 28.0 - 55.0
            VertRefresh 43.0 - 72.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce 6600"
        EndSection

        Section "Screen"
            # Removed Option "metamodes" "1360x768 +0+0; 800x600 +0+0"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinView" "0"
            Option "TwinViewXineramaInfoOrder" "CRT-0"
            Option "metamodes" "1360x768 +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    Read the article

  • Xorg crashes ever since using Nvidia dual monitors

    - by legion
    Well, ever since I have used dual monitors with the NVIDIA X Server Settings program, my xorg process has been crashing after a while, and it's generally a pretty long while, like 6 hours afterwards. Before NVIDIA changed my xorg.conf file, I only had xorg crash like twice in 2 months. I can't figure out what is going on. I am running Ubuntu 12.04 with the MATE desktop environment v1.2.0. xorg.conf:

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 295.33 (buildd@allspice) Fri Mar 30 15:25:24 UTC 2012

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "LEN"
            HorizSync 51.8 - 55.8
            VertRefresh 40.0 - 60.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "NVS 3100M"
        EndSection

        Section "Screen"
            # Removed Option "TwinView" "0"
            # Removed Option "metamodes" "DFP-0: nvidia-auto-select +0+0"
            # Removed Option "TwinView" "1"
            # Removed Option "metamodes" "DFP-1: nvidia-auto-select +0+0, CRT: nvidia-auto-select +1920+0"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinViewXineramaInfoOrder" "DFP-1"
            Option "TwinView" "0"
            Option "metamodes" "DFP-0: nvidia-auto-select +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    Read the article

  • Wine causes twin view to break

    - by deanvz
    I have endlessly been playing around with the Nvidia X Server Settings and changing my xorg.conf file to try and make it work for me, and on most days it's fine. In each instance I get it working for a while, and then this morning the most bizarre thing happened: the moment I open any type of Wine program (which never used to be a problem), my TwinView setup disappears and I am left with mirrored displays. I try to change the settings in the Nvidia driver, but it's not interested and the screens remain mirrored. I have a workaround: restart my PC... Below are the contents of my current xorg.conf file.

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 295.33 (buildd@zirconium) Fri Mar 30 13:43:34 UTC 2012

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "LG Electronics W1934"
            HorizSync 30.0 - 83.0
            VertRefresh 56.0 - 75.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce 9400 GT"
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinView" "1"
            Option "metamodes" "CRT-0: nvidia-auto-select +0+0, CRT-1: nvidia-auto-select +1440+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    Read the article

  • Multi-screen and Nvidia GT220

    - by Bohors
    Excuse me first for my broken English. I have 3 monitors on two Nvidia GT220 cards. Unity 2D works fine after using the proprietary drivers, but there is no way to use Compiz effects. I'm on Ubuntu 11.10 (I also tested 12.04 out of curiosity, with the same result). I noticed in "System Settings" / "Details" that my graphics card is not recognized (graphics card "unknown"), and "View" only returns my first screen (with the correct resolution), also unknown.

        $ lspci | grep VGA
        01:00.0 VGA compatible controller: NVIDIA Corporation GT216 [GeForce GT 220] (rev a2)
        05:00.0 VGA compatible controller: NVIDIA Corporation GT216 [GeForce GT 220] (rev a2)

    xorg.conf:

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0" 0 0
            Screen 1 "Screen1" Below "Screen0"
            Screen 2 "Screen2" RightOf "Screen0"
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
            Option "Xinerama" "1"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "Acer V243HL"
            HorizSync 30.0 - 80.0
            VertRefresh 55.0 - 75.0
            Option "DPMS"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor1"
            VendorName "Unknown"
            ModelName "LG Electronics L1920P"
            HorizSync 30.0 - 83.0
            VertRefresh 56.0 - 75.0
            Option "DPMS"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier "Monitor2"
            VendorName "Unknown"
            ModelName "Acer V243HL"
            HorizSync 30.0 - 80.0
            VertRefresh 55.0 - 75.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce GT 220"
            BusID "PCI:1:0:0"
        EndSection

        Section "Device"
            Identifier "Device1"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce GT 220"
            BusID "PCI:5:0:0"
            Screen 0
        EndSection

        Section "Device"
            Identifier "Device2"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
            BoardName "GeForce GT 220"
            BusID "PCI:5:0:0"
            Screen 1
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "TwinView" "0"
            Option "TwinViewXineramaInfoOrder" "DFP-0"
            Option "metamodes" "1920x1080_60_0 +0+0; nvidia-auto-select +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

        Section "Screen"
            Identifier "Screen1"
            Device "Device1"
            Monitor "Monitor1"
            DefaultDepth 24
            Option "TwinView" "0"
            Option "metamodes" "CRT: nvidia-auto-select +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

        Section "Screen"
            Identifier "Screen2"
            Device "Device2"
            Monitor "Monitor2"
            DefaultDepth 24
            Option "TwinView" "0"
            Option "metamodes" "DFP: nvidia-auto-select +0+0"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

        Section "Extensions"
            Option "Composite" "Disable"
        EndSection

    Do you have any idea? Thanks.

    Read the article
