Search Results

Search found 80052 results on 3203 pages for 'data load performance'.


  • How is load balancing in big systems implemented?

    - by uther-lightbringer
    Hello, I'm wondering how load balancing is implemented in really big applications like Google or Facebook. I know that in a normal scenario there may be a machine dedicated to this task, but I would like to know how it is handled in really big applications with hundreds of thousands of people accessing them at any given time. I'm also wondering what exactly happens when someone types google.com: how does that request find its way to a concrete computer? Are there multiple load balancers, and how is it set up and implemented so that a user's request finds its way to a concrete balancer out of the many others? I would really appreciate it if someone could enlighten me on this issue, thank you.

    Read the article

  • Using .dll methods to load data from file in C# code

    - by Espinas.iss
    I want to use these methods in C#: int LibRaw::open_datastream(LibRaw_abstract_datastream *stream), int LibRaw::open_file(const char *rawfile), int LibRaw::open_buffer(void *buffer, size_t bufsize), int LibRaw::unpack(void), and int LibRaw::unpack_thumb(void), which are stored in libraw.dll. These functions, one by one, load data from a file... I've been reading about P/Invoke but I'm not sure how to invoke them. Can anyone show me an example of how to use all of these functions together in C# to load a file (a raw image stored in a folder), or just how to P/Invoke one of them? Thanks!
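
    The C++ member functions listed above cannot be called through P/Invoke directly; DllImport only binds plain C exports. A minimal sketch, assuming the DLL also ships LibRaw's flat C API (libraw_init, libraw_open_file, libraw_unpack, libraw_unpack_thumb, libraw_close); verify those entry-point names against your libraw.dll build (e.g. with dumpbin /exports) before relying on them:

    ```csharp
    using System;
    using System.Runtime.InteropServices;

    static class LibRawNative
    {
        // Assumed C-API exports; check the DLL's actual export table.
        [DllImport("libraw.dll", CallingConvention = CallingConvention.Cdecl)]
        public static extern IntPtr libraw_init(uint flags);

        [DllImport("libraw.dll", CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
        public static extern int libraw_open_file(IntPtr handle, string path);

        [DllImport("libraw.dll", CallingConvention = CallingConvention.Cdecl)]
        public static extern int libraw_unpack(IntPtr handle);

        [DllImport("libraw.dll", CallingConvention = CallingConvention.Cdecl)]
        public static extern int libraw_unpack_thumb(IntPtr handle);

        [DllImport("libraw.dll", CallingConvention = CallingConvention.Cdecl)]
        public static extern void libraw_close(IntPtr handle);
    }

    class Program
    {
        static void Main()
        {
            IntPtr raw = LibRawNative.libraw_init(0);                               // create a processing handle
            try
            {
                int rc = LibRawNative.libraw_open_file(raw, @"C:\raw\sample.cr2");  // example path
                if (rc == 0) rc = LibRawNative.libraw_unpack(raw);
                if (rc == 0) rc = LibRawNative.libraw_unpack_thumb(raw);
                Console.WriteLine("LibRaw return code: " + rc);                     // 0 means success
            }
            finally
            {
                LibRawNative.libraw_close(raw);                                     // release native resources
            }
        }
    }
    ```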

    Read the article

  • Passenger problem: "no such file to load" -- /config/environment

    - by Mason Jones
    I've been researching this one and found references to similar problems here and there, but none of them has led to a solution yet. I've installed passenger (2.2.11) and nginx (0.7.64) and when I start things up and hit a Rails URL, I get an error page informing me of a load error: no such file to load -- /path/to/app/config/environment From what I've found online this appears to be some sort of a user/permissions error, but I've tried all the logical fixes: I've made sure that /config/environment.rb is not owned by root, but by a webapp user. I've tried setting passenger_default_user, I've tried setting passenger_user_switching off. I've even tried setting the nginx user, though that shouldn't matter much. I've gotten some differing results, but nothing's actually worked. I'm hoping someone may have the magical combination of settings and permissions for this. I may try backing down to an earlier version of Passenger, because I've never had this issue before; it's been a little while since I set up Passenger though. Thanks for any suggestions.

    Read the article

  • Twitter RSS feed, [domdocument.load]: failed to open stream:

    - by dave1019
    Hi, I'm using the following: <?php $doc = new DOMDocument(); $doc->load('http://twitter.com/statuses/user_timeline/XXXXXX.rss'); $arrFeeds = array(); foreach ($doc->getElementsByTagName('item') as $node) { $itemRSS = array ( 'title' => $node->getElementsByTagName('title')->item(0)->nodeValue, 'desc' => $node->getElementsByTagName('description')->item(0)->nodeValue, 'link' => $node->getElementsByTagName('link')->item(0)->nodeValue, 'date' => $node->getElementsByTagName('pubDate')->item(0)->nodeValue ); array_push($arrFeeds, $itemRSS); } for($i=0;$i<=3;$i++) { $tweet=substr($arrFeeds[$i]['title'],17); $tweetDate=strtotime($arrFeeds[$i]['date']); $newDate=date('G:ia l F Y ',$tweetDate); if($i==0) { $b='style="border:none;"'; } $tweetsBox.='<div class="tweetbox" ' . $b . '> <div class="tweet"><p>' . $tweet . '</p> <div class="tweetdate"><a href="http://twitter.com/XXXXXX">@' . $newDate .'</a></div> </div> </div>'; } return $tweetsBox; ?> to return the 4 most recent tweets from a given timeline (XXXXXX is the relevant feed). It seems to work fine, but I've recently been getting the following error sporadically: PHP error debug Error: DOMDocument::load(http://twitter.com/statuses/user_timeline/XXXXXX.rss) [domdocument.load]: failed to open stream: HTTP request failed! HTTP/1.1 502 Bad Gateway. I've read that the above code is dependent on Twitter being available, and I know it gets rather busy sometimes. Is there either a better way of retrieving tweets, or is there some kind of error trapping I could do to just display a "tweets are currently unavailable..." kind of message rather than causing an error? I'm using the MODX CMS, so any parse error kills the site rather than just outputting a warning. Thanks.
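
    One way to trap that failure without touching the rest of the code: DOMDocument::load() returns false when the HTTP fetch fails, so suppress the warning it emits, check the return value, and fall back to a placeholder message (a sketch, not MODX-specific):

    ```php
    <?php
    $doc = new DOMDocument();
    // @ silences the warning load() raises on a 502/timeout; the call still returns false on failure.
    if (@$doc->load('http://twitter.com/statuses/user_timeline/XXXXXX.rss') === false) {
        return '<div class="tweetbox">Tweets are currently unavailable...</div>';
    }
    // ...otherwise build $tweetsBox from $doc exactly as in the original code...
    ?>
    ```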

    Read the article

  • Upgrade or replace?

    - by Felix
    My current PC is about four years old, although I have made upgrades to it throughout its existence. The current specs are: (old) Intel Pentium D 2.80Ghz (32K L1 / 2M L2), Gigabyte 945GCMX-S2 motherboard (old) 2.5GB DDR2 (slot0: 512MB @ 533Mhz; slot1: 2GB @ 667Mhz) (new) HIS Radeon HD 4670 - I think this is limited by the motherboard not supporting PCIe 2.0 (?) (old) WD Caviar 160GB - pretty slow (new) WD Caviar Black 640GB (if any more specs are relevant, let me know and I'll add them) Now, on to my question. I've been having performance issues lately, both in video games and in intensive applications. A couple of examples: Android application development (running Eclipse and the Android emulator) is painfully slow (on Linux). I only realized this when, at my new job as an Android dev, both tools are MUCH quicker. (I'm not sure what CPU I have there) The guys at my new job got me NFS Hot Pursuit, in which I barely get like 5-10FPS, even with graphics options turned all the way down My guess is that the bottleneck in my system is my CPU, so I'm thinking of upgrading to a Quad Core i5 + new motherboard + 4GB DDR3 (or more, 'cause I know you'll all jump and say 8GB minimum). Now: Is that a good idea? Is my CPU really a bottleneck, or is the whole system too old and I should replace it? I run Windows 7 on the old, 160GB HDD (which is on IDE, by the way). Could this slow down games as well? Should I get a new drive for Windows if I want to play new games? I know nothing about power supplies. Could that be a problem / will it be a problem if I upgrade to an i5? How come DiRT2 works on full graphics settings (pretty amazing graphics by the way) and NFS Hot Pursuit pulls only 5-10FPS?

    Read the article

  • How to get Flex app to load quicker?

    - by Shef
    We've got an app written in Flex that displays data from our app. The .swf file is only 427kb, but it takes a full five seconds to load in Firefox. This is a headache for our users because they need to access the page that contains the app frequently. (The app displays documents, and it's really slow to march through a list of them). I've confirmed that it's not a slow web server problem. The .swf appears to be cached in the browser. Firebug reports that every time the web page accesses the .swf, the app server returns a "304 Not Modified" response, meaning that the load time from the server is almost zero. Is there anything we can do to debug this issue? Or is the Flash player just slow?

    Read the article

  • Receiving "EXC_BAD_ACCESS" before any UI elements load (on device only)

    - by morgancodes
    Everything works fine on the simulator, but I get EXC_BAD_ACCESS when I try to load my app on the device. I've put in a bunch of NSLogs to try and catch where it happens, but the log statements are never reached. Also, the UI doesn't load. So it seems the problem is happening before any of my code is reached. I tried a clean build, with no luck. I also tried building and installing a different app, which works fine. So it looks like there's something wrong with my app, but it's something that happens before any of my code gets called. Any ideas?

    Read the article

  • Loading a local HTML file doesn't reference the JS files in UIWebView

    - by Hero Vs Zero
    I am working on a UIWebView project and I want to load an HTML file from a project resource. It works fine when I run it from a URL, but when I view the HTML file locally, the JS files it references are not loaded. Here's my code to load the HTML file from the project's local resources: NSString *path = [[NSBundle mainBundle] pathForResource:@"textfile" ofType:@"txt"]; NSError *error = nil; NSString *string = [[NSString alloc] initWithContentsOfFile:path encoding:NSUTF8StringEncoding error:&error]; NSString *path1 = [[NSBundle mainBundle] bundlePath]; NSURL *baseURL = [NSURL fileURLWithPath:path1]; NSLog(@"%@ >>> %@",baseURL,path); [webview loadHTMLString:string baseURL:baseURL]; This code doesn't find the JS files in UIWebView, even though it loads image files from the project resources successfully.
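
    A quick sanity check worth adding (a sketch; "script.js" stands in for whatever file the HTML actually references): ask the bundle for the JS file directly. If the path comes back nil, the file was never copied into the app bundle (a common cause is Xcode placing .js files under Compile Sources instead of Copy Bundle Resources), and no baseURL will let the web view find it:

    ```objc
    NSString *jsPath = [[NSBundle mainBundle] pathForResource:@"script" ofType:@"js"];
    if (jsPath == nil) {
        // Not in the bundle: check the target's Copy Bundle Resources build phase.
        NSLog(@"script.js is missing from the app bundle");
    } else {
        NSLog(@"script.js found at %@", jsPath);
    }
    ```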

    Read the article

  • rvm `require': no such file to load -- rubygems (LoadError)

    - by xxd
    Running a Ruby script, I got the error "rvm `require': no such file to load -- rubygems (LoadError)". bash-3.2$ rvm --default ruby-2.0.0-p451 -bash-3.2$ rvm list rvm rubies =* ruby-2.0.0-p451 [ x86_64 ] -bash-3.2$ gem list --local *** LOCAL GEMS *** bigdecimal (1.2.0) bundler (1.5.3) bundler-unload (1.0.2) executable-hooks (1.3.1) gem-wrappers (1.2.4) io-console (0.4.2) json (1.7.7) minitest (4.3.2) net-ssh (2.9.1) psych (2.0.0) rake (0.9.6) rdoc (4.0.0) rubygems-bundler (1.4.2) rvm (1.11.3.9) test-unit (2.0.0.0) -bash-3.2$ gem list --local rubygems *** LOCAL GEMS *** rubygems-bundler (1.4.2) Running the script with ruby test.rb gives: `require': no such file to load -- rubygems (LoadError) $ cat test.rb require 'rubygems' require 'net/ssh' Net::SSH.start(............. What's going on? Please advise. Thanks.
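
    Since "no such file to load" is the wording Ruby 1.8 uses (and Ruby 1.9+ loads RubyGems automatically), a likely culprit is that the script is being run with the system Ruby rather than the RVM one. A quick check, as a sketch:

    ```sh
    which ruby      # should point somewhere under ~/.rvm/rubies/ruby-2.0.0-p451/
    ruby -v         # should report 2.0.0p451, not a system 1.8.x
    rvm use ruby-2.0.0-p451 --default
    ruby test.rb
    ```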

    Read the article

  • How can I optimize my ajax calls to deliver at 60 ms?

    - by Quintin Par
    I am building autocomplete functionality for my site, and the Google Instant results are my benchmark. When I look at Google, the 50-60 ms response time baffles me; it looks insane. In comparison, here's how mine looks. To give you an idea, my results are cached on the load balancer and served from a machine that has httpd slow start and initcwnd fixed. My site is also behind CloudFlare. From a server-side perspective I don't think I can do anything more. Can someone help me take this 500 ms response time down to 60 ms? What more should I be doing to achieve Google-level performance? Edit: People, you seemed to be angry that I made a comparison to Google and that the question was very generic. Sorry about that. To rephrase: how can I bring the response time down from 500 ms to 60 ms, given that my server response time is just a fraction of a millisecond? Assume the results are served from Nginx - Varnish with a cache hit. Here are some answers I would give myself, assuming the response sizes remain more or less the same: ensure results are HTTP compressed; ensure SPDY if you are on HTTPS; ensure you have initcwnd set to 10 and disable slow start on Linux machines; etc. I don't think I'll end up at Google-level 60 ms, but your collective expertise can easily help shave off 100 ms, and that's a big win.
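
    For the first of those steps (HTTP compression), a minimal sketch of enabling gzip for JSON responses in nginx; the directive values are examples to tune, not recommendations:

    ```nginx
    gzip on;
    gzip_types application/json;   # nginx only compresses text/html by default; add the JSON MIME type
    gzip_min_length 256;           # skip tiny payloads where the gzip overhead outweighs the gain
    gzip_comp_level 5;
    ```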

    Read the article

  • Delay-Load equivalent in unix based systems

    - by saran
    What is the delay-load equivalent on Unix-based systems? I have a source file foo.cpp; when compiling with gcc I link it against shared objects (there are three .so files in total), each .so file for a different option: ./foo -v needs libversion.so, and ./foo -update needs libupdate.so. I need the symbols from those libraries to be resolved only at run time; ./foo -v should not break even if libupdate.so is not there. This works in Windows using the delay-load option (in the DLL properties). What is its equivalent on Unix systems? Does the '-lazy' option do the same on UNIX? If so, where do I put this option (in the makefile or with the linker, ld)? I am not good with Unix. Please help me. Thanks in advance.
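
    The closest Unix equivalent is not a linker flag but explicit runtime loading with dlopen/dlsym (link with -ldl on Linux). A minimal sketch, assuming libupdate.so exports a function called run_update; the library and symbol names here are placeholders:

    ```c
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        /* Load the library only when the corresponding option is actually used. */
        void *handle = dlopen("libupdate.so", RTLD_LAZY);
        if (handle == NULL) {
            fprintf(stderr, "update support unavailable: %s\n", dlerror());
            return 1;   /* ./foo -v keeps working even when libupdate.so is absent */
        }

        void (*run_update)(void) = (void (*)(void))dlsym(handle, "run_update");
        if (run_update != NULL)
            run_update();
        else
            fprintf(stderr, "symbol lookup failed: %s\n", dlerror());

        dlclose(handle);
        return 0;
    }
    ```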

    Read the article

  • Load only some columns with Hibernate native SQL queries

    - by Alessandro Dionisi
    I have a table in the database and I want to load only some columns from the result set. I defined a native SQL query in the hbm file: <sql-query name="query"> <return alias="r" class="RawData"/> <![CDATA[ SELECT DESCRIPTION as {r.description} FROM RAWD_RAWDATAS r WHERE r.RAWDATA_ID=? ]]> </sql-query> This query, however, fails with the error: could not read column value from result set: RAWDATA1_14_0_; Invalid column name SQL Error: 17006, SQLState: null, because Hibernate tries to load all mapped fields from the result set. I also found a bug in the Hibernate JIRA (http://opensource.atlassian.com/projects/hibernate/browse/HHH-3035). Does anyone know how to accomplish this with a workaround?
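
    A common workaround (a sketch; it returns plain values instead of RawData entities, which may or may not suit the caller) is to map the result as scalars rather than a full entity, so Hibernate only reads the columns you name:

    ```xml
    <sql-query name="query">
        <return-scalar column="description" type="string"/>
        <![CDATA[
            SELECT r.DESCRIPTION as description
            FROM RAWD_RAWDATAS r
            WHERE r.RAWDATA_ID = ?
        ]]>
    </sql-query>
    ```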

    Read the article

  • Windows Server 2008 R2 grinds to a screeching halt during file copy operations

    - by skolima
    When my Windows Server 2008 R2 machine is performing any large disk operations (copying 10GB files from one drive to another, copying a similar file over the network, merging Hyper-V snapshots, compressing large files), the performance of the whole machine slows down terribly and everything becomes unresponsive. This is noticeable in any situation where the disk access is large enough not to fit in the cache. Are there any settings available for tuning this behaviour? I can accept slower file transfer if this gives me more responsiveness. System details: Dell OptiPlex 960, Core 2 Quad Q9650, 8GB RAM, 2 SATA drives - 320GB (ST3320418AS) and 1TB (ST31000528AS), NCQ active on both, Intel 82564LM-3 Gigabit Ethernet, ATI HD 3450 graphics, Intel ICH10 bridge. We have multiple machines like this, and every one exhibits the same behaviour. I thought this was overkill for a workstation; apparently I was mistaken. Update: I guess I shouldn't have mentioned Hyper-V at all. The above configuration is a standard workstation setup at the company I work for; this is not a server of any kind. I have at most 3 virtual machines running, and usually I'm the only person accessing them. Nevertheless, the slowdown occurs even when no VMs are running. On a Linux machine I'd simply ionice the copy process and forget about it; is there any way to manage IO priorities on Windows?

    Read the article

  • Ubuntu's garbage collection cron job for PHP sessions takes 25 minutes to run, why?

    - by Lamah
    Ubuntu has a cron job set up which looks for and deletes old PHP sessions: # Look for and purge old sessions every 30 minutes 09,39 * * * * root [ -x /usr/lib/php5/maxlifetime ] \ && [ -d /var/lib/php5 ] && find /var/lib/php5/ -depth -mindepth 1 \ -maxdepth 1 -type f -cmin +$(/usr/lib/php5/maxlifetime) ! -execdir \ fuser -s {} 2> /dev/null \; -delete My problem is that this process is taking a very long time to run, with lots of disk IO. Here's my CPU usage graph: The cleanup running is represented by the teal spikes. At the beginning of the period, PHP's cleanup jobs were scheduled at the default 09- and 39-minute marks. At 15:00 I removed the 39-minute run from cron, so a cleanup job twice the size runs half as often (you can see the peaks get twice as wide and half as frequent). Here are the corresponding graphs for IO time: And disk operations: At the peak, where there were about 14,000 sessions active, the cleanup can be seen to run for a full 25 minutes, apparently using 100% of one core of the CPU and what seems to be 100% of the disk IO for the entire period. Why is it so resource-intensive? An ls of the session directory /var/lib/php5 takes just a fraction of a second. So why does it take a full 25 minutes to trim old sessions? Is there anything I can do to speed this up? The filesystem for this device is currently ext4, running on Ubuntu Precise 12.04 64-bit. EDIT: I suspect that the load is due to the unusual process "fuser" (since I expect a simple rm to be a damn sight faster than the performance I'm seeing). I'm going to remove the use of fuser and see what happens.
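
    For reference, the experiment described in the edit amounts to dropping the fuser check from the find invocation; a sketch of the trimmed command (same paths and maxlifetime helper as the stock cron entry) would be:

    ```sh
    find /var/lib/php5/ -depth -mindepth 1 -maxdepth 1 -type f \
        -cmin +$(/usr/lib/php5/maxlifetime) -delete
    # Note: without fuser, a session file still held open by a running request can also be deleted.
    ```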

    Read the article

  • Load python module not from a file

    - by user575061
    Hello, I've got some Python code in a library that attempts to load a simple value from a module that will exist for the applications that use this library: from somemodule import simplevalue Normally, the application that uses the library will have the module file and everything works fine. However, in the unit tests for this library the module does not exist. I know that I can create a temporary file and add that file to my path at runtime, but I was curious whether there is a way in Python to load something into memory that would allow the above import to work. This is more of a curiosity; saying "add the module to your test path" is not helpful :P
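
    One way to do this entirely in memory (a sketch; "somemodule" and "simplevalue" match the import above, and 42 is an arbitrary test fixture) is to build a module object and register it in sys.modules before the library code runs, since that is the first place the import machinery looks:

    ```python
    import sys
    import types

    # Create an empty module object and give it the attribute the library expects.
    fake = types.ModuleType("somemodule")
    fake.simplevalue = 42

    # Register it; any later "from somemodule import simplevalue" resolves to this object.
    sys.modules["somemodule"] = fake

    from somemodule import simplevalue
    print(simplevalue)  # -> 42
    ```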

    Read the article

  • jQuery $.getJSON: "Failed to load resource: cancelled"

    - by Alex
    I'm having a problem loading a JSON resource from a local Rails app with jQuery 1.4.4. The JSON is valid (according to jsonlint.com) and I can download it properly if I request it from other sources. In WebKit (Safari), I get this error: Failed to load resource: cancelled Response header in Firebug: Content-Type application/json; charset=utf-8 Set-Cookie geoloc=toulouse; path=/; Connection close Server thin 1.2.7 codename No Hup jQuery code to load the JSON: $.getJSON("http://127.0.0.1/search_agenda", {'edition': edition, 'categories': categories}, function(data){ console.log(data); });
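
    If the page making the call is served from a different origin than http://127.0.0.1 (a different port is enough), a cancelled request is the classic same-origin symptom. One thing to try, as a sketch, is letting jQuery fall back to JSONP by appending callback=?, in which case the Rails action has to render a JSONP response:

    ```javascript
    $.getJSON(
        "http://127.0.0.1/search_agenda?callback=?",   // callback=? makes $.getJSON use JSONP
        { edition: edition, categories: categories },
        function (data) {
            console.log(data);
        }
    );
    ```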

    Read the article

  • Ubuntu 12.04 VirtualBox on powerful W7 quite slow

    - by wnstnsmth
    I own a ThinkPad T420s with 8GB RAM, a 160 GB SSD and a quite fast i7 processor - all in all a very fast computer that works perfectly. Now, I am not very impressed by the performance of my Ubuntu 12.04 virtual machine running on VirtualBox 4.1.18. I understand that virtual machines are always a bit slower than the host system, but still I think it should be more performant given the hardware settings I give it: 4096 MB RAM; 1 CPU without CPU limitation (I would like to give it more, but then it does not seem to work - I am not experienced in this, so maybe somebody could give me advice on this too); activated PAE/NX, VT-x/AMD-V and Nested Paging; 96 MB graphics memory (no 2D or 3D acceleration); ~14 GB disk space, of which about 7 GB are currently used. Maybe I misconfigured something; could you give me a hint please? Thanks! Edit: What I mean by slow is that, for example, switching tabs in the browser (whether FF or Chrome) happens with about a 0.5 s delay, as does switching application windows and/or double-clicking applications in the dock to get all open windows; opening Aptana takes about a minute, whereas opening something like Photoshop on the host system takes 5 seconds.

    Read the article

  • Load all tab bar views when application first runs

    - by codenoobie
    Here's the problem. I have a tab bar controller with 4 separate views. When I navigate from the first view to the second view, it takes some time to load up the second view. What I want to do is be able to load and initialize all of my tab bar views during my splash screen. That way, when the user navigates between the tab views, there is no wait time. So the question is... how do I manually initialize my individual tab bar views in my app delegate? And once again, thank you stackoverflow!
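
    A common trick (a sketch, assuming the tab bar controller is already wired up in the app delegate; the property name is an example) is to touch each child controller's view once right after building the tab bar controller, which forces loadView/viewDidLoad to run up front instead of on first navigation:

    ```objc
    // In the app delegate, after the tab bar controller and its view controllers exist.
    for (UIViewController *controller in self.tabBarController.viewControllers) {
        // Accessing the view property forces the controller to load its view hierarchy now.
        [controller view];
    }
    ```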

    Read the article

  • New i7 is slower than old Core 2 Duo? Why? (BIOS programming)

    - by DrChase
    I've always wondered why the companies who make BIOS' either have terrible engineering psychologists or none at all. But without wasting your time further with random speculative questions, my real question is as follows: Why does my new computer run slower than my old computer? Old Computer: Intel Core 2 Duo CPU @ 3.0 Ghz (stock) 4GB OCZ DDR2 800 RAM Wolfdale E8400 mb nVidia GeForce 8600 GT New Computer: Intel Core i7 920 @ ~3.2 Ghz 6 GB OCZ DDR3 1066 RAM EVGA x58 SLI LE motherboard nVidia GeForce GTX 275 Vista x64 Home Premium on both. "Run slower" is defined as: - poorer FPS performance in the same games, applications - takes longer to start up - general desktop usage (checking email, opening up files, running exe's) is noticeably slower At first I thought I must've not set something up in the BIOS or something. But I have no idea how to set anything in the bios except for "Dummy O.C.", which brought me to ~3.2 Ghz. But beyond that I have no idea. I've been reading stuff about "ram timing" and voltages and the like but I really have no idea about that stuff. I'm a psychologist who has a basic understanding in building his own computers, not a computer scientist. Can someone give me some wisdom that might guide me to the reason my new computer is worse than my older one? I'm sorry if this is a bad question, or not appropriate to SO. I'm just pretty frustrated now and you all have helped me in the past so I figured I'd give it a shot. Thanks for your time.

    Read the article

  • ZFS Data Loss Scenarios

    - by Obtuse
    I'm looking toward building a largish ZFS pool (150TB+), and I'd like to hear people's experiences with data loss scenarios due to failed hardware, in particular distinguishing between instances where just some data is lost vs. the whole filesystem (or whether there even is such a distinction in ZFS). For example: let's say a vdev is lost due to a failure like an external drive enclosure losing power, or a controller card failing. From what I've read the pool should go into a faulted mode, but if the vdev is returned the pool should recover? Or not? Or if the vdev is partially damaged, does one lose the whole pool, some files, etc.? What happens if a ZIL device fails? Or just one of several ZILs? Truly any and all anecdotes or hypothetical scenarios backed by deep technical knowledge are appreciated! Thanks! Update: We're doing this on the cheap since we are a small business (9 people or so), but we generate a fair amount of imaging data. The data is mostly smallish files, by my count about 500k files per TB. The data is important but not uber-critical. We are planning to use the ZFS pool to mirror a 48TB "live" data array (in use for 3 years or so), and use the rest of the storage for 'archived' data. The pool will be shared using NFS. The rack is supposedly on a building backup generator line, and we have two APC UPSes capable of powering the rack at full load for 5 minutes or so.

    Read the article

  • SubSonic Change DropDown Value for Load Drops SUB

    - by GTJR
    I used the SubSonic generator to create some aspx pages, and it works fine. On some of the pages it automatically generated the dropdown boxes for foreign key values. How can I change the value used in the load drops code, or where do I need to change it? For instance, I have a workers table and a workersweek table. The workers table has workerid, firstname and lastname fields, and the workersweek table has a workerID field. The generator automatically set it up to show the firstname in the dropdown. I want to change the value to be both firstname and lastname. I am sure I will have to add code that does something like firstname + " " + lastname; I am just not sure where to do it within the code that was generated. I see the load drops sub, but it does not seem like that is the one I need to modify.

    Read the article

  • jquery, ajax, load post result to div

    - by mike
    Hello, I have a form that I need to post, showing the result in a div. I'm not having any problems making the ajax call, but I can't seem to figure out how to load the result into a div. I have tried: $.ajax({ type: 'POST', data: $('#someForm').serialize(), url: 'http://somedomain.com/my/url', success: function(data) { $('#someDiv').load(data); } }); but it does not seem to work properly. I'm not sure if this should even work, but the result is that the page I'm posting from is loaded into the div, and not the URL I'm posting to. Any help would be great! Thanks!
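
    A sketch of the usual fix: .load() expects a URL, so handing it the response body makes jQuery issue a second request rather than displaying the result; inject the returned markup with .html() instead:

    ```javascript
    $.ajax({
        type: 'POST',
        data: $('#someForm').serialize(),
        url: 'http://somedomain.com/my/url',
        success: function (data) {
            $('#someDiv').html(data);   // insert the response markup directly into the div
        }
    });
    ```

    (If the response were a URL rather than markup, $('#someDiv').load(data) would be the right call instead.)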

    Read the article
