Search Results

Search found 17852 results on 715 pages for 'load balancer'.

Page 413/715

  • Planning office network [closed]

    - by gakhov
    I'm planning to set up my office network from scratch and want to ask for professional opinions or tips. My office is connected to the Internet with a cable connection (100 Mb/s). The devices I would like to connect are a VoIP phone (RJ-11), a TV (WiFi/LAN), 3 laptops (WiFi), a few smartphones (WiFi), an iPad (WiFi), a Kindle (WiFi) and, probably, a media server (WiFi/LAN). As you can see, most of the load will be on WiFi connections (although even if the TV supports WiFi, it's probably better to connect it by LAN?). So, I need help choosing the best combination of routers (or even a single one?) to support stable connections for all these devices and minimize the total number of routers/adapters. Any thoughts? Thank you!

    Read the article

  • Self-hosted WordPress freezes for 20 seconds before sending the page

    - by TheoJones
    I have a website with WordPress and phpBB integrated - and it's been fine for months... then all of a sudden, it's unbelievably slow to load. There are no errors in the Apache logs, but looking at the page in Firebug shows a 20-second pause before the page is delivered. More confusingly, on the same server, with the same Apache installation, phpBB loads in 30 ms with no delay. I tried the hogdetector WordPress plugin, which indicates that the delay is before the header is sent - which seems to agree with Firebug. Any ideas how to troubleshoot this one further?
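    A quick way to confirm where those 20 seconds go is to time a request from the command line. A minimal sketch (replace the URL with one of the slow WordPress pages); curl's -w timing variables show whether the stall is in DNS, the TCP connect, or the wait for the first byte, which would point at PHP/WordPress rather than Apache or the network:

        curl -o /dev/null -s -w 'dns: %{time_namelookup}s  connect: %{time_connect}s  first byte: %{time_starttransfer}s  total: %{time_total}s\n' http://www.example.com/slow-page/

    If time_starttransfer accounts for almost all of the total, the delay is inside WordPress itself - a plugin or theme waiting on a dead external host (update checks, fonts, APIs) or a failing DNS lookup from PHP are common culprits worth ruling out.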

    Read the article

  • What does "Windows is not a real-time operating system" mean?

    - by hydroparadise
    I came across an application called LatencyMon that apparently does latency monitoring. I have always understood that the more load you put on the processor, the less responsive, or more latent, the system becomes. However, in the second section of the LatencyMon page, the first sentence says, "Windows is not a real-time operating system". That got me thinking. I mean, is this any different from any other operating system like Linux, Unix, or OS X? Are there any "real-time" operating systems? Or is this merely a marketing scheme to get you to buy their product? EDIT: Also, are there any examples of RTOSes out there?

    Read the article

  • 'Buy the app' landing page implementations

    - by benwad
    My site (using Django) has an app that I'm trying to push - I currently have a piece of middleware that redirects the user to a page advertising the app if they're accessing the site on an iPhone, then sets a cookie so that the user isn't bugged by the message every time they visit the site. This works fine; however, checking the page with the mobile Googlebot checker shows that the Googlebot gets stuck in the redirect (since it doesn't store cookies) and therefore won't index the proper content. So, I'm trying to think of an alternative implementation that won't hurt the site's Google ranking and won't have any other adverse effects. I've considered a couple of options: Redirect (the current solution), but don't redirect if the user agent matches the Googlebot's UA string (sketched below). This would be ideal; however, I'm not sure if Google likes its bot being treated differently from other users, and I'm afraid the site's ranking may somehow be penalised if I go ahead with this. Use a JavaScript popup instead of a redirect. This would make sure the Googlebot finds the content it needs; however, I envision this approach causing compatibility issues with the myriad mobile devices/browsers out there, and it may affect the page load time. How valid are these options? And is there a better option for implementing this feature? I've tried researching this topic but surprisingly can't find any reputable-looking blog posts that explore it. EDIT: I posted this on SF because it seemed unsuitable for SO, but if there's another site that would be better for this issue then I'd be happy to move the question elsewhere.
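    For reference, a minimal sketch of the first option as Django middleware - the class name, cookie name and landing URL are made up for illustration, and whether Google penalises this kind of user-agent exception is exactly the open question above:

        from django.http import HttpResponseRedirect

        APP_LANDING_URL = '/get-the-app/'  # hypothetical landing page

        class AppPromoMiddleware(object):
            """Redirect iPhone visitors to the app landing page once,
            but let crawlers straight through so indexing is unaffected."""

            def process_request(self, request):
                ua = request.META.get('HTTP_USER_AGENT', '').lower()
                if 'googlebot' in ua:
                    return None  # never redirect the crawler
                if 'iphone' in ua and not request.COOKIES.get('seen_app_promo'):
                    if request.path != APP_LANDING_URL:
                        response = HttpResponseRedirect(APP_LANDING_URL)
                        # don't bug this visitor again for 30 days
                        response.set_cookie('seen_app_promo', '1', max_age=60 * 60 * 24 * 30)
                        return response
                return None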

    Read the article

  • What is a proper MySQL replication configuration for frequent DB updates and rare selects?

    - by serg555
    We currently have one master DB on its own server and a slave DB on the app server. The app executes very frequent but light updates (like incrementing counters), and occasional (once every few minutes) heavy selects (which are the most important part of the app). When the app was connected only to the master DB there were no performance issues. With the introduction of the slave DB, the CPU load average on the app server increased to about 6-10 during those heavy select periods (from 3-4 before). When the server isn't running those frequent updates, select performance seems to stay within limits. So I have a feeling that the updates are what is causing the performance drop (also, these frequent updates are not critical, so if the slave DB doesn't have them in sync with the master for some time, that would be OK). What would be a good DB replication setup for this kind of app? What replication parameters could we tweak? Thanks.
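    Since the counter updates apparently don't need to reach the slave promptly (or at all), one sketch of a slave-side my.cnf - assuming the counters live in tables you can name, which is an assumption here - is to filter them out of replication and relax durability on the slave:

        # my.cnf on the slave (sketch; schema/table names are placeholders)
        [mysqld]
        # skip replaying the high-frequency counter writes
        replicate-ignore-table      = myapp.page_counters
        replicate-wild-ignore-table = myapp.stats_%

        # the slave is disposable, so trade durability for fewer disk flushes
        innodb_flush_log_at_trx_commit = 2
        sync_binlog = 0

    If the counters do have to arrive on the slave eventually, the other common pattern is to leave replication alone and instead move the heavy selects to a second, dedicated reporting slave so they stop competing with the app server's CPU.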

    Read the article

  • Oracle Enterprise Manager 12c Testing-as-a-Service Solution

    - by user810030
    With organizations spending as much as 50 percent of their QA time on non-test-related activities like setting up hardware and deploying applications and test tools, the cloud brings obvious benefits. A key component of Oracle Enterprise Manager, our Application Quality Management products have been helping our customers with application load testing, functional testing and test process management, as well as test data management, data masking and real application testing. These products enable customers to thoroughly test applications and their underlying infrastructure to help ensure the best quality, scalability and availability prior to deployment. Today, Oracle announced the Oracle Enterprise Manager 12c Testing-as-a-Service Solution. This solution will allow users to significantly decrease the time needed to set up a complete test environment, while enhancing testing efficiency. Please read the press release mentioned above and join us in our Enterprise Manager LinkedIn Group discussion on this topic (you need to be a member), or visit our booth this week during the EuroSTAR Software Testing conference in Amsterdam, where we can demo this solution. I hope you find this helpful. Stay Connected: Twitter | Facebook | YouTube | Linkedin | Newsletter

    Read the article

  • NFS users getting a laggy GUI experience

    - by elzilrac
    I am setting up a system (Ubuntu 12.04) that uses LDAP, PAM, and autofs to load users and their home folders from a remote server. One of the options for login is sitting down at the machine and starting a GUI session. Programs such as Chromium (browser) that perform many read/write operations in the ~/.cache and ~/.config directories are slowing down the GUI experience as well as putting strain on the NFS server, which is causing problems for other users. Ubuntu has the handy XDG_CONFIG_HOME and XDG_CACHE_HOME variables that can be set to change the default location of .cache and .config from the home folder to somewhere else. There are several places to set them, but most of them are not optimal. /etc/environment - pros: will work across all shells; cons: cannot use variables like $USER, so you can't give users different locations for .cache and .config - every user's new location would be the same directory. /etc/bash.bashrc - pros: $USER works, so you can place them in different folders; cons: only gets run for bash-compatible shells. ~/.pam_environment - pros: works regardless of shell; cons: cannot use system variables (like $USER), has its own syntax, and has to be created for every user.
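    One more place to set them, not covered above, is a script in /etc/profile.d/, which does allow per-user paths; whether it is sourced for your GUI logins depends on how the display manager starts the session, so that is worth verifying on 12.04. A sketch (the /var/cache/local prefix is an assumption - any local, non-NFS filesystem with a writable parent directory will do):

        # /etc/profile.d/local-xdg-dirs.sh (sketch)
        # Keep browser cache/config churn off the NFS-mounted home directories.
        export XDG_CACHE_HOME="/var/cache/local/$USER/.cache"
        export XDG_CONFIG_HOME="/var/cache/local/$USER/.config"
        mkdir -p "$XDG_CACHE_HOME" "$XDG_CONFIG_HOME"

    Note that relocating .config means per-user settings no longer follow people between machines; relocating only XDG_CACHE_HOME is the safer first step if that matters.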

    Read the article

  • Windows XP, svchost.exe connecting to different IPs with remote port 445

    - by Coll911
    I'm using Windows XP Professional SP2. Whenever I start Windows, svchost.exe starts connecting to all the possible IPs on the LAN, from 192.168.1.2 to 192.168.1.200. The local port ranges from 1000-1099 and the remote port is 445. After it's done with the local IPs, it starts connecting to other random IPs. I tried blocking connections to port 445 using the local security policies, but it didn't work. Is there any way I can prevent svchost from connecting to these IPs without installing a firewall? My PC slows down due to the load. I'd be thankful for any advice.
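    Before blocking anything, it's worth identifying which service inside svchost.exe is doing the scanning - outbound sweeps of port 445 are also a classic worm-propagation pattern, so a malware scan is a sensible first step. A sketch using only the built-in XP commands:

        rem list the services hosted by each svchost.exe instance, with PIDs
        tasklist /svc /fi "imagename eq svchost.exe"

        rem show which PID owns the outbound connections to port 445
        netstat -ano | findstr ":445"

    Matching the PID from netstat against the tasklist output tells you whether a legitimate service (e.g. Computer Browser/Server looking for shares) is responsible, in which case disabling that service is a firewall-free fix, or whether something less welcome is hiding in that process.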

    Read the article

  • Differentiating between user script input formats

    - by KChaloux
    I have a .NET project at work that provides a couple of (Iron)Python scripts to the customers, to allow them to customize the output of the program. The application generates code for certain machines, and supports a couple of different formats. Until recently, we only provided a script for one format. We're expanding on that to include support for the others. If the user is using a script, they select their input script before generating the output code. A script designed for Format1 output is going to cause errors if they're trying to generate Format2 output. I need to deal with this. One option would just be to let the customers use common sense: if they load the wrong script it will just fail or, worse, produce inaccurate data. I'm inclined to provide a little more protection than that. At the moment I'm considering putting a shebang-style comment line at the top of the script, along the lines of: # OUTPUT - Format1. If the user tries to run a Format2 process with a Format1 script, it will warn them (a sketch of this check is below). Alternatively, I could create different file extensions for the input scripts that vary by type. The file-type comment approach helps prevent the script from actually loading improperly, at the cost of failing to warn the user until they've already selected it via a dialog box. Using different file extensions would allow me to cut down on visual clutter when providing a file dialog, but doesn't actually stop them from loading the wrong script. So I'm really not sure if the right approach is to just leave it alone, or to provide some safeguards.
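    A rough sketch of the header-comment check (shown as plain Python for brevity - in practice it would be a few lines of C# in the host application, and all names here are illustrative):

        # Refuse, or warn about, a script whose declared output format doesn't
        # match the format the user is about to generate.
        def declared_format(script_path):
            with open(script_path) as f:
                first_line = f.readline().strip()
            if first_line.startswith('# OUTPUT -'):
                return first_line.split('-', 1)[1].strip()  # e.g. 'Format1'
            return None  # legacy script with no header

        def check_script(script_path, requested_format):
            fmt = declared_format(script_path)
            if fmt is None:
                return 'warn'   # old script: allow it, but tell the user
            return 'ok' if fmt == requested_format else 'block'

    Nothing stops you combining this with distinct file extensions as well: the extension filters the file dialog, and the header check catches a renamed or mislabelled script at load time.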

    Read the article

  • Windows server response time very high

    - by Nagaraju Bandla
    Server specs: Windows Server 2008 R2 64-bit, provider: Fasthosts, .NET Framework 4.0, 6 GB RAM (it's using 4.6 GB). I have a website with thousands of pages, structured like folderone/1/one to 500.aspx, folderone/2/one to 500.aspx, ..., folderone/500/one to 500.aspx. Loading these pages for the first time after a release takes about 20 to 30 minutes per folder, and once one page in a folder has loaded, the rest of its pages load fine. This happens for all folders, and it repeats every time I restart the server, add anything to App_Code or change the web.config. My site mainly serves Google, and because of this problem it's returning errors. Any help will be highly appreciated - I'm happy to buy you a beer if it's resolved. Thanks in advance...
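    The per-folder 20-30 minute delay is consistent with ASP.NET dynamically compiling each directory's pages on the first request after an app-domain restart (which is what editing web.config or App_Code triggers). One standard mitigation, sketched here with placeholder paths, is to precompile the site after each release so the restart no longer costs a full compile:

        rem Precompile the site once per release (64-bit .NET 4.0 path assumed)
        C:\Windows\Microsoft.NET\Framework64\v4.0.30319\aspnet_compiler.exe -v / -p "C:\inetpub\mysite" "C:\inetpub\mysite_precompiled"

    You then point the IIS site at the precompiled copy. The IIS 7.5 Application Initialization module can additionally warm the application up after a recycle, so the first real visitor (or Googlebot) doesn't pay the start-up cost.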

    Read the article

  • Ubuntu 12.04 takes too long to boot

    - by msPeachy
    I've recently encountered the following error message: mount: mounting /dev/disk/by-uuid/3f7f5cd9d-6ea3-4da7-b5ec-**** on /root failed: Invalid argument; mount: mounting /sys on /root/sys failed: No such file or directory; mount: mounting /dev on /root/dev failed: No such file or directory; mount: mounting /sys on /root/sys failed: No such file or directory; mount: mounting /proc on /root/proc failed: No such file or directory; Target file system doesn't have /sbin/init. No init found. Try passing init= bootarg. BusyBox v1.18.5 (Ubuntu 1:1.18.5-1ubuntu4) built-in shell (ash). Enter 'help' for a list of built-in commands. (initramfs) _ I ran sudo fsck /dev/sda2 (the Ubuntu ext4 root partition) from a LiveCD. It checked and fixed the file system. The next time I booted, Ubuntu started to load with the Ubuntu logo and the dots underneath for several hours (with the mouse pointer active on the screen); I even left the computer on overnight, but it still had not successfully booted or reached the login screen by morning. I booted again with the LiveCD and checked the NTFS partitions with ntfsfix; again the NTFS partitions were checked and fixed successfully. I also edited my fstab and commented out the lines that auto-mount the NTFS partitions. The next time I booted, it took almost 20 minutes for Ubuntu to get to the login screen, and after typing the password it took an additional 10 minutes to get to the desktop. On the desktop, it takes several minutes to open any program - displaying the Dash alone takes 5 minutes! Is there a fix for this without having to reinstall Ubuntu? I don't see or get any errors; Ubuntu is just taking too long to boot and to run programs. Please help!
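    Repeated filesystem corruption plus a machine that is slow at everything, not just boot, often points to a failing disk rather than to Ubuntu itself. Before reinstalling, a health check from the live CD is worth the five minutes - a sketch, assuming the smartmontools package and that the drive is /dev/sda:

        sudo apt-get install smartmontools          # in the live session
        sudo smartctl -H /dev/sda                   # overall health verdict
        sudo smartctl -A /dev/sda | grep -Ei 'reallocated|pending|uncorrect'
        dmesg | grep -iE 'ata|i/o error'            # I/O errors seen during this session

    If SMART shows reallocated or pending sectors, the slowness is the drive retrying reads, and no amount of fsck or reinstalling will cure it.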

    Read the article

  • Linux: How to convert media files to DVD format and burn them?

    - by James.Elsey
    I have a load of media files on my PC, mostly AVI/MKV and some MPEGs. On Windows, I would use ConvertXToDVD to convert these to DVD format and burn them to disc. That application also lets you save a bit of space on the DVD so you can put the original file, in its AVI format, on the disc as well. How can I do this on Linux? What are the alternatives to this Windows application? I could try to run ConvertXToDVD under Wine, but I would prefer to find a native Linux solution. Thanks in advance!
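    A native command-line route is ffmpeg to transcode, dvdauthor to build the DVD structure, and growisofs to burn - a sketch, assuming PAL (use ntsc-dvd for NTSC) and that the packages are installed from the Ubuntu/Debian repositories:

        # transcode to a DVD-compliant MPEG-2 file
        ffmpeg -i input.mkv -target pal-dvd output.mpg

        # build the VIDEO_TS structure
        export VIDEO_FORMAT=PAL      # newer dvdauthor versions require this
        dvdauthor -o dvd/ -t output.mpg
        dvdauthor -o dvd/ -T

        # burn the result
        growisofs -dvd-compat -Z /dev/dvd -dvd-video dvd/

    If you'd rather have something closer to ConvertXToDVD, GUI front-ends such as DeVeDe wrap the same toolchain, menus included.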

    Read the article

  • Windows XP not starting up anymore (BSOD)

    - by Richard
    When I tried to start my computer running Windows XP yesterday, it was stuck at the "Welcome" screen when I turned on the monitor, and I had to reset it. After that, I got stuck at the point where the BIOS should load the OS - I turn on the PC, the motherboard and BIOS POST messages show up, the blinking cursor in the top left corner appears for 2 seconds, then everything is black. After it runs for some time in that state, it automatically restarts. I'm writing this from an Ubuntu 9.10 Live CD. What can I do to fix this? How can I find out what is wrong? I'm a programmer and okay with doing (very) advanced things in this area, so if you know something, just tell me and I'll try it.

    Read the article

  • How to get wireless to work on my HP Compaq nx6325 notebook on 12.04

    - by user211487
    When my laptop ran XP, the wireless worked perfectly. Now the only way I get internet is having it wired in my living room - which is massively inconvenient for me, seeing as my room is outside. I have downloaded the driver from 'Additional Drivers' and reset my computer (with difficulty - it also doesn't seem to like turning off, so I have to 'kill' it), and still nothing! I have looked all over the internet trying to find solutions, but I just can't seem to get it to work. I have a USB wireless stick, but I can't load the driver from the CD. I got this comment on a previous question: "Using either Synaptic Package Manager or Ubuntu's own Software Centre, remove 'bcmwl-kernel-source' if it's already installed. Next, search for 'firmware-b43-installer' and 'b43-fwcutter' and install both. Finally, running Software Sources and checking the Additional Drivers tab, you should find the Broadcom STA Wireless Driver, which you can select and install, and on reboot hopefully have wireless working OK..." To which I replied: "Right, so I tried to find/remove 'bcmwl-kernel-source' but couldn't find anything. I managed to install both 'firmware-b43-installer' and 'b43-fwcutter' using the terminal, after finding out how it worked. And lastly, I couldn't figure out how to find the 'Broadcom STA Wireless Driver' - still no additional drivers, and my wireless is still not working. My laptop has a button that turns WiFi on and off, and since I put Ubuntu on it, it hasn't worked. Also, my computer still doesn't seem to restart properly - it just gets stuck on the Ubuntu loading screen." I hope that helps somewhat. I am VERY new to Linux, not very technical and in need of help! Running Ubuntu 12.04 32-bit on an HP Compaq nx6325 notebook. Thank you to anyone who replies, it's much appreciated.
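    For reference, the terminal equivalent of that advice, plus a check of the hardware kill switch (this is a sketch, not a guaranteed fix - confirm with lspci first that the nx6325 really has a Broadcom BCM43xx chip):

        lspci -nn | grep -i network              # confirm the wireless chipset
        sudo apt-get remove bcmwl-kernel-source
        sudo apt-get install firmware-b43-installer b43-fwcutter
        rfkill list                              # 'Hard blocked: yes' = the WiFi button/switch is off
        sudo modprobe b43

    If rfkill reports a hard block that the WiFi button won't clear, the kill-switch handling itself is the problem (which would also explain the button doing nothing since the switch from XP) and is worth asking about as a separate question.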

    Read the article

  • Restart single uWSGI application (when it's in emperor mode)

    - by Oli
    I'm running uWSGI in emperor mode to host a bunch of Django sites based on their individual configs. These are supposed to reload when the emperor detects a change in the config file, and this largely works when I just touch the relevant uwsgi.ini file. But occasionally I'll mess something up in a Django site and the server won't load. Yeah, yeah, I should be testing better, but that's not really the point. When this happens, uWSGI seems to mark the site as dead and stops trying to run it (which seems to make sense). Even after I fix the underlying issue, no amount of touching will get that site's uWSGI process up and running. I have to reload the whole uWSGI server (knocking dozens of sites out at once for a few seconds). Is there a way to force uWSGI to just reload one of its sites?
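    Two things worth trying before a full emperor restart - a sketch, assuming a vassal directory layout like /etc/uwsgi/vassals/:

        # normal case: touching the vassal config makes the emperor reload just that site
        touch /etc/uwsgi/vassals/brokensite.ini

        # if the emperor has written the vassal off as dead, moving the config out of the
        # watched directory and back in forces it to be treated as a brand-new vassal
        mv /etc/uwsgi/vassals/brokensite.ini /tmp/ && sleep 2 && mv /tmp/brokensite.ini /etc/uwsgi/vassals/

    Newer uWSGI releases also support a per-vassal master FIFO (--master-fifo) that you can echo "r" into to force a graceful reload of just that instance, which avoids touching the emperor at all.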

    Read the article

  • Can I restore Windows 8/install Windows 7 from BIOS?

    - by Tom
    I recently got an ASUS K55A series laptop with Windows 8 on it, and have been trying to load Windows 7 on it for days, to no avail. I recently discovered how to get my Windows 7 install DVD to boot from the BIOS, but I deleted all of my Windows 8 system information from both partitions of my HDD, and Windows 7 setup now says it cannot install on the disk because of a partition format issue. I did not delete the recovery partition for Windows 8, but I can't get the HDD to show up in my boot menu in the BIOS, and none of the F keys get me to recovery mode (only DEL and F2 work, and they take me into the BIOS).
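    The "partition format issue" is most likely Windows 8's GPT partition table: a Windows 7 DVD booted in legacy/BIOS mode will only install to an MBR disk. If the Windows 8 recovery partition is expendable, the disk can be wiped and converted from the installer's command prompt (Shift+F10) - note that this erases everything on the drive, recovery partition included, so it's a judgment call:

        diskpart
        list disk
        select disk 0
        clean
        convert mbr
        exit

    The non-destructive alternative is to boot the Windows 7 media in UEFI mode (if the K55A's boot menu offers a UEFI entry for the DVD drive), since a UEFI install can go onto the existing GPT disk.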

    Read the article

  • Mitigating the 'firesheep' attack at the network layer?

    - by pobk
    What are sysadmins' thoughts on mitigating the 'Firesheep' attack for the servers they manage? Firesheep is a new Firefox extension that allows anyone who installs it to sidejack any session it can discover. It does its discovery by sniffing packets on the network and looking for session cookies from known sites. It is relatively easy to write plugins for the extension to listen for cookies from additional sites. From a systems/network perspective, we've discussed the possibility of encrypting the whole site, but this introduces additional load on the servers and interferes with site indexing, assets and general performance. One option we've investigated is to use our firewalls to do SSL offload, but as I mentioned earlier, this would require the whole site to be encrypted. What are the general thoughts on protecting against this attack vector? I've asked a similar question on Stack Overflow; however, it would be interesting to see what systems engineers think.
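    For what it's worth, the usual server-side answer is exactly the unpalatable one: HTTPS for all logged-in traffic, plus the Secure (and HttpOnly) flags on the session cookie so it is never sent over plain HTTP. If the front end happened to be nginx, the redirect-plus-HSTS half of that would look roughly like this sketch (hostnames and certificate paths are placeholders, and the Secure cookie flag itself is set in the application, not here):

        server {
            listen 80;
            server_name www.example.com;
            # send everything to HTTPS so session cookies never travel in the clear
            return 301 https://$host$request_uri;
        }

        server {
            listen 443 ssl;
            server_name www.example.com;
            ssl_certificate     /etc/ssl/example.crt;
            ssl_certificate_key /etc/ssl/example.key;
            # ask browsers to refuse plain HTTP for this host on future visits
            add_header Strict-Transport-Security "max-age=31536000";
            # proxy_pass / root and the rest of the site config go here
        }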

    Read the article

  • 2 workstations won't connect to most websites, but will connect to some

    - by Dean
    I have a very frustrating issue I haven't been able to solve: 2 workstations, used by the same user, are not able to connect to most websites (they receive a timeout), yet they will load some websites, specifically ones from my country. They are able to resolve website addresses via DNS. Both stations get their internet connection through a remote router. Other stations in the same LAN connect fine. Here's what I tried: a virus scan, renewing the IPs, resetting the workstations, moving one workstation to a different RJ-45 port in the wall, resetting the hub and switch, checking the hosts file, and a DNS flush. Nothing seems to help. I am preparing a CD with more AV tools to see if there's anything hiding on the stations. UPDATE: It was an incorrect configuration in "Internet Options". I configured the correct proxy and now it works.

    Read the article

  • Amazon EC2 vs Dedicated server at Hetzner, what's the use for EC2?

    - by C-Blu
    After searching the web I still can't find a reason to use EC2. What's the point of scaling on EC2? "If you expect a huge burst in traffic," they say. OK, but what if you already have a couple of sites with good traffic, and, for example, a medium reserved EC2 instance is not enough? You are paying $36.60 (medium reserved for 1 year) in the EU (Ireland), plus traffic, plus optional expenses for databases and S3 if you use them. Of course, at some point, while you are under $56.6-$66.1, you can optimize your hosting costs with Amazon EC2. But if at some point you purchase an EX4 server from Hetzner, it will surpass your performance needs for a long time before you get massive traffic (am I wrong?). CPU: i7-2600 quad-core (3.4-3.8 GHz); RAM: 16 GB; HDD: 2x3 TB SATA (6 Gbit/s) - I think the disk performance of a dedicated server is better than Amazon EBS; traffic: 10 TiB a month included. This is what you get from Hetzner for $56 (minus 19% VAT), or $66 for EU residents. Please tell me, what's the reason to use Amazon? Which load won't a server from Hetzner take that Amazon Auto Scaling will? Is the maintenance of a dedicated server vs EC2 really the same? Or won't a hardware failure at Amazon ruin your EBS storage? I'm still not at the level where I need expensive hosting, but I want to know beforehand, just to be sure whether Amazon's infrastructure is better than the raw performance of Hetzner's hardware.

    Read the article

  • Optimising news fetching

    - by aceBox
    I have a web scraper for scraping news from different sources on WP7. My current approach is: load newspaper information from an XML file; go to the specified sections and fetch the URLs of the news items; go to each URL and fetch the headline, image and publisher; display them using the MVVM architecture of Windows Phone. The whole thing happens asynchronously, meaning that as soon as a URL from a newspaper section is fetched it is added to the queue and the second stage, fetching the headline, image etc., starts... and as soon as this is fetched, even for one article, it is displayed. Later on, as more articles are fetched, they are added to the list. For the fetching I am using a SmartThreadPool (http://www.codeproject.com/Articles/7933/Smart-Thread-Pool) for Windows Phone. My problem is that even fetching around 80 items (in total) from 9 publications takes more than a minute. How can I speed up the procedure? Note: I have a two-stage approach because many times the images are not available with the headlines and are only found in the article.

    Read the article

  • Nginx Tornado Combination Causing 502 Bad Gateway Errors

    - by PlaidFan
    We are facing a problem with inconsistent 502 errors, and tracking down the reason has been a very frustrating exercise. We can reproduce the problem by sending several simultaneous requests quickly. The catch is that 'several' is only in the range of 10 to 20 within 5 seconds (not a typo), so clearly this type of load should be handled easily. We really like the Nginx + Tornado approach, but we are considering going back to a more traditional (e.g. threading) approach because this problem has been so difficult to solve. I was wondering if you a) know how to fix this issue and b) know how we can track down the culprit(s). The log files simply report the connection being refused. We have the same problem as this post: How do I debug a HTTP 502 error? But no answer is provided there on how to solve the problem, so I'm hoping you can help, because this may be a common issue with this type of setup. Thanks in advance, Paul
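    Since a single Tornado process is single-threaded, one slow or blocking handler stalls every other request on that process, and nginx then surfaces the refused or timed-out connection as a 502 - 10-20 overlapping requests is plenty if any of them block. The usual shape of the fix is several Tornado processes behind an nginx upstream; a sketch with placeholder ports:

        upstream tornado_backends {
            server 127.0.0.1:8001;
            server 127.0.0.1:8002;
            server 127.0.0.1:8003;
            server 127.0.0.1:8004;
        }

        server {
            listen 80;
            location / {
                proxy_pass http://tornado_backends;
                # if one backend is busy or down, retry the request on another
                proxy_next_upstream error timeout http_502;
            }
        }

    It's also worth auditing the handlers for blocking I/O (synchronous database calls, outbound HTTP without the async client), because that is the most common cause of exactly this symptom.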

    Read the article

  • APF, IPTABLES, Fedora 15 - Not blocking correctly

    - by RichardW11
    I just got a new remote server which came with Fedora 15. I first tried to run APF, but it gave me this error: "apf(18031): {glob} unable to load iptables module (ip_tables), aborting.". I then changed SET_MONOKERN="0" to SET_MONOKERN="1" to resolve that problem. However, with my config file showing BLK_P2P_PORTS="1214,2323,4660_4678,6257,6699,6346,6347,6881_6889,6346,7778", the ports show up as closed instead of filtered. Any idea why this would be happening? 22/tcp open ssh; 80/tcp open http; 443/tcp open https; 2323/tcp closed 3d-nfsd; 4662/tcp closed edonkey; 6346/tcp closed gnutella; 6699/tcp closed napster; 6881/tcp closed bittorrent-tracker; 7778/tcp closed interwise
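    As a side note, 'closed' versus 'filtered' in a scan is just RST versus silence: a closed port answers with a TCP reset, while a filtered one is silently dropped. So either the P2P rules aren't being loaded at all, or they reject instead of drop. Running "iptables -L -n | grep 6881" on the server will show whether a rule for those ports exists and what its target is; the raw-iptables equivalent of what BLK_P2P_PORTS is meant to achieve is a sketch like:

        iptables -A INPUT -p tcp --dport 2323 -j DROP
        iptables -A INPUT -p tcp --dport 4660:4678 -j DROP
        iptables -A INPUT -p tcp --dport 6881:6889 -j DROP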

    Read the article

  • Apache httpd.conf handle multiple domains to run the same application

    - by John Stewart
    What we are looking for is the ability to do the following: we have an application that can load certain settings based on the domain it is being accessed from. So if you come from xyz.com we show one logo, and if you come from abc.com we show a different logo. The code is the same and runs from the same server; it just detects the domain at runtime. Now we want to get a dedicated server (any suggestions?) that will let us point all the domains we want at this server (we change each domain's DNS to that of our server), so that when a user goes to any of those domains they run the same application. As far as I can understand, we will need to create a "VirtualHost" in Apache to handle this. Can we create a wildcard VirtualHost that catches all the domains? I am not an expert with Apache at all, so please forgive me if this turns out to be a silly question. Any detailed help would be great. Thanks
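    Yes - a single catch-all VirtualHost works, because Apache hands any Host header it doesn't recognise to the first matching VirtualHost, and ServerAlias accepts wildcards. A sketch (paths and names are placeholders; the application keeps reading the Host header, e.g. HTTP_HOST, at runtime to pick the right logo and settings):

        <VirtualHost *:80>
            ServerName   default.example.com
            ServerAlias  *
            DocumentRoot /var/www/myapp
        </VirtualHost>

    The only per-domain work left after that is DNS: point each domain's A record at the server, and the same vhost (and the same code) serves them all.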

    Read the article

  • Intermittent HTTP 401 errors

    - by forthrin
    I am using an intranet solution which requires basic HTTP authentication. However, there is an intermittent error which requires me to log in again, and then the server says "Forbidden" whether I give the correct login information or not. To add insult to injury, Safari (and Chrome) seems to show the login dialog for every included resource in the HTML, and it's impossible to cancel this sequence of modal dialogs, so the whole browser is blocked until I've pressed Esc some 30-odd times. After an hour, I may gain access again without having really done anything. My questions: What could cause intermittent 401 errors? Why do the browsers show the login dialog 30 times per page load (presumably once for every included resource in the HTML from the same domain)?

    Read the article

  • Reverse proxy with SSL and IP passthrough?

    - by Paul
    It turns out that the IP of a much-needed new website is blocked from inside our organization's network, for reasons that will take weeks to fix. In the meantime, could we set up a reverse proxy on an Internet-based server that forwards the SSL traffic, and perhaps the client IPs, to the external site? Load will be light. There is no need to terminate SSL on the proxy. We may be able to poison DNS so the original URL can keep working. How do I find out whether I need URL rewriting? Squid/Apache/nginx/something else? Setup would be fastest on Windows 2000, but other OSes are OK if that would help. Simple and quick are good, since it's a temporary solution. Thanks for your thoughts!
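    Since there is no need to terminate SSL, this is really just TCP forwarding on port 443, which leaves the certificate untouched; the trade-off is that the destination will see the proxy's IP rather than the client's (passing the real client IP through would require either terminating SSL on the proxy or the destination supporting something like the PROXY protocol). A sketch with HAProxy in TCP mode - the backend address is a placeholder:

        frontend https_in
            bind *:443
            mode tcp
            option tcplog
            default_backend blocked_site

        backend blocked_site
            mode tcp
            # the external site that is unreachable from inside the network
            server upstream1 203.0.113.10:443 check

    With internal DNS pointed (or poisoned) at the proxy, no URL rewriting is needed at all: the Host header and the TLS handshake still reference the original site, so the certificate continues to match.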

    Read the article
