Search Results

Search found 13645 results on 546 pages for 'bugz r us'.


  • PXE Boot Fedora 17 Error

    - by DrifterDave
    When trying to boot the latest Fedora 17 CD image via PXE, I am presented with the following error:

        dracut: fatal: no or empty root= argument

    So I added a root= line to my Fedora menu entry (shown below), but now receive the following error:

        dracut Warning: Unable to process initqueue

    Any assistance would be greatly appreciated. Fedora.menu:

        LABEL 1
          MENU LABEL fedora 17 (32-bit)
          KERNEL fedora/17/i386/vmlinuz0
          APPEND method=nfs:192.168.1.101:/srv/install/fedora/17/i386/ lang=us keymap=us ip=dhcp ksdevice=eth1 noipv6 root=/dev/ram0 initrd=fedora/17/i386/initrd0.img ramdisk_size=10000
          TEXT HELP
          Install Fedora 17 (32-bit)
          ENDTEXT

    Read the article
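
    For reference, vmlinuz0 and initrd0.img are the filenames produced by livecd-iso-to-pxeboot, and the live image's dracut initramfs expects a live root rather than the old method= installer syntax. If the goal is a network install, one option is to boot the installer's own kernel and initrd from the install tree and point it at the repository with repo= instead. This is a sketch only, assuming the pxeboot files have been copied out of os/images/pxeboot/ in the exported tree (those paths are assumptions; the NFS address is taken from the question):

        # hypothetical entry using the installer images rather than the live ones
        LABEL 1
          MENU LABEL fedora 17 (32-bit) installer
          KERNEL fedora/17/i386/vmlinuz
          APPEND initrd=fedora/17/i386/initrd.img repo=nfs:192.168.1.101:/srv/install/fedora/17/i386/ lang=us keymap=us ip=dhcp ksdevice=eth1 noipv6

    If the live image itself is wanted, dracut's live boot instead takes a root=live:<URL> argument pointing at the squashfs, which would explain the "no or empty root=" message when only root=/dev/ram0 is supplied.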

  • log shipping of biztalk database on SQL server 2008 standard edition

    - by Manjot
    Hi, I want to set up log shipping for the BizTalk databases from a SQL Server 2008 Standard Edition instance (server A) to another SQL Server 2008 Standard Edition instance (server B). I was told that for BizTalk, log shipping is not like standard log shipping. I was able to find two links: http://msdn.microsoft.com/en-us/library/cc296836%28v=BTS.10%29.aspx and http://msdn.microsoft.com/en-us/library/cc296741%28v=BTS.10%29.aspx, but they do not cover SQL Server 2008. Can anyone please help with this? Thanks in advance

    Read the article

  • Can thunderbird with -no-remote launch firefox without it?

    - by Simon
    We have two Thunderbird profiles, and we launch Thunderbird using -no-remote as follows:

        thunderbird -no-remote -P Profile1
        thunderbird -no-remote -P Profile2

    Unfortunately, this means that if either of us clicks on a link in an email while Firefox is already running, we get a warning telling us that Firefox is already running, rather than having it open the link. Is it possible for Thunderbird to use Firefox's remote, even though it has been launched with -no-remote? (If it makes a difference, this is on Xubuntu Karmic.)

    Read the article
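
    A possible workaround, offered as a sketch rather than a confirmed fix: -no-remote works by putting MOZ_NO_REMOTE=1 into the environment, and a Firefox launched from such a Thunderbird inherits it, which is what produces the "already running" dialog. Pointing Thunderbird's network.protocol-handler.app.http and .https preferences (Config Editor) at a small wrapper that clears the variable may help; the wrapper path below is hypothetical:

        #!/bin/sh
        # /usr/local/bin/firefox-from-tb (hypothetical path)
        # Clear the flag inherited from "thunderbird -no-remote" so links reuse
        # the already-running Firefox instead of trying to start a second one.
        unset MOZ_NO_REMOTE
        exec firefox "$@"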

  • Software for measuring internet traffic?

    - by Lachlan McDonald
    I'm interested in finding a free piece of software for Windows XP & 7 that allows us to measure both incoming and outgoing internet traffic, but not traffic between users of the local network. I live in a shared household with three others, and we're interested to see which of us is using the largest amount of our monthly internet quota. We're all happy to install the necessary software. Any suggestions?

    Read the article

  • Configuring varnish and django (apache/modwsgi)

    - by Hedde
    I am trying to work out why my application keeps hitting the database while I have set up Varnish in front of Apache. I think I am missing some vital configuration; any tips are welcome. This is my curl result:

        HTTP/1.1 200 OK
        Server: Apache/2.2.16 (Debian)
        Content-Language: en-us
        Vary: Accept,Accept-Encoding,Accept-Language,Cookie
        Cache-Control: s-maxage=60, no-transform, max-age=60
        Content-Type: application/json; charset=utf-8
        Date: Sat, 15 Sep 2012 08:19:17 GMT
        Connection: keep-alive

    My varnishlog:

        13 BackendClose - apache
        13 BackendOpen b apache 127.0.0.1 47665 127.0.0.1 8000
        13 TxRequest b GET
        13 TxURL b /api/v1/events/?format=json
        13 TxProtocol b HTTP/1.1
        13 TxHeader b User-Agent: curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8r zlib/1.2.3
        13 TxHeader b Host: foobar.com
        13 TxHeader b Accept: */*
        13 TxHeader b X-Forwarded-For: 92.64.200.145
        13 TxHeader b X-Varnish: 979305817
        13 TxHeader b Accept-Encoding: gzip
        13 RxProtocol b HTTP/1.1
        13 RxStatus b 200
        13 RxResponse b OK
        13 RxHeader b Date: Sat, 15 Sep 2012 08:21:28 GMT
        13 RxHeader b Server: Apache/2.2.16 (Debian)
        13 RxHeader b Content-Language: en-us
        13 RxHeader b Content-Encoding: gzip
        13 RxHeader b Vary: Accept,Accept-Encoding,Accept-Language,Cookie
        13 RxHeader b Cache-Control: s-maxage=60, no-transform, max-age=60
        13 RxHeader b Content-Length: 6399
        13 RxHeader b Content-Type: application/json; charset=utf-8
        13 Fetch_Body b 4(length) cls 0 mklen 1
        13 Length b 6399
        13 BackendReuse b apache
        11 SessionOpen c 92.64.200.145 53236 :80
        11 ReqStart c 92.64.200.145 53236 979305817
        11 RxRequest c HEAD
        11 RxURL c /api/v1/events/?format=json
        11 RxProtocol c HTTP/1.1
        11 RxHeader c User-Agent: curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8r zlib/1.2.3
        11 RxHeader c Host: foobar.com
        11 RxHeader c Accept: */*
        11 VCL_call c recv lookup
        11 VCL_call c hash
        11 Hash c /api/v1/events/?format=json
        11 Hash c foobar.com
        11 VCL_return c hash
        11 VCL_call c miss fetch
        11 Backend c 13 apache apache
        11 TTL c 979305817 RFC 60 -1 -1 1347697289 0 1347697288 0 60
        11 VCL_call c fetch deliver
        11 ObjProtocol c HTTP/1.1
        11 ObjResponse c OK
        11 ObjHeader c Date: Sat, 15 Sep 2012 08:21:28 GMT
        11 ObjHeader c Server: Apache/2.2.16 (Debian)
        11 ObjHeader c Content-Language: en-us
        11 ObjHeader c Content-Encoding: gzip
        11 ObjHeader c Vary: Accept,Accept-Encoding,Accept-Language,Cookie
        11 ObjHeader c Cache-Control: s-maxage=60, no-transform, max-age=60
        11 ObjHeader c Content-Type: application/json; charset=utf-8
        11 Gzip c u F - 6399 69865 80 80 51128
        11 VCL_call c deliver deliver
        11 TxProtocol c HTTP/1.1
        11 TxStatus c 200
        11 TxResponse c OK
        11 TxHeader c Server: Apache/2.2.16 (Debian)
        11 TxHeader c Content-Language: en-us
        11 TxHeader c Vary: Accept,Accept-Encoding,Accept-Language,Cookie
        11 TxHeader c Cache-Control: s-maxage=60, no-transform, max-age=60
        11 TxHeader c Content-Type: application/json; charset=utf-8
        11 TxHeader c Date: Sat, 15 Sep 2012 08:21:29 GMT
        11 TxHeader c Connection: keep-alive
        11 Length c 0
        11 ReqEnd c 979305817 1347697288.292612076 1347697289.456128597 0.000086784 1.163468122 0.000048399

    Read the article
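
    One pattern worth checking, sketched here in Varnish 3.x syntax and assuming the /api/ responses genuinely do not depend on cookies: the default VCL passes any request that carries a Cookie header straight to the backend, and Django's "Vary: ...,Cookie" stores a separate cache object per cookie value, either of which would keep the database busy even with s-maxage=60 set.

        sub vcl_recv {
            if (req.url ~ "^/api/") {
                # cookies make the default VCL bypass the cache
                unset req.http.Cookie;
            }
        }

        sub vcl_fetch {
            if (req.url ~ "^/api/") {
                unset beresp.http.Set-Cookie;
                # drop Cookie from Vary so one cached object serves all clients
                set beresp.http.Vary = regsub(beresp.http.Vary, ",? *Cookie", "");
            }
        }

    The curl test above sends no Cookie header, so it would cache even without this; a browser carrying Django's csrftoken/sessionid cookies is the case that usually slips through.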

  • Cannot access a very specific site from my router

    - by DJDarkViper
    This is a problem for me because this site is important to me. It's MY website, and sadly my email is hosted on it (which I can't access either). When I try to access my website while connected to my Linksys E3000 router, these days it simply doesn't go through. When I ping it, it's all Request Timed Out, and when I run tracert:

        C:\Users\Kyle>tracert blackjaguarstudios.com

        Tracing route to blackjaguarstudios.com [199.188.204.228]
        over a maximum of 30 hops:

          1    <1 ms    <1 ms    <1 ms  CISCO26565 [192.168.1.1]
          2    16 ms    15 ms    11 ms  11.4.64.1
          3    11 ms     9 ms    11 ms  rd1cs-ge1-2-1.ok.shawcable.net [64.59.169.2]
          4    20 ms    21 ms    22 ms  66.163.76.98
          5    37 ms    36 ms    35 ms  rc1nr-tge0-9-2-0.wp.shawcable.net [66.163.77.54]
          6   112 ms    84 ms    85 ms  rc2ch-pos9-0.il.shawcable.net [66.163.76.174]
          7    86 ms    89 ms    90 ms  rc4as-ge12-0-0.vx.shawcable.net [66.163.64.46]
          8    90 ms    84 ms    85 ms  eqix.xe-3-3-0.cr2.iad1.us.nlayer.net [206.223.115.61]
          9    97 ms    97 ms    99 ms  xe-3-3-0.cr1.atl1.us.nlayer.net [69.22.142.105]
         10   128 ms   128 ms   126 ms  ae1-40g.ar1.atl1.us.nlayer.net [69.31.135.130]
         11   101 ms    97 ms    96 ms  as16626.xe-2-0-5-102.ar1.atl1.us.nlayer.net [69.31.135.46]
         12   100 ms    97 ms   197 ms  6509-sc1.abstractdns.com [207.210.114.166]
         13     *        *        *     Request timed out.
         14     *        *        *     Request timed out.
         15     *        *        *     Request timed out.
         16     *        *        *     Request timed out.
         17     *        *        *     Request timed out.
         18     *        *        *     Request timed out.
         19     *        *        *     Request timed out.
         20     *        *        *     Request timed out.
         21     *        *        *     Request timed out.
         22     *        *        *     Request timed out.
         23     *        *        *     Request timed out.
         24     *        *        *     Request timed out.
         25     *        *        *     Request timed out.
         26     *        *        *     Request timed out.
         27     *        *        *     Request timed out.
         28     *        *        *     Request timed out.
         29     *        *        *     Request timed out.
         30     *        *        *     Request timed out.

        Trace complete.

        C:\Users\Kyle>

    Shaw Cable is my ISP. Figuring this was something to do with a setting I made on the router, I reset the thing back to factory defaults. Nope. So I'm at a bit of a loss what to do here, as NO device (computers, laptops, tablets, phones, PS3/360, etc.) can access my site or its features, so it's not just my computer. Every other site is just fine. When I connect to my neighbour's router, the site comes up just fine, and she's with Shaw as well. What should I do?!

    Read the article
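
    Given that the trace dies right after the hosting provider's edge (6509-sc1.abstractdns.com) and that every device behind this one public IP is affected while the neighbour's Shaw connection works, the pattern is consistent with the server or its firewall blocking that particular public IP rather than with a router fault. A few checks, sketched as commands run from any machine behind the router (the later hops may simply drop ICMP, so test plain TCP too):

        telnet blackjaguarstudios.com 80        # does a TCP connection to the web port open?
        curl -I http://blackjaguarstudios.com/  # or fetch just the response headers
        curl ifconfig.me                        # note the public IP and compare with the neighbour's

    If TCP from this IP is ignored but works from the neighbour's, it is worth asking the host whether that IP was banned on the server (a fail2ban/iptables rule is a common cause).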

  • What is the best error monitoring tool for Ruby on Rails applications?

    - by marcgg
    I'm looking for something to track the errors being raised across our multiple Rails applications on our multiple servers. Ideally, the failing application sends an email to [email protected] and the email is processed by another application, which then shows us some stats and gives us some kind of auditing tools. A service like Hoptoad might do the trick, but I'm trying to see what's available these days. Ideally free or cheap, of course...

    Read the article

  • Blank Page: wordpress on nginx+php-fpm

    - by troutwine
    Good day. While this post discusses a similar setup to mine serving blank pages occasionally after having made a successful installation, I am unable to serve anything but blank pages. My setup:

        Wordpress 3.0.4
        nginx 0.8.54
        php-fpm 5.3.5 (fpm-fcgi)
        Arch Linux

    Configuration Files

    php-fpm.conf:

        [global]
        pid = run/php-fpm/php-fpm.pid
        error_log = log/php-fpm.log
        log_level = notice

        [www]
        listen = 127.0.0.1:9000
        listen.owner = www
        listen.group = www
        listen.mode = 0660
        user = www
        group = www
        pm = dynamic
        pm.max_children = 50
        pm.start_servers = 20
        pm.min_spare_servers = 5
        pm.max_spare_servers = 35
        pm.max_requests = 500

    nginx.conf:

        user www;
        worker_processes 1;
        error_log /var/log/nginx/error.log notice;
        pid /var/run/nginx.pid;

        events {
            worker_connections 1024;
        }

        http {
            include mime.types;
            default_type application/octet-stream;
            sendfile on;
            keepalive_timeout 65;
            gzip on;
            include /etc/nginx/sites-enabled/*.conf;
        }

    /etc/nginx/sites-enabled/blog_sharonrhodes_us.conf:

        upstream php {
            server 127.0.0.1:9000;
        }

        server {
            error_log /var/log/nginx/us/sharonrhodes/blog/error.log notice;
            access_log /var/log/nginx/us/sharonrhodes/blog/access.log;
            server_name blog.sharonrhodes.us;
            root /srv/apps/us/sharonrhodes/blog;
            index index.php;

            location = /favicon.ico {
                log_not_found off;
                access_log off;
            }

            location = /robots.txt {
                allow all;
                log_not_found off;
                access_log off;
            }

            location / {
                # This is cool because no php is touched for static content
                try_files $uri $uri/ /index.php?q=$uri&$args;
            }

            location ~ \.php$ {
                fastcgi_split_path_info ^(.+\.php)(/.+)$;
                # NOTE: You should have "cgi.fix_pathinfo = 0;" in php.ini
                include fastcgi_params;
                fastcgi_intercept_errors on;
                fastcgi_pass php;
            }

            location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
                expires max;
                log_not_found off;
            }
        }

    Read the article
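
    For what it's worth, a frequent cause of nothing-but-blank-pages with nginx of this vintage plus php-fpm is that the stock fastcgi_params file does not define SCRIPT_FILENAME, so php-fpm is never told which script to execute and returns an empty response. A sketch of the PHP location with that one parameter added (everything else copied from the config above):

        location ~ \.php$ {
            fastcgi_split_path_info ^(.+\.php)(/.+)$;
            include fastcgi_params;
            # often missing from 0.8.x-era fastcgi_params; without it the page comes back empty
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_intercept_errors on;
            fastcgi_pass php;
        }

    Whether this applies here can be checked by grepping /etc/nginx/fastcgi_params for SCRIPT_FILENAME before changing anything.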

  • Odd squid transparent redirect behavior

    - by EMiller
    This is the first time I've set up squid. It's running a redirect script that does some text search/replace on HTML pages, saves them to a location on the same machine on the nginx path, and then issues the redirect to that URL (it's an art project :D). The relevant lines in squid.conf are:

        http_port 3128 transparent
        redirect_program /etc/squid/jefferson_redirect.py

    The jefferson_redirect.py script is based on this script: http://gofedora.com/how-to-write-custom-redirector-rewritor-plugin-squid-python/

    The issue: I'm getting strange HTTP redirect behavior. For example, here is the normal request/response from a PHP script that issues a header("Location:"); - a 302 redirect:

        http://redirector.mysite.com/?unicmd=g+yreka

        GET /?unicmd=g+yreka HTTP/1.1
        Host: redirector.mysite.com
        User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.9) Gecko/20100330 Fedora/3.5.9-1.fc12 Firefox/3.5.9
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip,deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 300
        Connection: keep-alive

        HTTP/1.1 302 Found
        Date: Tue, 13 Apr 2010 05:15:43 GMT
        Server: Apache
        X-Powered-By: PHP/5.2.11
        Location: http://www.google.com/search?q=yreka
        Content-Type: text/html
        Vary: User-Agent,Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 2108
        Keep-Alive: timeout=3, max=100
        Connection: Keep-Alive

    Here's what it looks like when running through the squid proxy (note that "redirector.mysite.com" is not the site running squid or nginx):

        http://redirector.mysite.com/?unicmd=g+yreka

        GET /?unicmd=g+yreka HTTP/1.1
        Host: redirector.mysite.com
        User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.9) Gecko/20100330 Fedora/3.5.9-1.fc12 Firefox/3.5.9
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip,deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 300
        Proxy-Connection: keep-alive
        If-Modified-Since: Tue, 13 Apr 2010 05:21:02 GMT

        HTTP/1.0 200 OK
        Server: nginx/0.7.62
        Date: Tue, 13 Apr 2010 05:21:10 GMT
        Content-Type: text/html
        Content-Length: 17865
        Last-Modified: Tue, 13 Apr 2010 05:21:10 GMT
        Accept-Ranges: bytes
        X-Cache: MISS from jefferson
        X-Cache-Lookup: HIT from jefferson:3128
        Via: 1.1 jefferson:3128 (squid/2.7.STABLE6)
        Connection: keep-alive
        Proxy-Connection: keep-alive

    It is basically working, but the URL http://redirector.mysite.com/?unicmd=g+yreka remains unchanged while displaying the Google page (mostly broken, as it's using URLs relative to redirector.mysite.com). I've experienced a similar thing with Google results pages: when clicking through to another site from Google, I get a Google URL with the other site's content. Sorry for the long post - many thanks if you've read this far! Any ideas?

    Read the article
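
    One observation, not a verified fix: a Squid redirector that simply returns a different URL makes Squid rewrite the request silently, so the browser keeps the original address while receiving the other site's body - which matches the symptoms above. Squid's redirect helpers can instead prefix the returned URL with "302:" to have Squid answer the client with a real HTTP redirect. A minimal sketch of the helper loop; rewrite_url() stands in for the existing search/replace-and-save logic in jefferson_redirect.py and is hypothetical:

        #!/usr/bin/env python
        import sys

        def rewrite_url(url):
            # placeholder for the existing rewrite/save-to-nginx logic
            return url

        while True:
            line = sys.stdin.readline()
            if not line:
                break
            url = line.split(' ')[0]
            new_url = rewrite_url(url)
            if new_url and new_url != url:
                # the "302:" prefix tells Squid to send the client a real redirect
                sys.stdout.write('302:%s\n' % new_url)
            else:
                sys.stdout.write('%s\n' % url)  # pass the URL through unchanged
            sys.stdout.flush()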

  • Change the language of fields in Microsoft Word

    - by Martin Wiboe
    Hi, I am using Word 2010 and some of its built-in field-based features, such as the bibliography. My Word installation is English and I am writing a report in US English; however, my computer has its locale set to Denmark. This affects the formatting of dates and some of the text in the auto-generated fields (e.g. the bibliography says "citeret:" instead of "cited:"). How can I change the language of the fields to US English? Thanks, Martin

    Read the article
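
    One avenue worth trying, offered as a sketch: the bibliography and citation fields accept a locale-ID switch, so forcing the US-English LCID (1033) in the field code (Alt+F9 toggles field codes) can override the Danish system locale for that field, and date fields take an explicit format picture. Doe10 below is a placeholder source tag:

        { BIBLIOGRAPHY \l 1033 }
        { CITATION Doe10 \l 1033 }
        { DATE \@ "MMMM d, yyyy" }

    Whether every built-in field honours the \l switch is worth verifying; setting the field text's proofing language (Review > Language) is the other knob to check.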

  • HTTP Content-type header for cached files

    - by Brian
    Hello. Using Apache with mod_rewrite, when I load a .css or .js file and view the HTTP headers, the Content-Type is only set correctly the first time I load it - subsequent refreshes are missing Content-Type altogether, and that is creating some problems for me. Specifically, gzip is not compressing these files. I can get around this by appending a random query string value to the end of each filename, e.g. http://www.site.com/script.js?12345. However, I don't want to have to do that, since caching is good and all I want is for the Content-Type to be present. I've tried using a RewriteRule to force the type, but that still didn't solve the problem. Any ideas? Thanks, Brian

    More details. HTTP headers WITHOUT the random query string value (http://localhost/script.js):

        GET /script.js HTTP/1.1
        Host: localhost
        User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3
        Accept: */*
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip,deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 115
        Connection: keep-alive
        Referer: http://localhost/
        Cookie: PHPSESSID=ke3p35v5qbus24che765p9jni5;
        If-Modified-Since: Thu, 29 Apr 2010 15:49:56 GMT
        If-None-Match: "3440e9-119ed-485621404f100"
        Cache-Control: max-age=0

        HTTP/1.1 304 Not Modified
        Date: Thu, 29 Apr 2010 20:19:44 GMT
        Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 PHP/5.3.1
        Connection: Keep-Alive
        Keep-Alive: timeout=5, max=100
        Etag: "3440e9-119ed-485621404f100"
        Vary: Accept-Encoding
        X-Pad: avoid browser bug

    HTTP headers WITH the random query string value (http://localhost/script.js?c947344de8278053f6edbb4365550b25):

        GET /script.js?c947344de8278053f6edbb4365550b25 HTTP/1.1
        Host: localhost
        User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3
        Accept: */*
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip,deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 115
        Connection: keep-alive
        Referer: http://localhost/
        Cookie: PHPSESSID=ke3p35v5qbus24che765p9jni5;

        HTTP/1.1 200 OK
        Date: Thu, 29 Apr 2010 20:14:40 GMT
        Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 PHP/5.3.1
        Last-Modified: Thu, 29 Apr 2010 15:49:56 GMT
        Etag: "3440e9-119ed-485621404f100"
        Accept-Ranges: bytes
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 24605
        Keep-Alive: timeout=5, max=100
        Connection: Keep-Alive
        Content-Type: application/javascript

    Read the article
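
    A note that may explain the capture above: the refresh without the query string is answered with 304 Not Modified, which carries no body and therefore legitimately omits Content-Type and Content-Encoding - the browser simply reuses its cached, already-compressed copy, so nothing is actually served uncompressed. To confirm that the cached URL still compresses when a full response is forced, something like this works:

        # bypass the conditional request so Apache returns a full 200 for the same URL
        curl -s --compressed -o /dev/null -D - -H 'Cache-Control: no-cache' http://localhost/script.js

    If the 200 shows Content-Type and Content-Encoding: gzip, as in the query-string capture, the missing headers on refresh are expected behaviour rather than a misconfiguration.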

  • 4.5 Load average on single CPU

    - by Webnet
    I'm currently looking at...

        top - 16:27:37 up 27 min,  1 user,  load average: 4.96, 3.75, 2.87
        Tasks: 141 total,   6 running, 135 sleeping,   0 stopped,   0 zombie
        Cpu(s): 91.4%us,  6.9%sy,  0.0%ni,  0.0%id,  0.0%wa,  0.3%hi,  1.3%si,  0.0%st
        Mem:    514952k total,   507500k used,     7452k free,     5652k buffers
        Swap:  1044184k total,   281400k used,   762784k free,    89164k cached

    This is a single 2.0 GHz CPU with 2 GB RAM. Is it time for an upgrade? I'm watching, and it seems to stick around 50% CPU "us", which I assume means usage.

    Read the article
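
    As an aside for readers: in top, %us is time spent in user space rather than overall "usage", and the Mem/Swap lines above show roughly 512 MB of RAM nearly exhausted with about 275 MB swapped out, so memory pressure is at least as likely a culprit as raw CPU speed. A couple of quick checks before deciding on an upgrade (GNU ps/vmstat assumed):

        ps aux --sort=-%cpu | head -n 10    # biggest CPU consumers
        ps aux --sort=-%mem | head -n 10    # biggest memory consumers
        vmstat 5 5                          # sustained si/so values mean the box is actively swapping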

  • openvz graphing user_beancounters

    - by Jona
    Hi there, we run a cluster of OpenVZ servers and are looking for a way to automatically graph the contents of user_beancounters for all VEs. We currently have a fairly rudimentary cron job which alerts us when limits are hit, but we would like a graphing solution to show us the history. Obviously we could roll our own using some fancy bash/php/perl and rrdtool, but we're wondering if there are any existing solutions before we go down this path. We currently run a Cacti/SNMP based graphing infrastructure.

    Read the article
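
    If rolling your own does become necessary, the raw numbers are easy to pull out of /proc/user_beancounters; a minimal sketch that prints resource, held, maxheld and failcnt for one container, suitable as a Cacti data-input script or an rrdtool feed (the VE id is a placeholder, the field positions assume the usual user_beancounters layout, and the file is only readable as root):

        #!/bin/sh
        VEID=101     # placeholder container id
        awk -v ve="${VEID}:" '
          $1 == ve        { inve = 1; print $2, $3, $4, $7; next }  # first row carries the uid prefix
          $1 ~ /:$/       { inve = 0 }                              # another container (or the header) starts
          inve && NF == 6 { print $1, $2, $3, $6 }                  # resource held maxheld failcnt
        ' /proc/user_beancounters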

  • Upgrade OEM Windows 7 Pro to Windows 7 Enterprise

    - by user17111
    Our hardware vendor has sold us a laptop that comes with Windows 7 Pro OEM. Since we want Windows 7 Enterprise (for features like DirectAccess and BitLocker), the vendor has supplied us with a Software Assurance-only license to upgrade OEM Windows 7 Pro to Windows 7 Enterprise. Do I need to re-install Windows from Windows 7 Enterprise Volume License media, or is there a process to turn this existing Windows 7 Pro install into Enterprise?

    Read the article

  • Windows 8 Speech Recognition Language

    - by Greg
    I've got Windows 8 Pro installed (RTM version from MSDN). For an application I use, I need to have the speech recognition language set to English - US, but the only option I have is English - UK. I have tried going to Language in Control Panel and setting the only language to English - US; however, English - UK is still the only option in Speech properties. How can I add a language to the Speech properties?

    Read the article

  • How to retrieve virtual machines from a pool via API in oVirt (RHEV)

    - by FerCa
    In oVirt (Red Hat Enterprise Virtualization) you can create a Virtual Machine Pool to allow your users to retrieve virtual machines from that pool. I found how a user can request a virtual machine from the pool in the RHEV User Portal; this is explained here: https://access.redhat.com/knowledge/docs/en-US/Red_Hat_Enterprise_Virtualization/3.0/html/Evaluation_Guide/Evaluation_Guide-Allocate_VM.html The thing is that I will need to retrieve virtual machines from the pool with the REST API and, after reading the documentation (https://access.redhat.com/knowledge/docs/en-US/Red_Hat_Enterprise_Virtualization/3.0/html-single/REST_API_Guide/index.html), I can't find a way to do this.

    Read the article
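
    For what it is worth, later oVirt/RHEV releases expose an "allocatevm" action on a pool resource in the REST API; whether the RHEV 3.0 API already advertises it can be checked in the RSDL (GET /api?rsdl) or in the action links returned for an individual pool. A sketch only - hostname, credentials and pool UUID are placeholders:

        curl -k -u 'admin@internal:password' \
             -H 'Content-Type: application/xml' -H 'Accept: application/xml' \
             -d '<action/>' \
             'https://rhevm.example.com/api/vmpools/00000000-0000-0000-0000-000000000000/allocatevm'

    If no such action link is listed for the pool, it is simply not available in that release.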

  • networked storage for a research group, 10-100 TB

    - by Marc
    This is related to this post: http://serverfault.com/questions/80854/scalable-24-tb-nas-for-research-department but perhaps a little more general.

    Background: We're a research lab of around 10 people who do a lot of experiments that involve taking pictures at one of several lab setups and then analyzing them on one of several lab computers. Each experiment may produce 2 or 3 GB of data, and we are generating data at a rate of about 10 TB/year. Right now we are storing the data on a 6-bay Netgear ReadyNAS Pro, but even with 2 TB drives this only gives us 10 TB of storage. Also, right now we are not backing up at all. Our short-term backup plan is to get a second ReadyNAS, put it in a different building and mirror the one onto the other. Obviously, this is somewhat non-ideal.

    Our options:

    1) We can pay our university $400/TB/year for "backed up" online storage. We trust them more than we trust us, but not a whole lot.
    2) We can continue to buy small NASs and mirror them between offices. One limit, although stupid, is that we don't have an unlimited number of ethernet jacks.
    3) We can try to implement our own data storage solution, which is why I'm asking you guys.

    One thing to consider is that we're a very transient population and none of us are network administration experts. I will probably be here only another year or so, and graduate students, who are here the longest, have a 5-6 year time scale, so nothing can require expert oversight. Our data transfer rates are low - most of the data will just sit on the server waiting for someone to look at it once or twice - so we don't need a really high-speed system.

    Given these constraints, can someone recommend a fairly low-cost, scalable, more or less turnkey shared data storage system with backup in a separate physical location? Does such a thing exist, or should we just pay the university to take care of it for us?

    As a second question, our professor just got tenure and is putting together a budget. Here the goal is to ask for as much as you can and hope you get a fraction of it. So the same question, minus the low-cost: without budget constraints, can you recommend a scalable, turnkey, backed-up storage system? Thanks

    Read the article

  • Tools to Hide IP address for webapp

    - by Jake Barti
    I am looking for paid software that lets me 'choose' an IP address from a different country and browse a site through it. So if I want to see how the site will look to US users, I should be able to choose an IP from the US. We are building a web app that will be used in many countries, and we want to make sure we test it before releasing. Any recommendations?

    Read the article

  • Looking for a reliable Personal VPN Service

    - by user38673
    I am looking for a reliable personal VPN service so that I can access sites like Pandora or Hulu even when I am not physically in the US. I have been using StrongVPN but their service is not reliable. Here are some of my key requirements:

      - Fast
      - Reliable
      - I don't mind paying a reasonable fee
      - Unlimited traffic
      - I just need PPTP support
      - US IP addresses
      - No software installation needed
      - Supports Mac

    Any recommendations? Thanks.

    Read the article

  • windows xp mode for windows 7 - save text input language settings

    - by Gero
    When I change the 'default language' in 'Text Services and Input Languages' in Windows XP Mode from EN-US to DE-DE, the setting is reverted at the next logoff/reboot - EN-US is the default language again. Is there a way around this behaviour? I'm using the default 'XPMUser' in Windows XP Mode. I also checked 'turn off advanced text services' and disabled the language bar, and Windows XP remembers those settings - just not the default language.

    Read the article

  • Watching Netflix through a VPN

    - by Sergio
    Recently I bought a VPS from DigitalOcean. I set up a PPTP VPN so I could watch US Netflix content from outside the US. Now that I have it set up and all my traffic is going through the VPN, Netflix is still showing my home country's content. Pandora is working, and when I look up my IP it shows I'm in NY, so I guess traffic is being routed correctly. I have also tried deleting Flash settings and cookies from the browser. Any ideas on what could be happening?

    Read the article
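
    A couple of checks that may help narrow this down, assuming a Linux or Mac client (the commands differ slightly on Windows): confirm that the default route and DNS really go through the tunnel, and that the outside world sees the droplet's address rather than the home one.

        ip route show            # the default route should point at the ppp0 tunnel interface
        curl ifconfig.me         # should print the DigitalOcean IP, not the home ISP's
        nslookup netflix.com     # make sure DNS is not still answered by the home ISP's resolver

    If the routes and IP look right, clearing the Netflix cookies and checking whether the machine prefers an IPv6 route (which a PPTP tunnel does not carry) are the next things to rule out.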

  • iPad cannot connect to youtube

    - by Mark
    I live in Holland and got my iPad from the US. I have two problems:

    1. I could not connect to the App Store; after registering a US account it worked, but it is far from ideal.
    2. YouTube gives a "cannot connect" error.

    Does anyone know a workaround for this?

    Read the article

  • Free-space driven log rotation on linux?

    - by kdt
    Someone just asked me 'how long should we keep logs for our application', and my answer was 'until the disk is full', as there's no reason to throw them away other than running out of space. However, standard logrotate wants us to specify a fixed period and number of rotations. Is there something similar that would let us say "rotate daily, and keep as much history as you like until there is only 5% space free"? The platform is Red Hat Linux.

    Read the article
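
    There does not seem to be a stock logrotate directive keyed to free space, so one approach is to let logrotate rotate daily with a very high rotate count and have a companion cron job prune the oldest archives whenever free space drops below the threshold. A sketch, with the log directory and threshold as placeholders:

        #!/bin/sh
        # prune-old-logs.sh - delete the oldest rotated logs until at least 5% of the
        # filesystem holding them is free again; run from cron after logrotate
        LOGDIR=/var/log/myapp        # placeholder path
        MIN_FREE=5                   # percent

        while [ "$(df -P "$LOGDIR" | awk 'NR==2 {print 100 - $5}')" -lt "$MIN_FREE" ]; do
            oldest=$(ls -1tr "$LOGDIR"/*.log.* 2>/dev/null | head -n 1)
            [ -n "$oldest" ] || break    # nothing left to delete
            rm -f -- "$oldest"
        done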
