Search Results

Search found 22471 results on 899 pages for 'firefox os'.

  • I love Google Chrome, but some non-static pages like Piwik render it unresponsive

    - by gogowitsch
    The web-stats software Piwik stops reacting to mouse clicks after 1-2 seconds. The same is true for Google Maps and Producteev (but Gmail and most other pages work like a charm). These sites rely heavily on JS and work without Flash. I can click for a very short time period and then the mouse cursor no longer feels the UI: it doesn't turn into an I-beam over input fields (though it still moves; if the freeze occurred while the pointer was over an input field, the cursor stays an I-beam), and all clicks on the DOM are ignored by Chrome. No message appears, neither obvious nor in the Console (F12), and there is no obstructing div or the like in the DOM (F12). Since I couldn't find any hints on the source of my problems, I suspected my plugins and extensions. Unfortunately, neither deactivating all plugins nor all extensions solved the problem. Some observations:

    - for the problematic pages, it always happens
    - no Dropbox running
    - several GB of free RAM
    - the task manager doesn't show any high CPU or memory utilization (the offending tab uses 30 MB and 0-1% CPU)
    - all problematic pages work in other browsers (Firefox, IE)
    - the rest of the computer is very responsive
    - the computers use different security suites (Kaspersky and Avira)

    The effect exists between several (synchronized) Chrome instances on different machines, all running Windows 7. Both the OS and Chrome are updated automatically. Other tabs and the Chrome chrome (tabs, menus, toolbar buttons of the browser itself) still work. I really don't like switching between browsers. Any ideas?

    Read the article

  • Quick, Linux-compatible unit-aware calculator

    - by endolith
    I want to be able to press a keyboard combination, start typing a mathematical expression that includes units and slightly advanced math (not just a four-function calculator), and get a result immediately, in units that I specify, that I can copy and paste.

    Currently I open Firefox and press Ctrl+K, type in the search box, and it usually gives me a result in the drop-down from Google Calculator. It doesn't always, though, so I press "=" at the end, wait for a result, remove the equals, wait for a result, realize it doesn't understand the way I typed a unit, open the result in a new tab, etc. It sucks. Wolfram Alpha is smarter, but very much slower, the output is all images rather than text, and I don't have a quick widget for it, if such a thing could even exist.

    GNU units has a ton of units, which is great, and I can define my own units, which is great, but they have to be written in specific, unintuitive ways, it doesn't handle much advanced math, and I'd need to open a terminal, start units, etc. I hate the command line. I wasted a lot of time trying to make front-ends for units in Deskbar and Launchy, but I'm not a real coder and I don't use either of those anymore. Any other solutions or enhancements of these?
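    For what it's worth, GNU units already covers part of this non-interactively: invoked with the expression and the wanted units as arguments it prints the answer and exits, so no interactive session is needed. A sketch (the expressions are just illustrations):

        # one-shot mode: units 'have-expression' 'want-units'
        units '100 km / 2 hours' 'mph'
        units '100 watts * 3 hours' 'kilowatt hours'
        # -t prints just the number, handy for copy/paste
        # tempF()/tempC() handle absolute temperatures
        units -t 'tempF(72)' 'tempC'

    A hotkey bound to a small script that feeds a zenity --entry prompt into units -t would approximate the press-a-key-and-type workflow, though that part is a do-it-yourself exercise.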

    Read the article

  • Google Chrome - Issues with download dialogs using BinaryWrite

    - by Mila
    Hello, I have an empty ASP.NET page with code-behind to support downloading of PDF files. The page is called from a link-like web control that has NavigateUrl set to this page. In short, I am using the following for streaming:

        Response.Buffer = false;
        Response.ClearHeaders();
        Response.ContentType = "application/x-pdf";
        Response.AddHeader("Content-Disposition", "attachment; filename=\"MyPDFFile.pdf\"");
        // dataReaderRemote[DataPDFFieldName] has previously retrieved the data
        byte[] binary = (dataReaderRemote[DataPDFFieldName]) as byte[];
        if (binary != null)
        {
            MemoryStream memoryStream = new MemoryStream(binary);
            int sizeToWrite = CHUNKSIZE; // CHUNKSIZE = 1024
            for (int i = 0; i < binary.GetUpperBound(0) - 1; i = i + CHUNKSIZE)
            {
                if (!Response.IsClientConnected) return;
                if (i + CHUNKSIZE >= binary.Length) sizeToWrite = binary.Length - i;
                byte[] chunk = new byte[sizeToWrite];
                memoryStream.Read(chunk, 0, sizeToWrite);
                Response.BinaryWrite(chunk);
                Response.Flush();
            }
        }
        Response.Close();

    IE as well as Firefox bring up the download prompt asking whether you wish to open or save the file, while the user remains on the same page containing the link. However, Google Chrome opens a new blank tab and downloads the file automatically. Is there any way to prevent Chrome from opening the extra blank and therefore useless tab? I am using Google Chrome version 5.0.375.55 (Official Build 47796) on Windows XP. Thanks in advance! Mila
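    One way to take the browsers out of the equation is to inspect the raw headers the page actually emits; a quick sketch (the URL is a stand-in for whatever page hosts this code-behind):

        # dump only the response headers, discard the PDF body
        curl -s -D - -o /dev/null "http://yourserver/DownloadPdf.aspx"

    If Content-Disposition: attachment arrives intact, the extra tab is a browser-side behavior of following the link rather than anything in the streaming code.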

    Read the article

  • No external src ip in log files (my router ip appears instead)

    - by bongo_fury
    I recently retired my workhorse WRT54G router/AP in favor of a Linksys EA2700. Since then, all inbound traffic (bound to an Ubuntu 10.02 box running LAMP) logged to syslog, Apache's error and access logs, etc. (all behind said router) is getting logged with a src IP of 192.168.1.1, that of the router's internal IP. For example, here is an old entry from Apache's access.log:

        74.82.68.20 - - [22/Feb/2011:10:14:34 -0600] "GET /assets/css/style.css HTTP/1.1" 304 154 "http://example.com/view.php?event_id=1" "BlackBerry8520/5.0.0.822 Profile/MIDP-2.1 Configuration/CLDC-1.1 VendorID/100"

    And here is one since switching the router:

        192.168.1.1 - - [05/Oct/2012:21:29:25 -0500] "GET /somedir/print.css HTTP/1.1" 200 650 "http://example.com/somedir/" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:15.0) Gecko/20100101 Firefox/15.0.1"

    That first field is the problem. Each and every entry in every log shows an "external" IP of 192.168.1.1, which isn't very helpful. Any ideas? Much thanks from a n00b!

    Read the article

  • Problem configuring php-fpm with nginx

    - by Nisanio
    First of all: I'm not an expert in configuring things. This is very new for me, so my apologies in advance.

    At work we have a CentOS server. The guy who worked here before installed nginx. We need to make a PHP site, so, obviously, I need to set up PHP and make it work with nginx. To make a long story short, I had to replace the nginx binary with a new one (because the older one was compiled without FastCGI), and I had to recompile and install PHP (because the new version has FPM). Then I struggled with the config files, ending up with this in nginx.conf (not the whole file):

        user php;

        location ~ \.php$ {
            fastcgi_pass   127.0.0.1:9000;
            fastcgi_index  index.php;
            fastcgi_param  SCRIPT_FILENAME  /scripts$fastcgi_script_name;
            include        fastcgi_params;
        }

    and I uncommented some parameters in php-fpm.conf (too much to detail here, but the important part is that group and user are "php"). I never could start php-fpm with the instructions from the book:

        sudo /usr/sbin/php-fpm start

    But after looking around the net, I found this:

        sudo /usr/local/sbin/php-fpm --fpm-config=/usr/local/etc/php-fpm.conf

    This worked (I think). I restarted nginx. But... nothing happens with PHP... My calls to PHP files (via Firefox) don't even appear in the log (/opt/nginx/logs/error.log). I'm really, really exhausted and lost... Could anyone help me, pleaaase.... :( Thanks in advance
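    Two quick checks suggest themselves, sketched below on the assumption that php-fpm is meant to listen on 127.0.0.1:9000 as in the config above. Note also that /scripts$fastcgi_script_name is the placeholder from the stock nginx example; unless the site really lives in /scripts, SCRIPT_FILENAME should point at the actual document root (e.g. $document_root$fastcgi_script_name).

        # is php-fpm actually running, and listening where nginx expects?
        ps aux | grep php-fpm
        netstat -tlnp | grep :9000

        # request a PHP file without the browser and watch the error log
        curl -v http://localhost/index.php
        tail -f /opt/nginx/logs/error.log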

    Read the article

  • Migrating to CF9: trouble getting JRun working with SSL

    - by DaveBurns
    I have a client on MX7 who wants to migrate to CF9. I have a dev environment for them on my WinXP machine where I've configured MX7 to run with JRun's built-in web server. I've had that working for a long time with both regular and SSL connections.

    I installed CF9 yesterday side by side with the existing MX7 install to start testing. The install was smooth and detected MX7, adjusted CF9's port numbers to avoid conflicts, etc. Testing started well: MX7 worked over both regular HTTP and SSL, and CF9 worked over regular HTTP. But I can't get CF9 to work with SSL. I installed a new certificate with keytool; Firefox (v3.6) complained about it being unsigned, I added it to the exception list, and now I get this:

        Secure Connection Failed
        An error occurred during a connection to localhost:9101.
        Peer reports it experienced an internal error.
        (Error code: ssl_error_internal_error_alert)

    I've been Googling that in all variations but can't find much help to get past this. I don't see any info in any log files either. FWIW, here's my SSL config from SERVER-INF/jrun.xml:

        <service class="jrun.servlet.http.SSLService" name="SSLService">
          <attribute name="enabled">true</attribute>
          <attribute name="interface">*</attribute>
          <attribute name="port">9101</attribute>
          <attribute name="keyStore">{jrun.rootdir}/lib/mykey</attribute>
          <attribute name="keyStorePassword">*deleted*</attribute>
          <attribute name="trustStore">{jrun.rootdir}/lib/trustStore</attribute>
          <attribute name="socketFactoryName">jrun.servlet.http.JRunSSLServerSocketFactory</attribute>
          <attribute name="deactivated">false</attribute>
          <attribute name="bindAddress">*</attribute>
          <attribute name="clientAuth">false</attribute>
        </service>

    Anyone here know of any issues re setting up SSL and CF9? Anyone had success with it? Dave
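    Two hedged checks usually narrow down an ssl_error_internal_error_alert coming from a Java keystore: confirm the keystore actually contains a usable private-key entry (not just an imported certificate), and watch the handshake from outside the browser. The path below stands in for the {jrun.rootdir}/lib/mykey keystore in the config above.

        # a working keystore should show "Entry type: PrivateKeyEntry" (or keyEntry)
        keytool -list -v -keystore /path/to/jrun/lib/mykey

        # watch the TLS handshake fail without Firefox in the way
        openssl s_client -connect localhost:9101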

    Read the article

  • Easy solution to monitoring & blocking connections to non-malicious services, IP's, and tracking companies

    - by binarybunny
    Our family lives in the middle of nowhere, so the only high-speed internet available is Verizon's 3G mobile broadband. We have the highest package available, yet we continually go over the 10GB limit and get charged $10 for every 1GB we go over. We run a business from home, so stopping when we hit the limit is not an option.

    I've found the majority of connections are to Google, Microsoft, Akamai, Facebook, and other web service companies (mainly Google). I know these are harmless connections, but when it costs money for them to monitor our web activity it becomes a serious problem. Here's what I've done so far, but I'm sure there's something else that could help before blocking a huge set of IP ranges:

    - stopped using Windows (on my machine)
    - use the MVPS hosts file on all computers
    - use Firefox on all computers (with the "don't track me" option)
    - ad-block plugin in all browsers
    - blocking Google updates
    - blocking Windows updates
    - blocking images in browsers (when possible)
    - use Comodo (paranoia-level style of blocking...)
    - virus-free computers with ESET NOD32
    - bought a router and installed DD-WRT in an attempt to block connections more diligently (and throttle bandwidth if it comes to that)

    Anything I'm missing? I know Google Analytics is on almost all websites, as well as FB Like buttons, but I would like to be able to stop these connections without blocking use of Google services like Gmail, etc. Any ideas?
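    Since DD-WRT is already in place, one more lever worth knowing about: its dnsmasq resolver can blackhole individual tracking domains at the router, which stops analytics and beacon traffic for every device without touching Gmail or other Google services. A sketch for the Additional DNSMasq Options box (the domain list is just an example):

        # answer queries for tracking hosts with an unroutable address
        address=/google-analytics.com/0.0.0.0
        address=/googlesyndication.com/0.0.0.0
        address=/doubleclick.net/0.0.0.0
        # facebook.net serves the Like-button script
        address=/facebook.net/0.0.0.0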

    Read the article

  • How do I troubleshoot a slow hard drive?

    - by Bruce Connor
    My computer is suffering slow-downs and I'm not surprised (it's around 6 years old). Here's what I've verified:

    - They are not very frequent (only a couple of times a day).
    - When they happen, a single application will hang for 10-60 seconds, while the rest don't hang but also get slow.
    - Even as it is happening, the CPU usage stays low.
    - It happens to applications such as a text editor, Firefox, and Skype.
    - It never happens to some applications (such as games) which I use for hours under heavy CPU load.

    Also of note:

    - The graphics card and PSU are new (around a year).
    - Though I have a decent amount of software installed right now, this was happening even right after I reinstalled Windows.
    - This HDD has been through many partitioning schemes and a few heavy operations (such as moving around 200GB of data).

    Because of the above, I am already 70% sure the problem is with the hard drive. Before I replace it, however, I want to rule out other less likely possibilities (such as RAM, software, or the PSU). I don't have the money to replace the entire box right now, but I can easily replace one of the components. I've read several questions (such as this one) which give general guidance on troubleshooting an unknown issue; that is not what I'm looking for here. My main question is: What tests or benchmarks can I run to verify I have a problematic hard drive?

    I don't need to solve this problem; I am content with just making sure it's the hard drive. I could borrow a newer hard drive from a friend and see if it gets better. A positive result would rule out all other components, but it wouldn't rule out a software issue (since this new hard drive won't have any of the software I use daily). Running on Windows/Linux.
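    On the Linux side, two non-destructive checks go straight at this question; a sketch assuming the drive shows up as /dev/sda. A failing drive usually betrays itself in the SMART counters (Reallocated_Sector_Ct, Current_Pending_Sector, UDMA_CRC_Error_Count) long before it dies outright.

        # SMART health, error log and attribute counters (smartmontools)
        sudo smartctl -a /dev/sda

        # run the drive's built-in self-test, then read the result
        sudo smartctl -t short /dev/sda
        sudo smartctl -l selftest /dev/sda

        # raw sequential read speed, bypassing the page cache
        sudo hdparm -t /dev/sda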

    Read the article

  • Safari's location bar (auto-suggest and web search)

    - by Lri
    A few separate issues:

    1. Auto-suggest doesn't seem to work for queries with spaces. Am I missing something?
    2. If you select an item from the suggestion list that was matched by its title, the title is filled in before the address. Can you change it to work like in other browsers? SMRT disables searching by title completely.
    3. Can you combine Top Hit, History and Bookmarks into a single section?
    4. The preferences starting with DebugSafari4 don't work anymore (like DebugSafari4IncludeFancyURLCompletionList).
    5. Can you direct unresolved addresses to something like google.com/search?q=?&btnI instead of ?.com? Like changing keyword.URL in Firefox.
    6. Can you remove or hide the web search field? In Camino, Cruz and Fluid it can be resized to zero width. You can't circumvent the normal maximum ratio with InputFieldWidthRatio.
    7. AddressBarIncludesGoogle doesn't appear to do anything in the current version.

    Are there fixes or workarounds for any of these? I'm lumping these issues together because they are closely related: a lot of them were introduced when the location bar was redesigned in Safari 5. I'm also hoping to find something like an extension or a plugin that would replace the standard location bar.

    Read the article

  • How do I SSH tunnel using PuTTY or SecureCRT through gateway/proxy to development server?

    - by DAE51D
    We have some Unix boxes set up in a way that to get to the development box via ssh, you have to ssh into a 'user@jumpoff' box first. No direct ssh connection to 'dev' is allowed from anywhere but 'jumpoff'. Furthermore, only key exchange is allowed on both servers, and you always log in to the development box as 'build@dev'. It's painful to always do that hopping.

    I know this can be done with SOCKS or a tunnel or something... I have set up a FreeBSD VM and I can get things to work awesomely using the Unix ssh tools. Basically all I do is make sure my VM's ~/.ssh/id_rsa.pub key is on both jumpoff and dev and use this ~/.ssh/config file:

        # Development Server
        Host ext-dev
            # this must be a resolvable name for "dev" from Jumpoff
            Hostname 1.2.3.4
            User build
            IdentityFile ~/.ssh/id_rsa

        # The Jumpoff Server
        Host ext
            Hostname 1.1.1.1
            User daevid
            Port 22
            IdentityFile ~/.ssh/id_rsa

        # This must come below all of the above
        Host ext-*
            ProxyCommand ssh ext nc $(echo '%h'|cut -d- -f2-) 22

    Then I just simply type "ssh ext-dev" and I'm in like Flynn.

    The problem is I can't get this same thing to work using either PuTTY or SecureCRT, and to be honest I've not found any tutorials that really walk me through it. I see many on setting up some kind of proxy tunnel for Firefox, but it doesn't seem to be the same concept. I've been messing with various trial and error most all day and nothing has worked (obviously) and I'm at the end of my ssh knowledge and Google searching. I found this link which seemed to be perfect, but it doesn't work for me: the "Master" connects fine, but the "client" portion doesn't connect, telling me the remote system refused the connection. http://www.vandyke.com/support/tips/socksproxy.html

    I've got the VM, PuTTY and SecureCRT all using the same public/private key pairs to make things consistent and easier to debug. Does anyone have a straight up example of how to do this in Windows?
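    For PuTTY specifically, the moral equivalent of that ProxyCommand is its "Local" proxy type driven by plink. A hedged sketch using the names from the config above (daevid@1.1.1.1 as the jump box, build@1.2.3.4 as dev): create a PuTTY session whose host is 1.2.3.4 and user is build, then under Connection > Proxy set Proxy type to "Local" and the proxy command to:

        plink.exe -agent daevid@1.1.1.1 -nc %host:%port

    %host and %port are PuTTY's own placeholders, and -nc is plink's equivalent of the nc hop (it needs a reasonably recent PuTTY). Pageant should be running with the key loaded so the jump-box authentication is non-interactive.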

    Read the article

  • PHP output buffer settings ignored by server

    - by Ecom Evolution
    I have been trying to flush the output of certain scripts to the browser on demand, but it does not work on our production server. For instance, I tried running the "Phoca Changing Collation tool" (find it on Google) and I don't see any output until the script finishes executing. I've also tried immediately flushing the buffer in other scripts that work fine on any server but this one, using the following code:

        echo "something";
        ob_flush();
        flush();

    Setting ob_implicit_flush(1) doesn't help either. The server is Apache 2.2.21 with PHP 5.2.17 running on Linux. You can see our php.ini file here if that will help: http://www.smallfiles.org/download/1123/php.ini.html

    This isn't the only problem we are having with the server ignoring in-script directives. The server also ignores timeout settings such as:

        ini_set('max_execution_time', 900*60);

    and:

        set_time_limit(86400);

    The script always times out at the php.ini default, and it doesn't seem to matter whether the script is executed in IE or Firefox. I tried ini_set('zlib.output_compression_level', 'Off') and checked that it is "Off" in the php.ini file. The code apache_setenv('no-gzip', 1) causes a fatal error, so I tried uploading a .htaccess file with the "mod_gzip_on No" directive. Neither helps. I tried running Apache as fcgi and suphp, but got the same results. The server is NOT in safe mode. Pullin ma hair out!
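    A useful step here is to take the browser out of the loop and confirm what the server actually sends; a sketch, assuming a small test script at /flushtest.php that echoes and flushes in a loop:

        # -N disables curl's own buffering, so flushed chunks should
        # appear as they are sent rather than all at once
        curl -N http://your.server/flushtest.php

        # effective buffering/timeout settings (CLI view)
        php -i | grep -E 'output_buffering|zlib.output_compression|max_execution_time'

    If curl sees the output trickle in but the browser doesn't, the buffering is client-side or in a proxy; if curl also gets it in one burst, something between PHP and the socket (output_buffering, zlib.output_compression, mod_deflate, or the fcgi/suphp layer) is collecting it. Note that php -i reports the CLI configuration, which can differ from the Apache SAPI's, so a phpinfo() page is the more faithful check.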

    Read the article

  • Unable to Access Localhost after starting Xampp

    - by user7370
    OS: Windows XP Professional, SP2. A few days back I had xampplite 1.7.1 installed and was able to access localhost and phpMyAdmin through the browser. Today it suddenly stopped working. In Firefox, after I type http://localhost/ nothing happens, just a blank white screen. I removed all the files in the xampplite folders and re-installed ver 1.7.1 again; it's of no use. Then I installed xampplite 1.7.2 (the latest), which I had downloaded from the XAMPP website; again it's of no use. Apache and MySQL are running, though. I'm trying to install WordPress locally, as I have a theme ready and want to convert that design to WordPress, test it, and start using it online.

    Running 'Port-check' on the XAMPP control panel showed this:

        Service            Port    Status
        -------------------------------------------------------------
        Apache (HTTP)        80    C:\xampplite\apache\bin\httpd.exe
        Apache (WebDAV)      81    free
        Apache (HTTPS)      443    C:\xampplite\apache\bin\httpd.exe
        MySQL              3306    C:\xampplite\mysql\bin\mysqld.exe
        FileZilla (FTP)      21    free
        FileZilla (Admin) 14147    free
        Mercury (SMTP)       25    free
        Mercury (POP3)      110    free
        Mercury (IMAP)      143    free
        Mercury (HTTP)     2224    free
        Mercury (Finger)     79    free
        Mercury (PH)        105    free
        Mercury (PopPass)   106    free
        Tomcat (AJP/1.3)   8009    free
        Tomcat (HTTP)      8080    free

    I also have Skype installed, but it's not using port 80 (as I have read this was a common issue; the port under Skype's options is 65013). And when I open file:///C:/xampp/htdocs/index.php it shows "Something is wrong with the XAMPP installation :-(". Please help with this problem. Thanks, Sharath Kumar
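    Two things worth checking from a command prompt, sketched below: whether "localhost" even resolves to 127.0.0.1 on this machine (a mangled C:\WINDOWS\system32\drivers\etc\hosts entry produces exactly this blank-page symptom while Apache happily answers on 127.0.0.1), and which process really owns port 80:

        rem does "localhost" resolve to 127.0.0.1?
        ping localhost
        rem then try http://127.0.0.1/ in the browser

        rem which process is bound to port 80? (substitute the PID netstat prints)
        netstat -ano | findstr :80
        tasklist /fi "PID eq 1234"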

    Read the article

  • WTH? Upgrading Ubuntu 9.10 to 10.04 Problem: No Internet Connection?

    - by damx
    Earlier today I tried upgrading my Ubuntu 9.10 Karmic Koala from the update manager (FYI: all 9.10 updates were applied prior to this). Everything was going well downloading until I got an error dialog informing me that some software packages weren't downloaded because of an Internet connection problem; I'd say it was halfway through. Anyway, I was told that the software packages it did download would be kept, and I figured it's not a big deal, just run it again. But first I ran Firefox to verify my connection, as I hadn't had any connection problems, and my internet connection was/is fine, as evident by this posting.

    With that cleared, I ran the update manager again, clicked on "Upgrade", and this time I received "Could not download the release notes. Please check your internet connection." Huh? This is my first time dealing with Ubuntu and my first upgrade, so I am hoping someone can help. Not sure what the problem can be; I can surf the web with no problems. Please help.

    PS: There was no installing at any point. Just downloading.

    PSS: The software it managed to download the first time around is now visible in the update manager, but I don't think I should install it, as I see in the compiz description it's for v1:0.8-4-0ubuntu2. I figure it's designed for 10.04 and might ruin things further if I install it.
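    When the graphical upgrader gets stuck on the release-notes fetch, running the same upgrade from a terminal usually surfaces the real error (a flaky mirror, a proxy, or a half-downloaded package list); a sketch:

        # refresh package lists first; errors here point at a bad mirror or proxy
        sudo apt-get update

        # then start the release upgrade from the command line
        sudo do-release-upgrade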

    Read the article

  • Setting up vncserver on OpenSolaris zone

    - by k.park
    I am running OpenSolaris 5.10 and set up a sparse zone (it inherits most of the bin directories from the global zone). I ended up copying many etc and var files from the global zone, and eventually most of the stuff (Firefox, gvim, etc.) works through ssh via X11. However, I am having problems setting up vncserver in the zone. This is what I get if I try to start the vncserver:

        vncext: VNC extension running!
        vncext: Listening for VNC connections on port 5911
        vncext: created VNC server for screen 0

        Fatal server error:
        could not open default font 'fixed'

        _X11TransNAMEDOpenClient: Cannot open /tmp/.X11-pipe/X11 for NAMED connection
        _X11TransOpen: transport open failed for local/%zone%:11
        xsetroot: unable to open display '%zone%:11'
        _X11TransNAMEDOpenClient: Cannot open /tmp/.X11-pipe/X11 for NAMED connection
        _X11TransOpen: transport open failed for local/%zone%:11
        _X11TransNAMEDOpenClient: Cannot open /tmp/.X11-pipe/X11 for NAMED connection
        _X11TransOpen: transport open failed for local/%zone%:11
        _X11TransNAMEDOpenClient: Cannot open /tmp/.X11-pipe/X11 for NAMED connection
        _X11TransOpen: transport open failed for local/%zone%:11
        vncconfig: unable to open display "%zone%:11"
        twm: unable to open display "%zone%:11"
        xterm Xt error: Can't open display: %zone%:11

    I already chmoded /tmp/.X11-pipe to 777, and there is no pipe in the /tmp/.X11-pipe or /tmp/.X11-unix directory. Here is my cat /etc/release:

        OpenSolaris 2009.06 snv_111b X86
        Copyright 2009 Sun Microsystems, Inc.  All Rights Reserved.
        Use is subject to license terms.
        Assembled 07 May 2009
        BRAND: ipkg
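    The fatal error is the font one: Xvnc exits because it cannot find the 'fixed' font, and everything after that fails because there is no display left to talk to. In a sparse zone the X font directories may not be where Xvnc's compiled-in font path expects. A hedged sketch (the font directories are assumptions; check what actually exists inside the zone first):

        # find a font directory that actually exists inside the zone
        ls /usr/X11/lib/X11/fonts/

        # then hand Xvnc an explicit font path via vncserver
        vncserver :11 -fp /usr/X11/lib/X11/fonts/misc/,/usr/X11/lib/X11/fonts/75dpi/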

    Read the article

  • Google Chrome not loading web pages correctly unless refreshed multiple times

    - by Brandon Wilson
    Web pages in Google Chrome do not load correctly from time to time. I can't reproduce it; it just happens. Sometimes it happens when I load the browser, other times while I am just browsing. Just now I went to five different web sites, and 3 out of 5 of them did not load correctly. I have attached a photo of how Super User loaded the first time I opened it; if I refresh, it loads correctly. Facebook is bad like this. Sometimes Facebook will load correctly, but some of its back-end scripting may not load, so the page may not refresh automatically. Not sure what is going on.

    I have tried other browsers (Firefox and Internet Explorer) and they seem to be working correctly; Chrome seems to be acting up only on this computer. All my computers are running Windows 8, and I have removed Chrome completely from this computer and re-installed it. I even disabled all extensions, cleared all the caches, and tried running Chrome without being logged in. Not sure what else to do at this point.

    An example of superuser.com not loading correctly (see attached screenshot): when I refresh, the problem goes away until it happens again. Sometimes it takes two or three refreshes in order for the page to load correctly.

    Read the article

  • Apache 403 after configuring varnish

    - by w0rldart
    I just don't know where else to look and what else to do. I keep getting a 403 error on all my vhosts after setting up Varnish 3.0.

    Apache log:

        [error] [client 127.0.0.1] client denied by server configuration: /etc/apache2/htdocs

    Headers for http://domain.com/:

        GET / HTTP/1.1
        Host: domain.com
        User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:16.0) Gecko/20100101 Firefox/16.0
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Language: en-US,en;q=0.5
        Accept-Encoding: gzip, deflate
        DNT: 1
        Connection: keep-alive
        Cookie: __utma=106762181.277908140.1348005089.1354040972.1354058508.6; __utmz=106762181.1348005089.1.1.utmcsr=OTHERDOMAIN.com|utmccn=(referral)|utmcmd=referral|utmcct=/galerias/cocinas
        Cache-Control: max-age=0

        HTTP/1.1 403 Forbidden
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Type: text/html; charset=iso-8859-1
        X-Cacheable: YES
        Content-Length: 223
        Accept-Ranges: bytes
        Date: Sat, 01 Dec 2012 20:35:14 GMT
        X-Varnish: 1030961813 1030961811
        Age: 26
        Via: 1.1 varnish
        Connection: keep-alive
        X-Cache: HIT

    /etc/default/varnish:

        DAEMON_OPTS="-a ip.ip.ip.ip:80 \
                     -T localhost:6082 \
                     -f /etc/varnish/main.domain.vcl \
                     -S /etc/varnish/secret \
                     -s file,/var/lib/varnish/$INSTANCE/varnish_storage.bin,1G"
        #-s malloc,256m"

    My VCL file: http://pastebin.com/axJ57kD8

    So, any ideas what I could be missing?

    Update: just so you know, the ports:

        NameVirtualHost *:8000
        Listen 8000

        <VirtualHost 205.13.12.12:8000>
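    A detail that stands out: "client denied by server configuration: /etc/apache2/htdocs" is Apache serving its compiled-in default docroot, which is what happens when a request reaches no matching vhost; note the vhost above is bound to 205.13.12.12:8000, while the VCL's backend quite possibly points at 127.0.0.1:8000. A quick hedged check is to talk to the backend directly with the same Host header Varnish would forward:

        # bypass varnish and hit the Apache backend directly
        curl -I -H "Host: domain.com" http://127.0.0.1:8000/

        # compare with the address the vhost is actually bound to
        curl -I -H "Host: domain.com" http://205.13.12.12:8000/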

    Read the article

  • Need troubleshooting advice for intermittent DNS problems with requests on ISP nameservers

    - by Mnebuerquo
    I've been having some intermittent DNS problems with a web server, where certain ISPs' DNS servers don't have my hostnames in cache and fail to look them up. At the same time, queries to OpenDNS for those hostnames resolve correctly. It's intermittent, and it always works fine for me, so it's hard to identify the problem when someone reports connectivity problems to my site.

    My website is on a server running Linux with Plesk. My DNS records are configured with Plesk (so my server is its own DNS master). The domain name is registered with GoDaddy. I'm not real knowledgeable about DNS, so I don't really know how to begin troubleshooting. I've started learning to use dig, but while I can read the manpage to learn the syntax, I don't really know what questions to ask. Since the problem is intermittent, I haven't been able to catalog many symptoms.

    Symptoms I have observed:

    - Certain people repeatedly reported intermittent problems connecting to my website, and only from certain networks. (Ex: one guy could connect reliably from his office but not his home.)
    - Sometimes I notice my browser taking a long time looking up the hostname for my site (Firefox shows a message in the status bar at the bottom). For me this is in the ten-second range.
    - ssh connections from anywhere to my server take a long time to connect, but then seem to work fine once connected.

    So hopefully the folks on Server Fault can point me to a good beginner tutorial for understanding DNS, and suggest troubleshooting questions to ask next time one of my users reports connectivity problems.
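    Since the question is literally "what dig questions should I ask", here is a hedged starter set (example.com stands in for the real domain, your.server.ip for the Plesk box):

        # 1. does your own authoritative server answer, without recursion?
        dig @your.server.ip example.com A +norecurse

        # 2. walk the delegation from the root; stale NS records or glue show up here
        dig +trace example.com

        # 3. compare a recursive ISP resolver against OpenDNS
        dig @208.67.222.222 example.com A

    If +trace shows the registrar delegating to nameservers that are sometimes slow or unreachable (a single-homed Plesk box acting as its own master is a common culprit), intermittent lookup failures at other people's resolvers are exactly the symptom. The slow ssh connects hint at the reverse problem too: the server stalling on its own DNS lookups of connecting clients (see sshd's UseDNS option).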

    Read the article

  • Windows 7 "freezes" (chills?), and then "unfreezes" after about 1 minute.

    - by gbc001
    Hi, I have an Acer Timeline 1810T netbook (4GB RAM) with Windows 7 x64. About once or twice a day, it "freezes". The reason I put this in quotation marks is that it does not really freeze, as in you can't move the mouse: I can move my mouse and jump between different applications, but I can't use the applications for anything. So I can jump between Notepad and Firefox, but I can't browse to a new web page.

    I have been trying to determine the source of this misery for a while now, and I suspect it has something to do with the hard drive, indirectly if not directly. Here are some screenshots of the Resource Monitor during a "freeze" and during normal operation:

    Freeze: http://imgur.com/Gcgq1.jpg
    Normal operation: imgur.com/mlHaI.jpg

    As you can see, the CPU is fine during a freeze, but the disk is going bananas. Does anyone have an idea of what these readings mean, or about the problem in general? There seems to be no specific activity that sets this off; it can happen during browsing, or during media playback with nothing else open. The fact that I can see and switch to applications but not use them might suggest a hard drive problem as well: maybe I can access the stuff that is in RAM, but not anything that would require interaction with the drive. Very appreciative of any help!
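    One hedged check that often pins this down: when a disk is timing out and retrying, the storage stack logs it (e.g. the classic Event ID 11, "controller error", or 51, "error during a paging operation"). Querying the System event log for entries from the disk driver around a freeze tells you whether the drive itself is misbehaving rather than some process merely hammering it:

        rem newest 20 events from the "disk" driver in the System log
        wevtutil qe System /q:"*[System[Provider[@Name='disk']]]" /f:text /c:20 /rd:true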

    Read the article

  • wildcard deal with www as a subdomain

    - by Alaa Gamal
    I am using a wildcard with Apache. My Apache config:

        ServerAlias *.staronece1.com
        DocumentRoot /staronece1/domains

    My named file:

        $ttl 38400
        staronece1.com.         IN  SOA  staronece1.com. email.yahoo.com. (
                                         1334838782
                                         10800
                                         3600
                                         604800
                                         38400 )
        staronece1.com.         IN  NS   staronece1.com.
        staronece1.com.         IN  A    95.19.203.21
        www.staronece1.com.     IN  A    95.19.203.21
        server.staronece1.com.  IN  A    95.19.203.21
        mail.staronece1.com.    IN  A    95.19.203.21
        ns1.staronece1.com.     IN  A    95.19.203.21
        ns2.staronece1.com.     IN  A    95.19.203.21
        staronece1.com.         IN  NS   ns1.staronece1.com.
        staronece1.com.         IN  NS   ns2.staronece1.com.
        staronece1.com.         IN  MX   10 mail.staronece1.com.
        *                 14400 IN  A    95.19.203.21
        *.staronece1.com        IN  A    95.19.203.21

    My PHP test file, /staronece1/domains/index.php:

        <?php
        function getBname() {
            $bname = explode(".", $_SERVER['HTTP_HOST'], 2);
            return $bname[0];
        }
        echo 'SubDomain is: ' . getBname();
        ?>

    If I go to something.staronece1.com I get this result:

        SubDomain is: something

    Now the problem: if I go to www.staronece1.com I should get an empty result, because www is not a subdomain, but I get this:

        SubDomain is: www

    And if I go to www.something.staronece1.com I get a Firefox error message (site not found). How do I fix this problem? I think the solution is adding a record for www in the named file. Thanks
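    Two hedged observations, both testable from a shell. First, www.staronece1.com resolves because of the explicit www A record in the zone above, so the PHP will always see "www" as the first label; if www should behave like the bare domain, the usual approach is to special-case it in the application (or redirect it in Apache) rather than in DNS. Second, the "site not found" for www.something.staronece1.com is a plain resolution question you can check directly against the zone:

        # what does the zone actually serve for each name?
        dig @95.19.203.21 www.staronece1.com A +short
        dig @95.19.203.21 something.staronece1.com A +short
        dig @95.19.203.21 www.something.staronece1.com A +short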

    Read the article

  • Cleaning a proxy/phishing trojan from Windows XP computer

    - by i-g
    I am trying to remove an interesting trojan from a Windows XP computer. It manifests itself as a phishing page (screenshot linked) that appears after the user tries to log on to eBay. So far, I haven't found any other web sites that are affected. As you can see, the trojan intercepts browser connections (all installed browsers are affected) and injects this phishing page. The address looks like it's ebay.com, but HTTPS verification doesn't work (no lock icon or green bar in Firefox).

    At some point, Trojan.Dropper appeared on the computer. I removed it with Malwarebytes Anti-Malware. Although it reappeared several times, it seemed to be gone after I booted into Safe Mode and did a full system scan with MBAM. Now, however, a different trojan has appeared on the machine; I suspect it was installed by Trojan.Dropper. So far, MBAM, Ad-Aware, and Spybot S&D have been unable to remove it. I've looked for it in the HijackThis log but haven't found anything conclusive.

    Has anyone run across a trojan like this before? Where would I start looking for it in order to remove it manually? Thank you for reading.
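    Since all browsers are affected, the interception is almost certainly system-wide rather than a browser add-on; two classic hook points on XP are a rogue proxy setting and a poisoned hosts file, both quick to check (a hedged sketch):

        rem any proxy silently configured for the current user?
        reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings" /v ProxyServer
        reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings" /v AutoConfigURL

        rem any ebay entries pinned in the hosts file?
        findstr /i ebay C:\WINDOWS\system32\drivers\etc\hosts

        rem anything local listening that could be acting as a proxy?
        netstat -ano | findstr LISTENING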

    Read the article

  • How do I prevent or override a group policy on Windows 7?

    - by Kevin
    A few months ago my company was purchased by a large corporation. We recently switched our network over to the large corporate network, which has more restrictive requirements. One of these is the requirement to use a proxy server for Internet traffic. However, some of our internal servers are not recognized by the corporate DNS, so we need to provide the fully qualified domain name. On Windows 7, we make changes to the Internet Properties for IE8 and Chrome to include our domain name as an exception to the proxy server (e.g., *.foobar.com).

    The problem is that a group policy that does not include our domain name is continually pushed out to my systems throughout the day. This requires me to make the appropriate changes to the Internet Properties several times a day in order to access our internal servers. Is there a way that I can prevent the group policy from being pushed to my systems, or detect when the group policy is pushed and override it? I am an administrator on all of my systems. I do have Firefox installed, which is not subject to the same group policy push, but I need to have IE8 and Chrome working.
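    The clean fix is getting IT to add *.foobar.com to the policy itself, but as a stopgap the proxy exception list is just a per-user registry value, so a scheduled task can re-apply it after each Group Policy refresh; a hedged sketch (the bypass list is the one from the post, and this only works if the policy writes the plain user setting rather than enforcing it via the Policies hive):

        rem re-apply the proxy exceptions (run as the logged-on user,
        rem e.g. from Task Scheduler every 30 minutes)
        reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings" ^
            /v ProxyOverride /t REG_SZ /d "*.foobar.com;<local>" /f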

    Read the article

  • Linking to network shares from Sharepoint pages

    - by Russell C
    So the place I work decided to set up a Microsoft SharePoint 2010 server for task management, and I (as the lowly entry-level intern) have been tasked with "figuring it out." One thing that the end users really, really, really want is the ability to link from a SharePoint web page to network shares (that are readable by anyone who will be using SharePoint). In order to do this, I have edited the HTML manually with several lines that look like the following:

        <a href="file://server/share">Server Share</a>

    This works (sometimes), but the link reported by SharePoint is often wrong, and editing pages that contain these links will mangle the code such that when I open it, the code no longer looks like what it did when I last hit save (breaking all those links). Obviously this is not sustainable. I've been told by coworkers that "it worked that way at the last place I worked," but I haven't found out how yet. Any ideas on how this would work, or am I barking up the wrong tree? None of the knowledge searches I've done shed any light on the situation. Thanks for any help! -Russell

    P.S. It should be noted that the file option in an href tag ONLY works in IE (which is a real bummer since we mostly use Firefox).

    Read the article

  • What could cause a huge packet loss in Ubuntu 9.10, for both wired and wireless?

    - by xzenox
    I was previously using 9.04 fine (and in fact, I am posting this from my old 9.04 live cd). I tested the following install steps in a virtualbox vm prior to following the sames ones to upgrade my laptop: Download/burn ubuntu minimal cd (12mb one) Install ubuntu minimal sudo apt-get update sudo apt-get upgrade sudo apt-get ubuntu-desktop ubuntu-standard In the VM worked fine and I found myself with a working 9.10 ubuntu, network worked fine and I was able to test my backups and DropBox without a hitch (host was 9.04). When I followed the same steps on my laptop, everything worked up to after 9.10 being installed and working. As far as I can tell, everything besides eth0/wireless works. For some reason, I am unable to access the internet. Ping reports that over 99% of packets get lost (over an hour or so of pinging). This means for example that if I try hard enough, I can load a webpage but only at the cost of much patience... This happens both for a wired and wireless connection to my wrt310n (updated with latest firmware). At first I thought that it could be related to the ipv6 issues ppl have been experiencing however even after disabling ipv6 at the kernel level (through grub), I still get the issue. I do not think this is related to DNS issues or the likes since even when I ping my ISP's gateway IP, I have the same amount of packet loss. No DNS resolving should be required there. Access to my router works peachy with no packet loss there. I've tried different MTU values but to no avail. Note that this issue affects every web-enabled application: firefox, ping, synaptic, etc. The same hardware/router combo works with 9.04 but not with 9.10. In fact, when I did: sudo apt-get ubuntu-desktop ubuntu-standard after 9.10 minimal was installed, it downloaded over 400mb of packages without a hitch so my guess is that one of those packages either in ubuntu-desktop or ubuntu-standard is causing havok. Thoughts?

    Read the article

  • mod_deflate doesn't work [closed]

    - by kikio
    I want to gzip my static files, so I put this in .htaccess:

        <IfModule mod_deflate.c>
            AddOutputFilterByType DEFLATE text/text text/html text/plain text/xml text/css application/x-javascript application/javascript
        </IfModule>

    and looked for mod_deflate in the Loaded Modules section of the phpinfo() output, and I found it. But when I track server responses with Firebug, no gzipped file can be found:

        HTTP/1.1 200 OK
        Date: Sat, 08 Sep 2012 21:41:21 GMT
        Last-Modified: Sat, 08 Sep 2012 21:26:04 GMT
        Accept-Ranges: bytes
        Cache-Control: max-age=604800
        Expires: Sat, 15 Sep 2012 21:41:21 GMT
        Vary: Accept-Encoding
        Keep-Alive: timeout=3, max=50
        Connection: Keep-Alive
        Content-Type: text/css
        Content-Length: 18206

    What's the problem? I'm sure I have mod_deflate enabled (according to PHP's apache_get_modules()).

    UPDATE: the request headers:

        GET /d/jquery-ui.css HTTP/1.1
        Host: 127.0.0.1
        User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:15.0) Gecko/20100101 Firefox/15.0.1
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip, deflate
        DNT: 1
        Connection: keep-alive
        Pragma: no-cache
        Cache-Control: no-cache
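    A reproducible test outside Firebug helps here. Note the response does carry Vary: Accept-Encoding, which often means the deflate filter is at least being consulted even though it declines to compress. A hedged check from the command line:

        # does the server compress when the client advertises gzip?
        curl -sI -H "Accept-Encoding: gzip,deflate" http://127.0.0.1/d/jquery-ui.css \
            | grep -i 'content-encoding\|content-length\|vary'

    If Content-Encoding: gzip never appears, check that AllowOverride permits this .htaccess to take effect at all (with AllowOverride None the file is silently ignored), and note that text/text is not a registered MIME type, though the text/css case here should already be covered by the other entries.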

    Read the article
