Search Results

Search found 18035 results on 722 pages for 'location bar'.

Page 571/722

  • HTTP request hangs for exactly 150 seconds, then gives an incomplete response. How do I find out why?

    - by Nathan
    I am hosting a WordPress blog and having a strange problem. When I connect to the server (http://71.65.199.125/ at the time of this writing) it displays the title correctly and half of a download bar, indicating it has received some of the page; then it hangs for exactly 150 seconds (I timed it twice), then it sends the rest of the page, but without the stylesheet. After that it hangs indefinitely, continuing to say "connecting..." without making any progress. If you have any clues as to what might be happening, or how I could print PHP debug logs to see what it is waiting on during that hang, that would probably help. Recent changes I have made: I switched WordPress themes (though I did see it work once with the new theme); I moved the server to another building with an identical ISP and Linksys port-forwarding setup; I added a favicon.gif file to /var/www without linking to it from any of the WordPress pages; and there was an unanticipated power interruption. System info: Ubuntu 9.04, Apache 2, PHP 5, WordPress 2.9.2. Thank you
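
    For reference, a minimal way to capture what PHP is doing during the hang is to turn on error logging in wp-config.php (a sketch; the log path is an assumption and must be writable by the Apache user):

        // in wp-config.php, above the "stop editing" line
        define('WP_DEBUG', true);                          // WordPress debug mode (present in 2.9.x)
        error_reporting(E_ALL);                            // surface every notice and warning
        ini_set('log_errors', '1');                        // keep errors in a log instead of discarding them
        ini_set('error_log', '/var/log/php_errors.log');   // assumed path, writable by www-data

    Tailing that log (tail -f /var/log/php_errors.log) while a request hangs shows the last thing PHP complained about; a stall of a fixed length such as 150 seconds often points at a network timeout inside a theme or plugin, which the log would confirm or rule out.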

  • How does Tunlr work?

    - by gravyface
    For those of you not in the US, Tunlr uses DNS witchcraft to allow you to access US-only services and Websites (and UK-only stuff like BBC radio online) such as Hulu.com, etc., without using traditional methods like a VPN or Web proxy. From their FAQ: "Tunlr does not provide a virtual private network (VPN). Tunlr is a DNS (domain name system) unblocking service. We're using sophisticated technologies (a.k.a. the Tunlr Secret Sauce ©) to re-address certain data envelopes, tricking the receiver into thinking the envelope originated from within the U.S. For these data envelopes, Tunlr is transparently creating a network tunnel from your location to our U.S.-based servers. Any data that's not directly related to the video or music content providers which Tunlr supports is not only left untouched, it's also not even routed through Tunlr. In order to use Tunlr, you will have to change the DNS address. See Get started for more information." I can't really wrap my head around how this works; I have always assumed that these services performed a geolocation lookup via your client IP. Just really curious as to how this works. EDIT: I believe they're only proxying the initial geo check and then modifying the data stream request to include your real IP address so that the streaming is direct, not proxied.
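
    One way to see the trick from the outside (a sketch; the resolver address below is a documentation placeholder, to be replaced with the DNS address from Tunlr's "Get started" page) is to compare the answers an ordinary resolver and the unblocking resolver give for a supported hostname:

        # answer from an ordinary public resolver
        dig +short www.hulu.com @8.8.8.8
        # answer from the unblocking service's resolver (placeholder address)
        dig +short www.hulu.com @203.0.113.1

    If the second answer points at the service's own US-based relay while the first points at the provider's CDN, that matches the theory in the EDIT: the relay fronts the geolocation check, and the bulk media traffic then flows directly.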

  • Updated my WAMP Server and MySQL is eating up 580 MB of memory

    - by Jon
    I updated my dev box's WampServer, and along with updating PHP and Apache, MySQL updated to 5.6.12. After doing that, I copied the data folder from my old (5.1.36) install to the new one, and now MySQL takes up 580 MB, which is way too much, since I'm the only person using it (locally) and there are only 20 or so databases on it, none of which have MEMORY tables. How can I get this down to a decent amount? My my.ini:

        # For advice on how to change settings please see
        # http://dev.mysql.com/doc/refman/5.6/en/server-configuration-defaults.html
        # *** DO NOT EDIT THIS FILE. It's a template which will be copied to the
        # *** default location during install, and will be replaced if you
        # *** upgrade to a newer version of MySQL.

        [mysqld]

        # Remove leading # and set to the amount of RAM for the most important data
        # cache in MySQL. Start at 70% of total RAM for dedicated server, else 10%.
        # innodb_buffer_pool_size = 128M

        # Remove leading # to turn on a very important data integrity option: logging
        # changes to the binary log between backups.
        # log_bin

        # These are commonly set, remove the # and set as required.
        # basedir = .....
        # datadir = .....
        # port = .....
        # server_id = .....

        # Remove leading # to set options mainly useful for reporting servers.
        # The server defaults are faster for transactions and fast SELECTs.
        # Adjust sizes as needed, experiment to find the optimal values.
        # join_buffer_size = 128M
        # sort_buffer_size = 2M
        # read_rnd_buffer_size = 2M

        sql_mode=NO_ENGINE_SUBSTITUTION,STRICT_TRANS_TABLES

    Database info:

        Storage Engine    Data Size    Index Size    Total Size
        InnoDB            48.00 KB     0.00 B        48.00 KB
        MEMORY            0.00 B       0.00 B        0.00 B
        MyISAM            163.64 MB    122.49 MB     286.13 MB
        Total             163.69 MB    122.49 MB     286.18 MB
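
    For what it's worth, much of the jump from 5.1 to 5.6 is usually the Performance Schema, which MySQL 5.6 enables by default and 5.1 did not have; with an almost entirely MyISAM data set, the default InnoDB buffer pool is also pure overhead here. A sketch of additions to the [mysqld] section (the sizes are assumptions to tune, not recommendations):

        [mysqld]
        performance_schema = off         # on by default in 5.6; can account for hundreds of MB by itself
        innodb_buffer_pool_size = 32M    # 5.6 default is 128M; this server holds ~48 KB of InnoDB data
        key_buffer_size = 128M           # the cache that matters for ~122 MB of MyISAM indexes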

  • Apache reverse proxy with VirtualHost not serving a page

    - by Mr Aleph
    I have an Apache reverse proxy set up to pass requests to a Tomcat application. The config is similar to:

        <VirtualHost 100.100.100.100:80>
            ProxyPass        /AppName/App http://1.1.1.1/AppName/App
            ProxyPassReverse /AppName/App http://1.1.1.1/AppName/App
        </VirtualHost>

    I also have a page called summary.html that exists on 1.1.1.1 as http://1.1.1.1/AppName/summary.html. When I browse directly to it I have no problem viewing it, but if I try to get there via the reverse proxy I get a blank page. Wireshark shows me a 503, but this one is coming from the Apache reverse proxy (IP 100.100.100.100) and not the Tomcat (IP 1.1.1.1). Should I add http://1.1.1.1/AppName/ to the config? How? I tried it, but I get a blank page, and that attempt also exposes the internal IP of the Tomcat in the browser's URL bar, so, no go. Help is appreciated. Thanks. EDIT: This is the dump from Wireshark:

        GET /AppName/ HTTP/1.1
        Host: 100.100.100.100
        User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/534.52.7 (KHTML, like Gecko) Version/5.1.2 Safari/534.52.7
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Cache-Control: max-age=0
        Accept-Language: en-us
        Accept-Encoding: gzip, deflate
        Connection: keep-alive

        HTTP/1.1 404 Not Found
        Date: Tue, 30 Jan 2012 09:08:51 GMT
        Server: Apache
        Content-Length: 1
        Connection: close
        Content-Type: text/html; charset=iso-8859-1
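
    For reference, a sketch of proxying the whole application path rather than only the servlet, so that /AppName/summary.html is also fetched through the proxy (a ProxyPass ending in a slash maps the entire subtree; ProxyPreserveHost is optional but often needed when the backend builds absolute links):

        <VirtualHost 100.100.100.100:80>
            ProxyPreserveHost On
            ProxyPass        /AppName/ http://1.1.1.1/AppName/
            ProxyPassReverse /AppName/ http://1.1.1.1/AppName/
        </VirtualHost>

    The 404 in the capture is consistent with the original config: GET /AppName/ matches neither ProxyPass line, so Apache tries to serve it locally and finds nothing.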

  • Internet Explorer / Windows 7 does not want to show HTML file from local network drive

    - by Jaanus
    Setup: I have Windows 7 running inside VirtualBox on a Mac OS X host. I have a shared drive with some HTML files, which I am mounting as a local drive W: in Windows, from the VirtualBox server \\VBOXSVR. I want to look at them with a browser in Windows. Chrome in Windows 7 opens and shows those HTML files just fine (file:///W:/welcome.html). But Internet Explorer does not, and shows this error instead of the files:

        Internet Explorer cannot display the web page
        What you can try: [button: Diagnose Connection Problems]
        More information
        This problem can be caused by a variety of issues, including:
        - Internet connectivity has been lost.
        - The website is temporarily unavailable.
        - The Domain Name Server (DNS) is not reachable.
        - The Domain Name Server (DNS) does not have a listing for the website's domain.
        - If this is an HTTPS (secure) address, click Tools, click Internet Options, click Advanced,
          and check to be sure the SSL and TLS protocols are enabled under the security section.

    For the internet zone, the status bar shows: Internet | Protected Mode: On. IE settings are a mystery to me, and I could possibly get it to work by tweaking them, but I don't know which ones. How do I make IE show the same files that Chrome is happy to show? (Chrome showing them means that the files themselves are fine; there is something about the setup that just makes IE be a diva.)

  • Windows XP / Outlook 2003 error messages

    - by AboutDev
    Can anyone help with this issue? I am trying to help someone and could use some expertise. Error message #1, "Microsoft Office Small Business Edition 2003", with a CD icon:

        The feature you are trying to use is on a CD-ROM or other removable disk that is not available.
        Insert the 'Microsoft Office Small Business Edition 2003' disk and click OK.
        Use source: Microsoft Office Small Business Edition 2003

    I first got this message after the CD was inserted to recover the partial file STDP11N. STDP11N was recovered; however, the pop-up window with the error message still appears each time Outlook opens. Old programs had accidentally been cleaned up and suddenly this was missing, so Microsoft Office Small Business Edition 2003 was reinstalled using the install CD. Outlook worked, but the error keeps popping up each time I open Outlook. Hit OK. Error message #2:

        The path 'Microsoft Office Small Business Edition 2003' cannot be found. Verify that you have
        access to this location and try again, or try to find the installation package 'STDP11N.MSI' in
        a folder from which you can install the product Microsoft Office Small Business Edition 2003.

    Hit OK. Back to error message #1. Hit close window. Error message #3:

        Error 1706. Setup cannot find the required files. Check your connection to the network, or
        CD-ROM drive. For other potential solutions to this problem, see
        C:\Program Files\Microsoft Office\OFFICE11\1033\SETUP.CHM

    Error message #4 (I'd created a file under the D: drive on an external drive):

        The path specified for the file D:...etc.. .pst is not valid.

    Hit OK. It brings up a window to look in My Documents.

  • How do I restore to a delta file (disk) on VMware ESXi?

    - by Oscar
    Using VMware ESXi (the freebie version) I have a virtual machine (a Win2k3 R2 server). When I first provisioned it I took a snapshot of it. I recently tried to clone the primary drive using my standard hardware-based method of growing a Windows disk (using Knoppix, clone the drive to a new drive, make it bootable, then extend the partition via diskpart from within Windows). This process failed; I tried setting the cloned drive (via the VMware GUI) to replace the original drive, boot, and be done. This didn't work out so well. The machine never booted. I checked the boot order, the disk location, and all the basics I usually do. As a failsafe, I then tried changing all the settings back so the machine would boot to the original drive and I could figure out (as I eventually did) a better way of growing the disk. However, when I powered on the machine with the original drive, it reverted back to that initial snapshot I created; it lost all the changes since. I looked in the file system and found a few files; I think the key file here is one named "delta", and I'm assuming that's the disk I want, but I can't find a way to have the virtual machine actually use that drive/file. It isn't available to add when I go to add an existing drive. Do I need to somehow commit that delta to the original drive and then boot from it again? Can you point me in the right direction? I've since discovered the proper way of growing drives using vmkfstools, but I need to get back to the original state of the machine to try this out. Any help would be greatly appreciated.
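
    For reference, a snapshot's writes land in a delta VMDK, and the VM boots from whichever disk its .vmx references; the delta is attached via its small descriptor file, never the flat "-delta" extent itself, which is why the delta file doesn't show up as an attachable disk. A sketch of the relevant .vmx lines (the file names are assumptions based on typical naming):

        # in the VM's .vmx, edited with the VM powered off
        scsi0:0.present  = "TRUE"
        scsi0:0.fileName = "win2k3-000001.vmdk"   # assumed snapshot descriptor; its parent is win2k3.vmdk

    Alternatively, Snapshot Manager's "Delete All" commits the delta back into the parent disk, after which the machine should boot with all post-snapshot changes on the original drive.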

  • Modify or disable Windows 8 swipe gestures on touchpad / laptop

    - by Matsemann
    I have an ASUS G75VW laptop with a Synaptics touchpad. When I move my finger from one edge towards the middle (a swipe), Windows 8 brings up various things. This is a problem because the area where I can actually move the mouse with my finger becomes too small (and I mostly use the top left of the touchpad). So I often end up doing a swipe and bringing up some menu, or doing the swipe so slowly that no menu appears but the mouse pointer also doesn't move. Quite annoying. Swiping from the left edge originally switched apps like crazy. I disabled that, so now it only brings up the same menu as pressing Win+Tab (or sometimes the charms bar; I never know which). I changed that via: Win+I - Change PC settings - General - "When I swipe from the left edge, switch directly to my most recent app". I've tried the mouse settings in Control Panel, the driver settings for my touchpad, and searching for "swipe" and "gestures" on my computer (which is what led me to the setting above), with no luck. How can I disable the swipe gestures, or change what they do? Thanks.

  • Completed downloads freeze Windows

    - by Ben Hooper
    The issue: shortly after a file download via Google Chrome for Windows completes, the download will get stuck on "0 seconds left" and all other programs (except Google Chrome, for some reason, though browsing will not work) completely freeze into Windows' infamous "Not Responding" state, affecting Explorer particularly badly. Eventually the programs recover themselves, but they recover significantly faster if you cancel the file download, relative to how quickly you react. Performing the exact same operation immediately after cancelling the download usually works without issue. This issue occurs with any file type (.ZIP, .MSI, .MSG, .PNG, .URL, etc.) of any size from any source (Dropbox, SourceForge, Imgur, even tiny and locally generated BLObs created by my own Chrome extension, etc.) to any location. Potential causes: as this issue is so inconsistent, I haven't been able to prove whether it is Chrome-specific or caused by my system or my Chrome configuration, but it's happening on both my work and home PCs. I originally suspected that it was caused by security software scanning completed downloads for threats, but I'm not as confident in that theory anymore, as the issue persisted even after changing my security software from ESET NOD32 and Malwarebytes Anti-Malware Pro to ESET Endpoint and then to Microsoft Security Essentials. System information (of both PCs): Windows 7 Service Pack 1, 64-bit; Google Chrome version 30.0.1599.101 (but this has been happening for a long time).

  • Delete on Windows Vista and Seven -- discovery process

    - by M'vy
    Hi SUs! I've recently encountered a problem. Using SVN at work, I needed to clear some space. As you may know, SVN working copies are full of subdirectories and files, so the delete process begins with a step of discovering the items to be deleted (I guess this is for displaying the progress bar). In my case it was still running after I had watched Braveheart (off-topic: a good film in my opinion; on-topic: it lasts 2h50) and it had counted 440,000+ files. I finally decided to kill the process and use the good old cmd with a del <directory> to do the job (done in a few minutes). So I'm wondering: does anyone know how to override the system so it actually begins deleting while still scanning the remaining items? In the end I just want the files to be deleted; I don't care how many there are to delete, only how long it takes. Thanks
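
    For reference, a sketch of the command-line route, which deletes as it walks the tree instead of enumerating everything up front (the path is a placeholder):

        rem delete all files recursively, then remove the emptied directory tree
        del /f /s /q C:\work\big-svn-checkout
        rmdir /s /q C:\work\big-svn-checkout

    Explorer's discovery pass exists to size the progress bar and pre-flight conflicts, and there is no supported setting to skip it, so on huge trees the cmd route (or Shift+Delete, to at least bypass the Recycle Bin) is the practical answer.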

  • How to set up daisy-chained routers for separate subnets?

    - by joe
    This question seems to be similar to others, but I'll take a shot anyway. A client recently switched ISPs from TDS to Comcast Business Class. Before the switch, they had 5 static IP addresses assigned; now they'll have a single IP address that will change whenever Comcast decides to change it. The issue is that this internet connection will be shared between two companies, both having (and wanting to keep) their own private subnets. Because TDS was supplying multiple IP addresses to the one location, I could put each company's router directly on the switch. Now, with Comcast, they only get one IP address, meaning there has to be a main router in front of the subnet routers. Luckily, the cable modem has a built-in router, which I would like to connect to each company's router, with DHCP still enabled on all of them. Question: what do I need to do to the subnet routers to keep them separate from each other, but still allow internet access through the main router? I would love to say "I tried this" and give you links, but everything I find on the internet only mentions daisy-chaining routers with DHCP disabled.
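
    For illustration, one workable scheme (all addresses are assumptions): give each company's router a static WAN address on the modem's LAN, and each its own private subnet with DHCP on the LAN side; each router's NAT then hides its company's hosts from the other:

        Comcast gateway (built-in router)   LAN 10.1.10.1/24, DHCP on
        Company A router                    WAN 10.1.10.2 (static), LAN 192.168.1.1/24, DHCP on
        Company B router                    WAN 10.1.10.3 (static), LAN 192.168.2.1/24, DHCP on

    Both LANs reach the internet through the gateway, and unsolicited traffic cannot cross from one company's LAN to the other because each subnet router only forwards return traffic for its own NAT table. The caveat is that both LANs can still reach the 10.1.10.0/24 transit network itself, so nothing sensitive should sit directly on it.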

  • Interesting phenomenon with Windows Server 2008 R2 user access controls and NTFS ACLs

    - by Simon Catlin
    One to try, and I'd appreciate any thoughts on this. On a Windows Server 2008 R2 box (or presumably 2008 R1, Windows Vista, or Windows 7): i) log on as an administrator and create a new NTFS volume; ii) blow away the standard MS ACLs on the root of the volume (which are laughable) and replace them with Administrators: Full Control and System: Full Control, e.g.:

        echo Y|cacls.exe d:\ /g "Administrators:F" "SYSTEM:F"

    iii) Now, from a Command Prompt or PowerShell window, switch to that drive (cd /d D:\ or Set-Location D:\). Works fine, no issues. iv) Now try to browse to the root of the new volume using Explorer: access denied. I've kind of convinced myself that it is UAC getting in the way, as you can add "Authenticated Users: List" access to D:\ and Explorer then works. I can only assume that Explorer isn't able to use the "admin" token for the Administrator. Browsing to explorer.exe and doing a "Run as administrator" has no effect. Any thoughts? Cheers in advance.
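
    For reference, a sketch of the workaround described above, in the same style as step ii (cacls is deprecated on 2008 R2, so the icacls line is the closer-to-supported spelling; the exact rights to grant are a judgment call):

        rem edit the ACL in place (not replace) so Authenticated Users can list the root
        echo Y|cacls.exe d:\ /e /g "Authenticated Users:R"
        rem or, with icacls, grant read/execute on the root:
        icacls d:\ /grant "Authenticated Users":(RX)

    This fits the UAC theory: Explorer always runs with the filtered, non-elevated token and does not elevate for ordinary folder browsing, so a root ACL admitting only Administrators and SYSTEM locks it out even for members of Administrators, while an elevated cmd.exe (full token) gets through.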

  • How can one use online backup with large amounts of static data?

    - by Billy ONeal
    I'd like to set up an offsite backup solution for about 500 GB of data that's currently stored between my various machines. I don't care about data retention rates, as this is only a backup of, not primary storage for, my data. If the backup is stored on crappy non-redundant systems, that does not matter. The data set is almost entirely static, and mostly consists of things like installers for Visual Studio and installer disk images for all of my games. I have found two services which meet most of this: Mozy and Carbonite. However, both services impose low bandwidth caps, on the order of 50 kb/s, which prevent me from backing up a data set of this size effectively (it would take somewhere on the order of 6 weeks), despite the fact that I get multiple MB/s upload speeds everywhere else from this location. Carbonite has the additional problem that it tries to ignore pretty much every file in my backup set by default, because the files are mostly .iso and .vmdk files, which aren't backed up by default. There are other services, such as EC2, which don't have such bandwidth caps, but such services typically store data on highly redundant servers and therefore cost on the order of 10 cents/GB/month, which is insanely expensive for storage of this kind of data set. (At $50/month I could build my own NAS to hold the data, which would pay for itself after 2-3 months; to be fair, they're offering quite a bit more service than I'm looking for at that price, such as public HTTP access to the data.) Does anything exist meeting those requirements, or am I basically hosed?

  • Windows 7 ICS client web failure

    - by n8wrl
    I have several Windows 7 PCs connected on a LAN via a hub. One has a Verizon 3G connection and works great. I have Internet Connection Sharing enabled on it, which automagically set the LAN connection to 192.168.137.1 and enabled DHCP. I am trying to get the client PCs working one at a time; the others are off. The client is able to: get an IP via DHCP with correct settings; ping any web address I can throw at it, so DNS and routing are working; and run Windows Update. But websites hang in IE, all but google.com! I type www.msn.com, microsoft.com, amazon.com, etc. They all ping from a cmd window, but IE just hangs: it says website found, but the green progress bar just slowly creeps and no content displays. www.google.com comes up even after clearing the browser and DNS caches. I am pulling my hair out; what am I missing? EDIT: After some more gyrations with a router, I'm back to ICS. Same symptoms, only now I have an answer to Andrew's question: YES, I can do Google searches, but clicking on any of the result links hangs! I let one sit for half an hour with no timeout or error.
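
    For what it's worth, these symptoms (ping and DNS fine, Google loads, larger pages stall mid-transfer with no error) are the classic signature of an MTU/path-MTU blackhole, which shared 3G links are prone to; a hedged test is to lower the client's MTU and see whether the hangs stop (the interface name is a placeholder and 1400 is just a test value):

        rem list interface names and current MTUs
        netsh interface ipv4 show subinterfaces
        rem temporarily lower the MTU on the LAN adapter
        netsh interface ipv4 set subinterface "Local Area Connection" mtu=1400 store=persistent

    If pages start loading, raise the value until it breaks again to find the real limit, or apply the equivalent fix on the ICS host instead.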

  • Shortcut to "printer and faxes" on another computer

    - by Doltknuckle
    I have a print server running Windows Server 2008 that has about 50 printers on it. In Windows XP, I was able to connect to the server using the UNC name and make a shortcut to the "Printers and Faxes" folder. (For the record, I know that it isn't really a folder, but that's outside the scope of this question.) I have recently switched to Windows 7, and I find that the jump lists are really useful. One of the things I want to do is make it easy to connect to that server's "Printers and Faxes" folder. I would like to use something like a shortcut that I can open and go immediately to that location. The problem is that Windows 7 doesn't have a way to create a shortcut like you could in Windows XP. It has a button on the toolbar that says "View remote printers", which sends you to the correct folder, but I'd like to avoid having to type out the server name. I also can't use the "View network" link in Windows Explorer: our organization has over 6,000 machines, and viewing the network lists all of them. This is all about saving time by using the minimum number of mouse clicks and key presses in normal operation. Does anyone have any suggestions?

  • Forcing programs to be installed to another drive

    - by zyboxenterprises
    I have an SSD as my main Windows drive, with a 640 GB 2.5" HDD partitioned to store programs and user settings, and also to act as backup (it's the only thing I had lying around at the time of building my PC). The goal was to make the PC as fast as possible while having increased storage capacity available for normal user data, and to assist in my small data-recovery business. The problem is that whenever I install a program, it installs to C:\Program Files\ [or C:\Program Files (x86)\ for the 32-bit programs], although I have changed the environment variables. This wouldn't normally be an issue, except that every installer points its shortcuts at my 640 GB HDD. The root layout of both drives: (screenshot omitted). To clarify: program files get installed to C:\, while program shortcuts always point to Z:\, my 640 GB HDD. Modifying the relevant environment variables doesn't do anything. I looked at this, but it only talks about modifying the registry and environment variables, which I have already done. I install to the Z:\ drive when the installer lets me change the installation path, but some installers don't let me change it. Is there a way I can force every program to install to the relevant location on Z:\? Perhaps I'm missing something here? Edit: I found this program; would it be appropriate to use in my case? I would be able to move the entire Program Files (and its x86 version) to Z:\ without impacting performance.

  • Implementing an isolated guest WLAN via IPSec VPN on Windows

    - by sysadmin1138
    We are attempting to set up a guest WLAN network that is isolated from the rest of our network. This is proving difficult for a couple of technical reasons. My first choice was to use a separate VLAN, on which our firewall's handy WLAN port would handle DHCP, DNS, and the network isolation we need. Unfortunately, because our main office and our internet connection are in different locations connected by a Metro Ethernet link, I'm at the mercy of our ISP for VLAN transit, and they won't pass a second VLAN between our two sites. My hardware also doesn't support 802.1ad "Q-in-Q", which would have solved this problem. So I can't use the VLAN method for isolation, at least not without spending money. As our firewall can handle IPSec site-to-site VPN connections, I am hoping it is possible to connect a Server 2008 R2 (Standard) server I have in the office location to the WLAN and have it provide gateway services to the firewall. Thusly (diagram omitted): WLAN -> Server 2008 R2 gateway -> IPSec tunnel -> firewall. Unfortunately, I don't know if it is possible to connect the two this way. The firewall has a pretty flexible IPSec/L2TP implementation (I've used it to connect iPads in the wild), but it is neither Kerberized nor supports NTLM. The Connection Security Rules view on the Windows server seems to get close to what I think needs to be done, but I'm failing to figure out how to make it do what I need. Is this even possible, or do I need to pursue an alternate solution?
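
    For reference, the Connection Security Rules MMC view maps onto netsh; a sketch of a preshared-key rule requiring IPSec between the guest subnet and the firewall (every name, subnet, and key here is an assumption, and whether the firewall's IPSec stack will negotiate with Windows on a PSK is exactly the open question):

        rem on the Server 2008 R2 gateway
        netsh advfirewall consec add rule name="GuestWLAN-to-Firewall" ^
            endpoint1=192.168.50.0/24 endpoint2=10.0.0.1 ^
            action=requireinrequireout auth1=computerpsk auth1psk="example-preshared-key"

    Given that the firewall is neither Kerberized nor NTLM-capable, preshared key (or computer certificates, auth1=computercert) are the only first-phase authentication methods likely to interoperate.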

  • Outlook 2007 + Exchange 2010 (Save All Attachments)

    - by RobertPitt
    About 3 weeks back our company upgraded our mail system to Exchange 2010. All went smoothly, with a few issues but nothing major. A few days ago we had a call from a colleague who was unable to save all attachments from File > Save As > Save All Attachments. When an email has a single attachment it works perfectly normally, and depending on the file type it allows you to save multiple attachments. But there are a lot of file types that will not work, such as zip, pdf, doc, etc. Usually a location box opens up asking where we would like to drop the attachments, but now nothing happens: you click Save All Attachments and nothing happens. After hours of research I have come across mixed results. A lot of people on forums explain that their issues started when they recently crossed over to Exchange 2010. On the other hand, Microsoft released a KB article (278188) covering this, which would be discouraging, except that it was published in 2007, per its time stamp, while Exchange 2010 only came out recently. I'm looking to see if you guys have any clues as to what could be causing this, or anything server-side that I can take a look at (AD, Exchange, ...). Any help on this is greatly appreciated.

  • IIS 7, FastCGI, PHP and custom php.ini files

    - by Marlon
    I'm running PHP 5.3, FastCGI, and IIS 7 on Windows Server 2008. I have a site for which I would like to configure its own php.ini settings, but things aren't working as expected. I am following the tutorial located here. This is what I have done so far: 1) configured a new website with its own app pool; 2) selected PHP 5.3.6 from the PHP Manager available on the website home in IIS (not the web server home, which sets the global version of PHP); 3) added the following lines to the <fastCgi> section of the applicationHost.config file located at system32\inetsrv\config:

        <application fullPath="C:\Program Files (x86)\PHP\v5.3\php-cgi.exe"
                     arguments="-d open_basedir=C:\inetpub\wwwroot\kickasswebsite.com"
                     maxInstances="4" idleTimeout="300" activityTimeout="30" requestTimeout="90"
                     instanceMaxRequests="200" protocol="NamedPipe" queueLength="1000"
                     flushNamedPipe="false" rapidFailsPerMinute="10">
            <environmentVariables>
                <environmentVariable name="PHPRC" value="c:\inetpub\wwwroot\kickasswebsite.com" />
            </environmentVariables>
        </application>

    4) I then created a php.ini file located in C:\inetpub\wwwroot\kickasswebsite.com (the root of the website), containing:

        register_globals = on

    5) I then ran test.php, which simply outputs everything phpinfo() returns. At this point, I observe that the global setting is register_globals = off (as it should be), but the local setting is also register_globals = off, even though I specified it differently in the php.ini file I created at the root of the site. Furthermore, I see these settings in the phpinfo() output:

        Configuration File (php.ini) Path           C:\Windows
        Loaded Configuration File                   C:\Program Files (x86)\PHP\v5.3\php.ini
        Scan this dir for additional .ini files     (none)
        Additional .ini files parsed                (none)

    What am I messing up, or is there a different way to go about this?
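
    For what it's worth, a sketch of an alternative that sidesteps PHPRC entirely: under FastCGI, PHP 5.3 scans for per-directory .user.ini files on its own (the file name comes from user_ini.filename in the master php.ini, and PHP_INI_PERDIR directives such as register_globals are allowed there):

        ; C:\inetpub\wwwroot\kickasswebsite.com\.user.ini
        ; picked up automatically; cached for user_ini.cache_ttl (300 seconds by default)
        register_globals = On

    The phpinfo() rows quoted above are also the quickest diagnostic: when PHPRC actually reaches php-cgi.exe, "Loaded Configuration File" switches to the site's own php.ini, so a value still pointing at the global install means the environment variable was never seen (e.g., applicationHost.config edited in the wrong place, or the app pool not recycled).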

  • Super slow opening my downloads folder

    - by Mark
    I have an exe file in my Downloads folder that I half-downloaded through uTorrent (it's not piracy; it's a legit file from people who use BitTorrent to distribute large files). I think I tried to open it while it was still sharing, that is, I did not stop the upload. That actually froze my computer. When I restarted, I set the file to be deleted in uTorrent. Unfortunately, even though uTorrent doesn't see that file anymore, it's still visible in my Downloads folder. Whenever I try to open my Downloads folder, it literally takes 10 minutes or more. It opens, but is empty, and the blue progress bar needs a long time to complete. After completion I can use the Downloads folder normally, but opening and closing things in that folder takes a long time. I see the exe that I tried to download. I tried to delete it, but it was taking so long (30+ minutes) that I eventually hit cancel. That didn't even work and was slowing down the computer; I couldn't figure out how to stop the delete, so I just pulled the plug. Should I just forget about that Downloads folder and set a new one? Is there something I can do? Thanks.
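
    For reference, deleting the leftover file from a Command Prompt sidesteps Explorer's pre-delete scanning and thumbnail/metadata reads, which are usually what make a folder containing one damaged or sparse file crawl (the file name is a placeholder):

        del /f /q "%USERPROFILE%\Downloads\half-downloaded-file.exe"

    If even that stalls, trying it in Safe Mode, or after ending the explorer.exe process, rules out antivirus or shell extensions holding the file open.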

  • How can I use wildcards in an Nginx map directive?

    - by Ian Clelland
    I am trying to use Nginx to serve cached files produced by a web application, and I have spotted a potential problem: the URL space is wide and will exceed the ext3 limit of 32,000 subdirectories. I would like to break up the subdirectories, making, say, a two-level filesystem cache. So, where I currently cache a file at /var/cache/www/arbitrary_directory_name/index.html, I would instead store it at something like /var/cache/www/a/r/arbitrary_directory_name/index.html. My trouble is that I can't get try_files, or even rewrite, to make that mapping. My searching on the subject leads me to believe that I need to do something like this (heavily abbreviated):

        http {
            map $request_uri $prefix {
                /aa*  a/a;
                /ab*  a/b;
                /ac*  a/c;
                ...
                /zz*  z/z;
            }
            location / {
                try_files /var/cache/www/$prefix/$request_uri/index.html @fallback;
                # or
                # if (-f /var/cache/www/$prefix/$request_uri/index.html) {
                #     rewrite ^(.*)$ /var/cache/www/$prefix/$1/index.html;
                # }
            }
        }

    But I can't get the /aa* pattern to match the incoming URI. Without the *, it will match an exact URI, but I can't get it to match just the first two characters. The Nginx documentation suggests that wildcards should be allowed, but I can't see a way to get them to work. Is there a way to do this? Am I missing something simple? Or am I going about this the wrong way?
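
    For what it's worth, a sketch of one way through: map compares literal strings unless the pattern starts with ~ (regex), and a regex with named captures can build the value directly, replacing the 676-entry table (the character class and the @fallback handling are assumptions carried over from the question):

        map $request_uri $prefix {
            default                        "";
            "~^/(?<c1>[a-z])(?<c2>[a-z])"  "$c1/$c2";   # /arbitrary_... -> a/r
        }

        server {
            root /var/cache/www;
            location / {
                # try_files paths resolve against root, so the /var/cache/www prefix is dropped
                try_files /$prefix$uri/index.html @fallback;
            }
            location @fallback {
                proxy_pass http://127.0.0.1:8080;   # placeholder for the web application
            }
        }

    The wildcard-looking masks in the documentation (*.example.com) apply only to hostname maps declared with the hostnames parameter, which is why /aa* never matches a URI.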

  • How to take a search query and append modifiers to the end of it

    - by Kimber
    This is a Greasemonkey question. What I'm trying to do is modify an old Google Discussions script, so that we can take the Google search query and add modifiers to the end of it. Like this:

        search query:  "superuser"
        modifiers:     inurl:greasemonkey+question
        end result:    "superuser" inurl:greasemonkey+question

    The old script creates a new div within the "hdtb_more_mn" element, which is where you get the new Discussions tab. However, since the "tbm=dsc" option to do a discussion search has died, this script no longer works; hence the need to add modifiers to your searches. I tried to edit the script, but it appends the modifiers to the end of the URL, which includes "&client=firefox-a&hs=8uS&rls=org.mozilla:en-US:official". This means you're also searching for the above as well as your query, which doesn't work. I would like to append the modifiers to the end of the search query, rather than the whole URL. I'm just not sure how to code it so that it adds the "&tbm=" stuff within "discussionDiv.innerHTML" (below) to the end of the query. The Google search box id seems to be "gbqfq", but I'm not sure how to add this id. Here is the old script:

        // ==UserScript==
        // @name          Add Back Google Discussions
        // @version       1.4
        // @description   Adds back the Discussion filters to Google Search
        // @include       *://*.google.tld/search*
        // ==/UserScript==

        var url = location.href;
        if (url.indexOf('tbm=dsc') < 0)
            addFilterType('dsc', 'Discussions');

        function addFilterType(val, name) {
            var searchType = document.getElementById('hdtb_more_mn');
            var discussionDiv = document.createElement('DIV');
            discussionDiv.className = 'hdtb_mitem';
            discussionDiv.innerHTML = '<a class="q qs" href="' +
                (url.replace(/&tbm=[^&]*/g, '') + '&tbm=' + val) +
                '">' + name + '</a>';
            searchType.innerHTML += discussionDiv.outerHTML;
        }

    Thanks for any help, or suggestions on who to ask. Google Chrome has an extension for discussion searches, but FF doesn't seem to have one as of yet, which is why I'm trying to modify the above.
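
    For what it's worth, a sketch of the specific piece being asked about: rather than concatenating onto location.href, rewrite only the q= parameter, leaving client, hs, rls and friends untouched (the modifier string is an example):

        // append extra search terms to the q parameter only (a sketch)
        function addModifiersToQuery(href, modifiers) {
            return href.replace(/([?&]q=[^&]*)/,
                                '$1+' + encodeURIComponent(modifiers).replace(/%20/g, '+'));
        }

        // usage inside addFilterType, in place of the plain url + '&tbm=' concatenation:
        var discussionHref = addModifiersToQuery(location.href, 'inurl:greasemonkey question');

    Google treats + as a space inside q, so "q=superuser" becomes "q=superuser+inurl%3Agreasemonkey+question", which matches the "end result" form above.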

  • How do I include worksheets 3 and 4 in the cell formula provided?

    - by user21255
    I have kindly been given this formula, with an explanation of how it works. Insert this formula into cell B4 of the sheet "Cases":

        =IF(NOT(ISBLANK('1st'!B25)),'1st'!B25,
           IF(NOT(ISBLANK(INDIRECT("'2nd'!R" & (ROW($B4)-(COUNTA('1st'!$B:$B)-COUNTA('1st'!$B$1:$B$24))-4+25) & "C" & COLUMN(B4),FALSE))),
              INDIRECT("'2nd'!R" & (ROW($B4)-(COUNTA('1st'!$B:$B)-COUNTA('1st'!$B$1:$B$24))-4+25) & "C" & COLUMN(B4),FALSE),
              ""))

    Copy the formula to the other cells in the worksheet; the relative addresses will adjust automatically. The formula works like this: check if there is content in 1st. If yes, copy it. If no, find out how many entries there are in 1st in total (this is done by using the COUNTA function on the whole B column in 1st and subtracting the number of non-empty cells above the actual case data). Use this information, together with the current cell's row number, to find the location of the cell that has to be copied from 2nd. Create the address of that cell and use the ISBLANK function on the INDIRECT function with that address to check whether the cell is empty. If it is not, use the INDIRECT function again to display it. If it is empty, just display an empty string. Now, this works fine when I have only 2 sheets. But let's say I want to include a third and fourth sheet (named 3rd and 4th respectively): what should I put in the formula above, and where? There are actually 31 sheets, but if I know how to add the 3rd and 4th sheets to the formula, I can figure out how to do the rest. Thanks
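
    For illustration, the pattern for chaining in more sheets is to replace the final "" with another NOT(ISBLANK(INDIRECT(...)))/INDIRECT(...) pair for the next sheet, extending the row offset by the total entry count of every earlier sheet. Schematically (not a paste-ready formula; each <n-th ...> stands for the same INDIRECT construction used for 2nd, with the subtraction extended):

        =IF(<1st has a value>, <1st value>,
          IF(<2nd non-blank at row - entries(1st)>, <2nd value>,
            IF(<3rd non-blank at row - entries(1st) - entries(2nd)>, <3rd value>,
              IF(<4th non-blank at row - entries(1st) - entries(2nd) - entries(3rd)>, <4th value>,
                ""))))

    With 31 sheets this nesting gets unwieldy (and Excel versions before 2007 cap IF nesting at 7 levels), so past a handful of sheets it may be worth generating the formula programmatically or restructuring the data instead.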

  • Storing secure keys on Ubuntu web server

    - by Sencha
    I'm running Ubuntu 12.04 Precise with a DUNG (Django, Unix, Nginx & Gunicorn) environment, and my app (as well as various config files) is stored in a Python virtual environment inside /srv, which the www-data user has access to. The nginx and gunicorn processes all run as www-data. My web app requires secure credentials, which I am storing in an environment.sh file. This file contains various exports and is run using source before the gunicorn processes execute. My concern is the location of the environment.sh file and its permissions. Will it be okay storing this file inside the /srv folder that www-data has access to? Or should it be stored and owned by root somewhere else, such as /var/myapp/environment.sh? Also, regarding the www-data user: if any of my web processes (which run as www-data) are compromised and someone gains access to them, does that mean the attacker could potentially read any file on the system, even if they can't write? Including my secure keys?
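
    For reference, a common arrangement is root ownership with group read for the service account, so the gunicorn processes can still source the file while every other unprivileged account is shut out; a sketch (the exact path is an assumption):

        # owned by root, readable only by root and the www-data group
        sudo chown root:www-data /srv/myapp/environment.sh
        sudo chmod 640 /srv/myapp/environment.sh
        # verify: the service account can read it, other accounts cannot
        sudo -u www-data cat /srv/myapp/environment.sh > /dev/null && echo ok

    On the second question: a process running as www-data can read exactly what www-data's user, group, and world permission bits allow across the filesystem, no more, so tightening world-readable files is what limits the blast radius.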

  • Need a place to store a few bytes of meta information on storage media

    - by Jason C
    I'm working on an embedded project. I need a place to store a few bytes of filesystem-independent meta information on a storage device. The device has an MSDOS partition table. The device may also have unallocated space (depending on its size), but it will be TRIMmed (and may also be blown away by new partitions in the future). I need a location on the device that is not unallocated and that has a low risk of being touched (outside of completely erasing the device). The device is only guaranteed to have an MBR at the point the metadata first needs to be written, meaning there are no EBRs/VBRs present that I could use. There are 446 bytes at the very start of the device available for MBR bootstrap code. Currently my only idea is to store data at the end of this block, but the device is bootable and I have no way of knowing whether I'd be blowing away bootstrap code or not. The sector size is 512 bytes and the MBR is the first sector; I'm pretty sure (correct me if I'm wrong) that means the second sector is available for use by partition data, so I can't use that either. Does anybody have any ideas? I need 4 bytes of space.
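
    One caution on the end-of-bootstrap idea: on disks partitioned or booted by Windows, bytes 440-443 of sector 0 hold the NT disk signature (with 444-445 reserved), so the tail of the 446-byte area is already spoken for. For reference, a sketch of reading and writing a few bytes at a fixed MBR offset with dd (the device name and the offset 436 are placeholders to adapt, and the same risk of overlapping boot code applies to any offset below 440):

        # read 4 bytes at offset 436 of the MBR
        dd if=/dev/sdX bs=1 skip=436 count=4 2>/dev/null | xxd
        # write 4 bytes at the same offset, without truncating the device
        printf '\xde\xad\xbe\xef' | dd of=/dev/sdX bs=1 seek=436 count=4 conv=notrunc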
