Search Results


  • how to make man page not disappear on exit

    - by Alan
    Probably a silly question, but I couldn't beat Google into telling me the answer, so I'm posting here. I have two machines: Slackware 13 and Fedora 11. On the Slackware machine, when I use man I can scroll all the way to the bottom, then exit man, and the info stays in my terminal window (which I find very convenient, as I can read it while typing the command in question, copy-paste the options, etc.). On Fedora, when I close man, the man page info is gone. How can I configure man (or is it the terminal?) not to remove the man page info on exit?
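
    For what it's worth, a hedged guess at the mechanism: on Fedora, man's pager (less) switches the terminal to its alternate screen and restores the previous contents on exit, while the Slackware setup doesn't. Telling less to skip its termcap init/deinit strings keeps the output in the scrollback. A minimal sketch:

        # keep man output on screen after quitting (add to ~/.bashrc)
        export MANPAGER="less -X"

    less -X (--no-init) is the relevant flag; it works regardless of whether man's defaults, less, or the terminal is to blame.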

  • Problem with single quotes in man pages

    - by Peter
    When I ssh into my Debian Lenny server and open a man page, single quotes appear to be messed up. Example from the man page of apt-get:

        If no package matches the given expression and the expression contains one of ´.´, ´?´ or ´*´
        then it is assumed to be a POSIX regular expression, and it is applied to all package names in
        the database. Any matches are then installed (or removed). Note that matching is done by
        substring so ´lo.*´ matches ´how-lo´ and ´lowest´. If this is undesired, anchor the regular
        expression with a ´^´ or ´$´ character, or create a more specific regular expression.

    I'm on Mac OS X and using xterm. If I use Terminal, the problem doesn't happen. My locale is configured correctly as far as I can see:

        $ locale
        LANG=en_US.UTF-8
        LC_CTYPE="en_US.UTF-8"
        LC_NUMERIC="en_US.UTF-8"
        LC_TIME="en_US.UTF-8"
        LC_COLLATE="en_US.UTF-8"
        LC_MONETARY="en_US.UTF-8"
        LC_MESSAGES="en_US.UTF-8"
        LC_PAPER="en_US.UTF-8"
        LC_NAME="en_US.UTF-8"
        LC_ADDRESS="en_US.UTF-8"
        LC_TELEPHONE="en_US.UTF-8"
        LC_MEASUREMENT="en_US.UTF-8"
        LC_IDENTIFICATION="en_US.UTF-8"
        LC_ALL=

    I'm not sure what's wrong with my environment, and I have no idea what to check next. I'd appreciate help.
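
    A hedged first check, since Terminal works and xterm doesn't: the locale output above is from the Mac side, and the locale inside the ssh session can differ (sshd only imports forwarded LC_* variables if its AcceptEnv setting allows them). xterm also has to be running in UTF-8 mode before multibyte quote characters render correctly. Two things worth trying:

        # on the Lenny server, inside the ssh session:
        locale        # if this reports POSIX/C, generate and export en_US.UTF-8 there

        # on the Mac, start xterm in UTF-8 mode (or set the X resource XTerm*utf8: 1)
        xterm -u8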

  • can't open web pages in google chrome

    - by javapowered
    When I type "google.com" into Google Chrome I receive this error: "404. That's an error. The requested URL /cgi-bin/authme?s=0.2068704296834766 was not found on this server. That's all we know." It seems I'm being redirected somehow, for some reason. Could it be a problem with the Lenovo notebook? At the same time, IE works fine. The problem is reproducible with other web pages too; for example, I'm redirected from vkontakte.ru to a page that doesn't exist either: http://www.vkontakte.ru/cgi-bin/authme?s=0.7140683813486247. I also see the message "Lütfen yönlendirilirken bekleyin" ("Please wait while you are redirected", in Turkish) when Chrome starts, which is very strange because I only have Russian and English languages installed on the system.
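
    Given that the bogus /cgi-bin/authme redirects hit several unrelated sites and the startup message is in Turkish, this looks more like a proxy or hosts-file hijack on the machine than a Lenovo or Chrome defect. Two quick checks worth running (standard Windows locations; nothing here is specific to the poster's setup):

        :: look for unexpected entries for google.com or vkontakte.ru
        type %SystemRoot%\System32\drivers\etc\hosts

        :: clear any cached poisoned lookups
        ipconfig /flushdns

    Chrome normally shares IE's system proxy settings, so if those are clean, a rogue Chrome extension is another candidate worth ruling out.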

  • Photoshop - How to create master/detail pages?

    - by worchyld
    I am skinning a website and other things in Photoshop, but the PSD is taking a very long time to load and save, so I want to split it into several files. I've done this, but sometimes I need to adjust one thing and then have to change every PSD file to keep the entire project consistent. Ideally, I'd like to create a master/detail (or template) structure in Photoshop. What is the best way to achieve this? Thanks

  • Chrome "New Tab Page" not showing most frequently visited pages

    - by Ian Boyd
    Google Chrome's New Tab Page is not populating the most-frequently-visited sites grid with anything; it's been sitting empty for months, and I even tried clicking "Restore all removed thumbnails". My work machine populates it fine. Edit: Google Chrome version 4.0.249.89, updated just now; the previous build was .78. Edit 2 (possibly related): Chrome is not storing any history.
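
    Since Edit 2 notes that no history is being recorded at all, a plausible culprit (an assumption, not a confirmed diagnosis) is a corrupted History database in the profile: the most-visited grid is built from it, and Chrome rebuilds the file if it is missing. With Chrome fully closed, rename it rather than delete it:

        :: default profile location on Vista/7; adjust for XP
        cd "%LOCALAPPDATA%\Google\Chrome\User Data\Default"
        ren History History.bak

    If the grid and history start working after a restart, the old file was the problem; if not, restore the backup.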

  • PHP on IIS7 not showing pages

    - by Jeff
    I have a PHP website on a Windows 7 machine I'm working with, and it cannot be viewed in any browser: IE, Chrome, or Firefox. When navigating to the root of the website (default index.php), the browser reports it cannot find the address. It's not a 404 error from the web server; it's as if the name cannot be resolved. Other websites in the same default web application that are also PHP work perfectly. I've aligned all folder permissions and everything else, but this has me stumped. I even went as far as to create a new folder and throw in a test phpinfo() page, and it worked; then I copied this website's content into the new folder and it cannot find the index.php page. I've checked every setting I know and can't find what I'm missing. Has anyone else encountered this issue, and do you remember the fix for it?

  • the right options to traverse/download the pages/directories of a subdomain

    - by Lorraine Bernard
    Suppose a site exists with the following directories (subdomain):

        index.php
        |-sub1
        |  |-index.php
        |  |-sub1sub1
        |  |  |-index.php
        |  |  |-other.php
        |  |  |-sub1sub1sub1
        |-sub2
        |  |-index.php
        |  |- ....
        |-sub3
        |  |- ...

    My questions are: 1) how can I properly display the site of the sub1 subdomain (http://domain/sub1) locally? 2) how can I get just the files and directories that are children of sub1 (sub1sub1 and sub1sub1sub1, for example)? I tried the following wget options, but it also retrieves the files and directories in sub2, sub3, etc.:

        wget -E -H -k -K -r http://domain/sub1/index.php
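
    For the second question, the flag that addresses exactly this is --no-parent (-np), which stops wget from ascending above the starting directory; -H should usually be dropped too, since spanning hosts drags in everything the pages link to. A sketch (the flags are standard wget; the exact set to keep depends on how the local copy will be viewed):

        # mirror http://domain/sub1/ and its children only, for local browsing
        wget -r -np -k -E -p http://domain/sub1/

    -k rewrites links for local viewing (which also addresses question 1), -E adds .html extensions to dynamic pages, and -p pulls page requisites such as CSS and images so the saved pages display properly.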

  • Fonts on Certain Web Pages jagged and blocky on XP yet fine in Windows7

    - by Peter Nimmo
    This site, for example: http://www.cultbox.co.uk/reviews/episodes/778-twenty-twelve-episode-3-review, which has this CSS (http://www.cultbox.co.uk/style.css):

        body {
            text-align: center;
            font-size: 1em;
            font-family: Helvetica Neue,Helvetica,Arial,sans-serif;
            color: #000000;
            background: #D3D8E1;
        }

    I think I first saw this problem on Tech Republic. Also, is it possible to find out which font the browser chose to render in?
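
    Two separate issues here. The jaggedness on XP is usually just ClearType being off by default (Display Properties > Appearance > Effects), whereas Windows 7 ships with it enabled; with this font stack, XP typically lacks Helvetica Neue and Helvetica and falls back to Arial, the same face Windows 7 ends up with, which points at anti-aliasing rather than font choice. As for finding the font the browser chose: there is no exact API, but the computed style is a close approximation (run in a browser console; the first installed family in the list is, in practice, what renders):

        // reports the computed font-family list for the body element
        window.getComputedStyle(document.body, null).fontFamily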

  • Tool to organize and arrange various monitoring pages?

    - by PhilAG
    We recently added a Mac Pro with eight (yes, 8) monitors on it. We have various tools for monitoring our website (Chartbeat, Nagios, internal statistics, Jenkins, Smartfox, etc.), and they currently float free in various browser windows across the screens. I'd like a better way to organize them into a more fixed layout so that (a) we can't accidentally close a window, (b) some refresh automatically (currently done through browser plugins), and so on. Any suggestions?
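
    One low-tech arrangement that covers (a) and (b) without new software (a sketch with placeholder URLs and positions, and note that Chrome's kiosk mode has historically been less reliable on OS X than elsewhere): launch one kiosk-mode Chrome instance per screen, each with its own profile so the windows stay independent. Kiosk windows have no tab strip or close button, and each dashboard's own auto-refresh keeps it current.

        #!/bin/sh
        # one isolated profile per screen; positions assume 1920px-wide monitors
        open -na "Google Chrome" --args --user-data-dir="$HOME/kiosk1" \
            --kiosk --window-position=0,0 "http://nagios.example.com/"
        open -na "Google Chrome" --args --user-data-dir="$HOME/kiosk2" \
            --kiosk --window-position=1920,0 "http://jenkins.example.com/"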

  • Redirecting a single request to another page, ignoring the www subdomain

    - by Petter Brodin
    I have a site running on IIS 7.5 that automatically redirects 'http://mysite.com/whatever.aspx' to 'http://www.mysite.com/whatever.aspx'. The site gets a lot of traffic to an old URL that I want to redirect to the front page, index.aspx: 'http://mysite.com/foo/bar/index.cgi%something=asdf&somethingelse=qwerty'. The problem is that no matter what I try, I can only get the redirect to work with the www subdomain; if I use the URL without www, I just end up at 'http://www.mysite.com/404.aspx'. Any ideas? Thanks in advance for all help! Edit: here's some further info: using this article I've managed to redirect from 'http://mysite.com/foo/bar/index.cgi' to 'http://www.mysite.com/index.aspx', but if I add the query-string parameters it still redirects to 'http://www.mysite.com/404.aspx'. Isn't there a way to catch all requests to the cgi file, including query-string parameters? Edit 2: disregard Edit 1; it doesn't seem to be working after all. Edit 3: it seems the browser caching the redirect response was messing with me, so Edit 2 is wrong; see my response below.
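
    For reference, the general shape of an IIS URL Rewrite rule that catches the old URL together with its query string; everything here (rule name, pattern, target) is illustrative rather than taken from the poster's config, and the rule must sit above the www-canonicalization rule in web.config so it runs first:

        <rule name="Old cgi redirect" stopProcessing="true">
          <match url="^foo/bar/index\.cgi$" />
          <conditions>
            <add input="{QUERY_STRING}" pattern="something=asdf" />
          </conditions>
          <action type="Redirect" url="http://www.mysite.com/index.aspx"
                  appendQueryString="false" redirectType="Permanent" />
        </rule>

    The two details that most often bite are rule order (a canonicalization rule that runs first swallows the request) and browsers caching 301 responses, which Edit 3 above already discovered the hard way.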

  • Nginx + Wordpress Multisite 3.4.2 + subdirectories + static pages and permalinks

    - by UrkoM
    I am trying to set up Wordpress Multisite 3.4.2, using subdirectories, with Nginx, php5-fpm, APC, and Batcache. Like many other people, I am getting stuck on the rewrite rules for permalinks. I have followed these two guides, which seem to be as official as you can get: http://evansolomon.me/notes/faster-wordpress-multisite-nginx-batcache/ and http://codex.wordpress.org/Nginx#WordPress_Multisite_Subdirectory_rules. It is partially working: http://blog.ssis.edu.vn works, and http://blog.ssis.edu.vn/umasse/ works. But other permalinks, like these two to a post and to a static page, don't work: http://blog.ssis.edu.vn/umasse/2008/12/12/hello-world-2/ and http://blog.ssis.edu.vn/umasse/sample-page/. They either take you to a 404 error, or to some other blog! Here is my configuration:

        server {
            listen 80 default_server;
            server_name blog.ssis.edu.vn;
            root /var/www;
            access_log /var/log/nginx/blog-access.log;
            error_log /var/log/nginx/blog-error.log;

            location / {
                index index.php;
                try_files $uri $uri/ /index.php?$args;
            }

            # Add trailing slash to */wp-admin requests.
            rewrite /wp-admin$ $scheme://$host$uri/ permanent;

            # Add trailing slash to */username requests
            rewrite ^/[_0-9a-zA-Z-]+$ $scheme://$host$uri/ permanent;

            # Directives to send expires headers and turn off 404 error logging.
            location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
                expires 24h;
                log_not_found off;
            }

            # this prevents hidden files (beginning with a period) from being served
            location ~ /\. {
                access_log off;
                log_not_found off;
                deny all;
            }

            # Pass uploaded files to wp-includes/ms-files.php.
            rewrite /files/$ /index.php last;
            if ($uri !~ wp-content/plugins) {
                rewrite /files/(.+)$ /wp-includes/ms-files.php?file=$1 last;
            }

            # Rewrite multisite '.../wp-.*' and '.../*.php'.
            if (!-e $request_filename) {
                rewrite ^/[_0-9a-zA-Z-]+(/wp-.*) $1 last;
                rewrite ^/[_0-9a-zA-Z-]+.*(/wp-admin/.*\.php)$ $1 last;
                rewrite ^/[_0-9a-zA-Z-]+(/.*\.php)$ $1 last;
            }

            location ~ \.php$ {
                # Forbid PHP on upload dirs
                if ($uri ~ "uploads") {
                    return 403;
                }
                client_max_body_size 25M;
                try_files $uri =404;
                fastcgi_pass unix:/var/run/php5-fpm.sock;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include /etc/nginx/fastcgi_params;
            }
        }

    Any ideas are welcome! Have I done something wrong? I have disabled Batcache to see if it makes any difference, but still no go.

  • Tool to annotate online pages

    - by doug
    Hi there. When I read something, I need to underline what I like or what I think is important, and to take notes about it right next to the relevant paragraph. Does anyone know a Firefox addon, or something else (another browser, any other application), that provides this kind of functionality? PS: I've tried some research tools such as Zotero, but that is not what I'm looking for.

  • Redirecting pages from the root folder to a subfolder

    - by MarcoPRT
    I have a Joomla site in the root directory of my domain and a forum in the /forum subdirectory. How can I redirect visitors from the main site to the forum, while keeping the main site reachable through a link on the forum? Example: http://example.com redirects to http://example.com/forum, but I can still access the main site via the link http://example.com/index.php
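
    Assuming the site runs on Apache with mod_rewrite available (typical for a Joomla host), a minimal .htaccess sketch in the document root that redirects only a bare request for the root, so /index.php and every other Joomla URL stay reachable:

        RewriteEngine On
        # redirect only the exact site root
        RewriteRule ^$ /forum/ [R=302,L]

    Links on the forum can then point at http://example.com/index.php directly. R=302 is deliberate while testing; switch to R=301 once it behaves, because browsers cache permanent redirects aggressively.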

  • What causes PHP pages to consistently download instead of running normally

    - by Jonathan
    Hi, I'm running an Ubuntu Server on a VM to test out different web forum solutions. I have set up ~/public_html/ to be accessible through the apache2 web server, and that works fine. However, when I open a .php file in a browser (using my VM's IP address/~username/phpfile.php), it is not executed as it should be; instead, the browser offers to save the file or asks what program to open it with. Interestingly, that dialog box does recognise that it is a PHP file. I have the following version of PHP installed on the system:

        PHP 5.3.2-1ubuntu4.5 with Suhosin-Patch (cli) (built: Sep 17 2010 13:49:46)
        Copyright (c) 1997-2009 The PHP Group
        Zend Engine v2.3.0, Copyright (c) 1998-2010 Zend Technologies

    And the following server:

        Server version: Apache/2.2.14 (Ubuntu)
        Server built:   Nov 18 2010 21:19:09

    If anyone knows what might be causing this or potential solutions, it would make me very happy :) EDIT: It turns out this behaviour only appears for files in the ~/public_html/ directory; all PHP files in /var/www/ work fine. Prizes go to whoever can explain why :D (and by prizes I just mean a "well done", no actual prizes I'm afraid).
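
    A likely explanation, assuming the stock Ubuntu packages of that era: the PHP module ships with PHP deliberately disabled for mod_userdir directories, which matches /var/www/ working while ~/public_html/ serves the source. The block lives in /etc/apache2/mods-available/php5.conf and looks roughly like this; commenting out the inner Directory section and reloading Apache re-enables PHP for user directories:

        <IfModule mod_userdir.c>
            <Directory /home/*/public_html>
                php_admin_value engine Off
            </Directory>
        </IfModule>

    After editing: sudo service apache2 reload. The restriction exists because user directories are less trusted than the system docroot, so weigh the security trade-off before switching it off.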

  • duplicate cache pages: Varnish

    - by Sukhjinder Singh
    Recently we configured Varnish on our server. It was set up successfully, but we noticed that if we open any page in multiple browsers, Varnish sends a request to Apache whether the page is cached or not, and if we refresh twice in each browser it creates duplicate copies of the same page. What should happen instead: if a page is cached by Varnish, subsequent requests should be served from Varnish itself, whether we open the same page in another browser or from a different IP address. Following is my default.vcl file:

        backend default {
            .host = "127.0.0.1";
            .port = "80";
        }

        sub vcl_recv {
            if (req.url ~ "^/search/.*$") {
            } else {
                set req.url = regsub(req.url, "\?.*", "");
            }
            if (req.restarts == 0) {
                if (req.http.x-forwarded-for) {
                    set req.http.X-Forwarded-For = req.http.X-Forwarded-For + ", " + client.ip;
                } else {
                    set req.http.X-Forwarded-For = client.ip;
                }
            }
            if (!req.backend.healthy) {
                unset req.http.Cookie;
            }
            set req.grace = 6h;
            if (req.url ~ "^/status\.php$" ||
                req.url ~ "^/update\.php$" ||
                req.url ~ "^/admin$" ||
                req.url ~ "^/admin/.*$" ||
                req.url ~ "^/flag/.*$" ||
                req.url ~ "^.*/ajax/.*$" ||
                req.url ~ "^.*/ahah/.*$") {
                return (pass);
            }
            if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") {
                unset req.http.Cookie;
            }
            if (req.http.Cookie) {
                set req.http.Cookie = ";" + req.http.Cookie;
                set req.http.Cookie = regsuball(req.http.Cookie, "; +", ";");
                set req.http.Cookie = regsuball(req.http.Cookie, ";(SESS[a-z0-9]+|SSESS[a-z0-9]+|NO_CACHE)=", "; \1=");
                set req.http.Cookie = regsuball(req.http.Cookie, ";[^ ][^;]*", "");
                set req.http.Cookie = regsuball(req.http.Cookie, "^[; ]+|[; ]+$", "");
                if (req.http.Cookie == "") {
                    unset req.http.Cookie;
                } else {
                    return (pass);
                }
            }
            if (req.request != "GET" && req.request != "HEAD" &&
                req.request != "PUT" && req.request != "POST" &&
                req.request != "TRACE" && req.request != "OPTIONS" &&
                req.request != "DELETE") {
                /* Non-RFC2616 or CONNECT which is weird. */
                return (pipe);
            }
            if (req.request != "GET" && req.request != "HEAD") {
                return (pass);
            }
            if (req.http.Accept-Encoding) {
                if (req.url ~ "\.(jpg|png|gif|gz|tgz|bz2|tbz|mp3|ogg)$") {
                    # No point in compressing these
                    remove req.http.Accept-Encoding;
                } else if (req.http.Accept-Encoding ~ "gzip") {
                    set req.http.Accept-Encoding = "gzip";
                } else if (req.http.Accept-Encoding ~ "deflate") {
                    set req.http.Accept-Encoding = "deflate";
                } else {
                    # unknown algorithm
                    remove req.http.Accept-Encoding;
                }
            }
            return (lookup);
        }

        sub vcl_deliver {
            if (obj.hits > 0) {
                set resp.http.X-Varnish-Cache = "HIT";
            } else {
                set resp.http.X-Varnish-Cache = "MISS";
            }
        }

        sub vcl_fetch {
            if (beresp.status == 404 || beresp.status == 301 || beresp.status == 500) {
                set beresp.ttl = 10m;
            }
            if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") {
                unset beresp.http.set-cookie;
            }
            set beresp.grace = 6h;
        }

        sub vcl_hash {
            hash_data(req.url);
            if (req.http.host) {
                hash_data(req.http.host);
            } else {
                hash_data(server.ip);
            }
            return (hash);
        }

        sub vcl_pipe {
            set req.http.connection = "close";
        }

        sub vcl_hit {
            if (req.request == "PURGE") {
                ban_url(req.url);
                error 200 "Purged";
            }
            if (!obj.ttl > 0s) {
                return (pass);
            }
        }

        sub vcl_miss {
            if (req.request == "PURGE") {
                error 200 "Not in cache";
            }
        }
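
    A hedged way to narrow this down, using the X-Varnish-Cache header that the vcl_deliver above already sets: fetch the same page twice with identical headers and see whether the second response reports a hit.

        # second request should say HIT if the object was cached
        curl -sI http://www.example.com/some/page | grep -i x-varnish-cache
        curl -sI http://www.example.com/some/page | grep -i x-varnish-cache

    If curl gets a HIT while browsers do not, the browsers are most likely sending headers that create separate cache objects per browser: differing cookies (anything surviving the cookie stripping in vcl_recv forces a pass) or a backend response carrying Vary on something other than Accept-Encoding. Inspecting the response's Vary and Set-Cookie headers is the natural next step.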

  • Best way to use mod_rewrite to replace WordPress pages with static files

    - by David Moles
    Here's the situation: I've got an old WordPress installation that I'd like to archive as static files, but I'd also like to preserve the old URLs. I've already created the static archive with wget and sorted out the filenames and links. Now I'd like to configure Apache to intercept requests for the old dynamic URL and replace them with the new static one, e.g.:

        http://www.example.org/log/?p=1234
        http://www.example.org/log/index.php?p=1234

    should redirect to

        http://www.example.org/log/archives/1234.html

    I've tried adding the following to the VirtualHost config for example.org, but to no effect; I just get the PHP page.

        RewriteCond %{REQUEST_URI} /log/
        RewriteCond %{QUERY_STRING} p=([^&;]*)
        RewriteRule ^/$ http://%{SERVER_NAME}/log/archives/%1.html [R,L]

    I've enabled logging and I can see what look like other rules being applied, but not this one. None of my other guesses at match patterns for %{REQUEST_URI} seem to have any effect either (log, log/, log.*, even .*). I'm new to mod_rewrite and this is mostly cargo cult, so I'm pretty sure I've gotten it wrong. Anyone know what I should be doing here?
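
    The likely bug is in the rule pattern rather than the conditions: in VirtualHost context, RewriteRule matches against the full URL path including the leading slash, so ^/$ only ever matches a request for / itself, never /log/ or /log/index.php. A sketch of the same redirect with the path moved into the rule pattern (the trailing ? on the target strips the original query string):

        RewriteCond %{QUERY_STRING} (?:^|&)p=([^&;]+)
        RewriteRule ^/log/(?:index\.php)?$ http://%{SERVER_NAME}/log/archives/%1.html? [R=301,L]

    The RewriteCond on %{REQUEST_URI} becomes unnecessary once the rule pattern itself matches the /log/ part.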

  • Httpd and LDAP Authentication not working for sub-pages

    - by DavisTasar
    I just recently installed a Nagios implementation, and I'm trying to get LDAP authentication working for httpd on Red Hat. (nagios.conf for Apache below, sanitized of course.)

        ScriptAlias /nagios/cgi-bin "/usr/local/nagios/sbin"
        <Directory "/usr/local/nagios/sbin">
            #SSLRequireSSL
            Options ExecCGI
            AllowOverride none
            AuthType Basic
            AuthName "LDAP Authentication"
            AuthLDAPURL "ldap://my.domain.controller:389/OU=Users,DC=my,DC=domain,DC=controller?sAMAccountName?sub?(objectClass=user)" NONE
            AuthzLDAPAuthoritative off
            AuthLDAPBindDN "CN=NagiosAdmin,DC=my,DC=domain,DC=controller"
            AuthLDAPBindPassword "myPassword"
            require valid-user
        </Directory>

        Alias /nagios "/usr/local/nagios/share"
        <Directory /usr/local/nagios/share>
            #SSLRequireSSL
            Options None
            AllowOverride none
            AuthBasicProvider ldap
            AuthType Basic
            AuthName "LDAP Authentication"
            AuthzLDAPAuthoritative off
            AuthLDAPURL "ldap://my.domain.controller:389/OU=Users,DC=my,DC=domain,DC=controller?sAMAccountName?sub?(objectClass=user)" NONE
            AuthLDAPBindDN "CN=NagiosAdmin,DC=my,DC=domain,DC=controller"
            AuthLDAPBindPassword "myPassword"
            require valid-user
        </Directory>

    Now, the initial authentication works, so when you first hit the page you can log in just fine. However, when you go anywhere else, it prompts for authentication, fails (asking to re-prompt), and gives this error message:

        [Mon Oct 21 14:46:23 2013] [error] [client 172.28.9.30] access to /nagios/cgi-bin/statusmap.cgi failed, reason: verification of user id '<myuseraccount>' not configured, referer: http://<nagiosserver>/nagios/side.php

    I'm almost certain it's a simple flag or option, but I just can't find it, and I don't have a lot of experience working with Apache. Any assistance would be greatly appreciated.
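
    One thing stands out in the config itself, and it matches the "verification of user id ... not configured" error: the first <Directory> block (the sbin/cgi-bin one that serves statusmap.cgi) never declares AuthBasicProvider ldap, while the second block does. Without it, mod_auth_basic falls back to the default file provider, which has no user file configured here, so verification fails for exactly that directory. A one-line addition to the first block is worth trying before anything else:

        <Directory "/usr/local/nagios/sbin">
            ...
            AuthType Basic
            AuthBasicProvider ldap
            ...
        </Directory>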

  • Unauthorized access error to html pages in IIS 7.0

    - by George2
    I am using VSTS 2008 + C# + .Net 3.5 + IIS 7.0. I have created a new web site and put an HTML file into its directory, but when I use the Browse function in IIS Manager to open the file, I get the error below. Any ideas what is wrong? BTW: I am very confused by the unauthorized error, since I run the worker process under an administrator account. From the error message, why is the logon method Anonymous and not the administrator account?

        HTTP Error 401.3 - Unauthorized
        You do not have permission to view this directory or page because of the access control list (ACL)
        configuration or encryption settings for this resource on the Web server.

        Module          IIS Web Core
        Notification    AuthenticateRequest
        Handler         StaticFile
        Error Code      0x80070005
        Requested URL   http://localhost:80/a.html
        Physical Path   C:\test\simplehosttest\a.html
        Logon Method    Anonymous
        Logon User      Anonymous

    thanks in advance, George
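
    The short answer to the confusion in the question: with Anonymous Authentication enabled, IIS 7 does not access static files as the worker-process account; it impersonates the anonymous user (IUSR by default, unless the anonymous identity is changed to the application pool identity). So the ACL on the content folder has to grant read access to IUSR (or the IIS_IUSRS group), regardless of what the worker process runs as. A sketch from an elevated prompt:

        icacls C:\test\simplehosttest /grant "IUSR:(OI)(CI)RX"

    (OI)(CI) makes the grant inherit to files and subfolders; RX is read-and-execute, which is enough for serving static content.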

  • Dns works, can ping, but cannot load web pages in browser

    - by user1224595
    Yesterday I changed routers and my desktop computer started acting up. I can ping websites, and nslookup resolves names to addresses, but neither Chrome, Firefox, nor IE can load any web pages. None of my other computers connected to the same wireless router have any problems. I connect the desktop to the router through a cheap wifi dongle. I did a Wireshark capture of the browser request, and I have uploaded the pcap here: https://drive.google.com/file/d/0B7AsPdhWc-SwbTV0bUJLQXo4UUE/edit?usp=sharing. One strange thing I noticed was the spamming of SSDP packets. I am not super familiar with networking, but it seems it's not a problem with the router, since DNS works and so does DHCP (the desktop is assigned an address correctly). Any help would be appreciated.
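
    Since ICMP and DNS work but TCP connections from browsers fail on this one machine only, one standard Windows troubleshooting step worth trying before anything exotic (not a guaranteed fix, and note that SSDP chatter is usually just normal UPnP discovery): reset the Winsock catalog and TCP/IP stack, which clears out broken LSPs left behind by network or security software, then reboot.

        netsh winsock reset
        netsh int ip reset
        ipconfig /flushdns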

  • Route global shortcuts to pages opened in Firefox

    - by zamza
    I like listening to music online on sites like stereomood.com, but there is a major problem: I cannot control the player with my keyboard. Even with a mouse, when I want to play/pause I must activate the Firefox window, select the tab where the music is playing, and hit the play/pause button manually. This is a pain, especially when playing a fullscreen game that cannot minimize itself. Global keyboard shortcuts would therefore be a perfect solution. I understand that different online media players have different controls and each site must be configured individually (like: select the button with id 'play' and press it), but I believe it can be done in principle. I also guess such tricks are impossible without some third-party native app which captures the shortcuts and routes them to the Firefox window. So, any solutions? Maybe some AutoHotkey hacks or similar?
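
    Along the AutoHotkey line the question already suggests, a minimal sketch; the Ctrl+Alt+P binding is arbitrary, and it assumes both that the player tab is the active tab in Firefox and that the site toggles playback on the Space key, so per-site tweaking is unavoidable:

        ; play-pause.ahk -- send Space to Firefox without focusing its window
        ^!p::                                       ; global hotkey: Ctrl+Alt+P
            ControlSend,, {Space}, ahk_class MozillaWindowClass
        return

    ControlSend delivers the keystroke while the game keeps focus. Selecting the right tab automatically is the harder half; a small site-specific userscript that binds a page-level key to the player's play button can handle that end.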

  • How can I remove unwanted cropped pages from Acrobat

    - by Servant
    Executing the Crop command in Acrobat on a 3000pt x 2000pt document, cropping it to 1500pt x 1800pt, only hides the content outside the new boundaries; the original document remains unchanged. If anyone uses the touch-up tool and moves the content, all "hidden" information outside the cropped page can reappear by dragging it into view: the page acts as a window (or mask) onto the original 3000pt x 2000pt canvas. Is there a way to crop the document permanently, without re-printing it to a PDF file? Please find pictures attached: http://i.stack.imgur.com/5JTPg.png and http://i.stack.imgur.com/HPokv.png
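
    One route that avoids re-printing (heavily hedged: whether the hidden content is actually discarded varies with the Ghostscript version and with how the PDF was produced, so verify the result with the touch-up tool afterwards): re-distil the file through Ghostscript with the crop box applied as the page size.

        gs -o cropped.pdf -sDEVICE=pdfwrite -dUseCropBox input.pdf

    In many files the rewritten content streams no longer carry the clipped-away objects, but this is not guaranteed, so treat it as something to test rather than a known fix.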

  • Good speedtest results, but web pages don't load

    - by dmt0
    I have strange connection problems. Ping and download times are good: speedtest.net showed a ping of 65 ms and a download of 2.17 Mbps, and torrents work well, giving me up to 300 kBps. Web pages load very poorly, though. They time out every time, and I have to refresh 4-5 times to get even a simple page to load. It has been happening consistently for the last few days, the same with different browsers on different machines (same network), Windows and Linux. There is no proxy configured in the browser. Is there any setting in Windows or in a browser that I can change to help this? Some background: I live on an island in Thailand, where the internet connection goes by radio to another island and then to the mainland; it's very weather dependent, but generally OK. As I mentioned, ping is good. Any input is very appreciated.
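
    Given the radio hop, one hedged guess is an MTU/fragmentation problem: small packets (pings, DNS lookups) get through while full-size TCP segments are silently dropped on some paths, which produces exactly this "ping fine, pages time out" pattern. The largest unfragmented payload can be probed from either machine:

        # Linux: forbid fragmentation; 1472 = 1500 minus 28 bytes of IP/ICMP headers
        ping -M do -s 1472 google.com

        # Windows equivalent
        ping -f -l 1472 google.com

    If that fails, step the size down until it succeeds, then set the interface MTU to the working payload plus 28 (on Windows: netsh interface ipv4 set subinterface "Local Area Connection" mtu=1400 store=persistent; on Linux: ip link set dev eth0 mtu 1400).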
