Search Results

Search found 9410 results on 377 pages for 'special folders'.


  • Is there a way to customize items on the right side of the Start menu in Windows 7?

    - by Basara
    OK, I know I can right-click on it and select Properties... but that only lets me decide whether an item is shown or hidden. What I want is to adjust an item's position on the right side of the Start menu. More specifically, I want to put the item "Games" in the same group as the personal folders. It makes no sense at all to put "Games" in the same group as "Computer" and "Network". Or is there a registry tweak that can make this possible? Thanks.


  • Upgraded to Mountain Lion, now 127.0.0.1 is not resolving

    - by Shanimal
    I used to be able to type 127.0.0.1 (or my network IP, 10.10.53.32) and it would resolve to my "default" virtual host. 127.0.0.1/~Shanimal and shanimal.dev both resolve to their appropriate folders, but localhost and 127.0.0.1 give me a 404: "Not Found. The requested URL / was not found on this server." Basically, my "It works!" screen no longer works.

    /private/etc/apache2/Shanimal.conf:

        <Directory "/Users/Shanimal/Sites/_www">
            Options Indexes Multiviews
            AllowOverride AuthConfig Limit
            Order allow,deny
            Allow from all
        </Directory>

    hosts:

        127.0.0.1 localhost
        127.0.0.1 shanimal.dev
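
    A likely culprit (an assumption based on the symptoms, not something the post states): OS X upgrades are known to replace /etc/apache2/httpd.conf, which can reset the default DocumentRoot that serves the "It works!" page. A quick way to check, as a sketch:

        # see whether the upgrade left a backup of the old config behind
        ls /etc/apache2/httpd.conf*
        grep -n DocumentRoot /etc/apache2/httpd.conf
        # verify the config parses, then restart Apache
        apachectl configtest && sudo apachectl restart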


  • How does Google search for content? [closed]

    - by Akito
    I am trying to understand how Google searches for content within a page. When we search, it displays relevant results with keywords in the title or other important places. What astonishes me is how they grab the starting area of the text: they show a small snippet of text with each search result. How do they manage that, given there is nothing special in a web page that tells a Google bot where the actual content starts? Please help me out. Thanks


  • Mac OS X multi-user thin client server (terminal server)?

    - by username
    Is there any solution out there to turn a Mac into a true multi-user thin-client server? I'd like to set up a few cheap PCs with access to a couple of accounts using something like VNC, but it isn't economical to buy a new server for each user, or a new license for a virtualized OS X Server for each user. I'm fully aware that OS X Server lets you set up users with "network home folders," and I know there's also VNC built into Mac OS X. Neither of these fits the bill (the former requires a thick client, and the latter is single-user only). UPDATE: yay, Lion! http://www.9to5mac.com/54102/10-7-lion-allows-multi-user-remote-computing


  • Best archive format & tool for large amounts of data (50gb+)

    - by marcusstarnes
    I only realised this afternoon that the ZIP format has a limit of what appears to be around 20GB. I am trying to automate an archive process (using Automate) to zip/rar/whatever a collection of folders/files on one of my disks, and it always appeared to bomb out with an incomplete archive at about 20GB. So I tried using WinRAR and doing it manually as a ZIP file, but it told me of the limit. So, I was wondering: what is a recommended archive format (and tool for accomplishing the task) for archiving a large amount of data (around 50GB)? Thanks
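
    For illustration, one commonly suggested route - a sketch assuming the 7-Zip command-line tool is installed and on the PATH, with hypothetical paths: the 7z format has no practical size limit for archives this big, and 7-Zip can also produce split volumes for easier handling.

        # create a 7z archive of a folder tree
        7z a D:\backups\photos.7z C:\data\photos
        # or split the archive into 4 GB volumes
        7z a -v4g D:\backups\photos.7z C:\data\photos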


  • How to "get" a reliable parallel port on a laptop without a PCMCIA slot?

    - by ldigas
    USB-to-parallel-port connections (cables) for an old but reliable dot-matrix printer that has its special use are unreliable. They sometimes work, sometimes don't, and since I installed Windows 7 I can't get either of my old ones to work properly. PCMCIA is usually considered (and is) a much more reliable solution, but unfortunately none of my new laptops has a PCMCIA slot. So, all ideas are welcome. What should I do? I'm open to all suggestions, as long as you have some experience that they work more reliably than USB-parallel cables and their wicked drivers.


  • Why can't my Perl script in ~/bin find relative file paths?

    - by sid_com
        #!/usr/bin/env perl
        use warnings;
        use strict;
        use XML::LibXML;

        my $parser = XML::LibXML->new;
        my $file   = './example.xml';
        my $doc    = $parser->parse_file( $file );
        print ref( $doc ), "\n";

    When I move this script and the example.xml file to /home/me/, the script works. When I move the script and the example.xml file to /home/me/bin/, the script doesn't find the example.xml file. Is this some special feature of the bin directory?
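
    The usual explanation (standard behavior, not something stated in the post): a relative path like ./example.xml is resolved against the process's current working directory, not against the directory the script lives in. With the script in ~/bin and invoked via $PATH from elsewhere, ./example.xml points at wherever you happen to be. A quick shell demonstration, with hypothetical locations:

        cd /home/me && script.pl    # works only if /home/me/example.xml exists
        cd /tmp && script.pl        # fails: looks for /tmp/example.xml

    The usual fix is to build the path from the script's own location, e.g. with Perl's core FindBin module.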


  • Automate hashing for each file in a folder?

    - by Kennie R.
    I have quite a few FTP folders, and I add a few more each month. I prefer to leave some method of verifying their integrity, for example files MD5SUMS, SHA256SUMS, ... which I could create using a script. Take for example:

        find ./ -type f -exec md5sum {} \;

    This works fine, but when I run it again for each SHAxxx sum afterwards, it includes a sum of the MD5SUMS file itself, which is really not wanted. Is there a simpler way, or script, or common way of hashing all the files into their sums file without causing problems like that? I could really use a better option.
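
    One way to sketch this (assuming GNU findutils and coreutils; the checksum file names are just a convention): exclude the checksum files themselves from the find results, so later runs never hash earlier output files.

        # hash every file except the checksum files themselves
        find . -type f ! -name MD5SUMS ! -name SHA256SUMS -exec md5sum {} + > MD5SUMS
        find . -type f ! -name MD5SUMS ! -name SHA256SUMS -exec sha256sum {} + > SHA256SUMS
        # later, verify everything in one go
        md5sum -c MD5SUMS && sha256sum -c SHA256SUMS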


  • How do I run perfmon against a remote machine?

    - by WindyCityEagle
    For starters, I'm trying to do this from a Windows 7 box to a Windows 7 box. I know the simple answer should be to enter \\computername where it says "Select counters from computer". The trouble is, every time I do that I get an error message that says "Unable to connect to machine". I know the machine is out there and accessible, because I can bring up Windows Explorer, enter \\computername\c$ in the address bar, and browse its C: drive. So the machine is out there. Are there special permissions for perfmon? Is there a service which needs to be running on the remote machine?
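
    A hedged guess at the usual culprit (not confirmed by the post): remote performance counters rely on the Remote Registry service on the target machine, so checking and starting that service is a common first step:

        sc \\computername query RemoteRegistry
        sc \\computername start RemoteRegistry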


  • Use test to check for condition with find and execdir option

    - by slosd
    I think I can keep my question short. Why does the following command produce no output?

        find /usr/share/themes -mindepth 1 -maxdepth 1 -type d -execdir test -d {}/gnome-shell \;

    I expected it to print all folders in /usr/share/themes that contain a folder gnome-shell. Several websites suggest that this usage of test as a command in -exec/-execdir is possible. From man find:

        -exec command ;
            Execute command; true if 0 status is returned. [...]
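
    The likely answer, for what it's worth: test never prints anything; it only sets an exit status, which find treats as a predicate. An explicit action such as -print after the -execdir predicate makes find print the directories for which test succeeded:

        find /usr/share/themes -mindepth 1 -maxdepth 1 -type d \
            -execdir test -d {}/gnome-shell \; -print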


  • Hard drive self monitoring system

    - by Hoorayo
    I have a 500GB HDD in my desktop, with two partitions, C and D. The computer would not start and showed me an error message: "Notice - Hard drive self monitoring system has reported that a parameter has exceeded its normal operating range. Dell recommends that you back up your data regularly. A parameter out of range may or may not indicate a potential hard drive problem." So I took the HDD out of the desktop and made it a USB external HDD. My laptop recognizes the hard drive as an "I" drive and a "J" drive. I am able to click "J" and see folders and files, but there is no response when I click "I", while it makes a weird clicking sound. Can anyone explain why the I drive doesn't work while the J drive works on the same physical hard drive? Is there any way I can fix it?


  • Shortcut with arguments in Debian

    - by Duncan
    I have a volume on a Debian server which contains a large number of images at full resolution, in various folders. What I'd like to do is have a separate browse-proxy folder containing lower-quality browse copies of these, to let users access them for viewing over lower-speed dial-in accounts. Ideally these would be created on the fly using ImageMagick, so there isn't the need to store the large number of browse copies full time and worry about keeping them up to date, etc. The way I'd envisaged this happening is that the browse-proxy folder contains a duplicate file and folder structure, but with symlinks pointing to a script that transforms them, with the file path as an argument. Except I know this isn't possible with symlinks, so I am wondering if there's another way of doing this on Linux. On Windows, shortcuts can take arguments, and I'm wondering how to do the same on a Linux platform. (Or perhaps I'm going about this the wrong way?)
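
    One workaround worth sketching (an untested idea, with hypothetical names and paths throughout): a symlink cannot carry arguments, but the script it points to can inspect its own invocation name, $0. If every file in the proxy tree is a symlink to a single script, the symlink's path itself becomes the argument:

        #!/bin/sh
        # browse.sh: every symlink under /browse points at this script; $0 is
        # the symlink's own path, which mirrors the original image's path
        orig="/data/images/${0#/browse/}"
        # assumes ImageMagick; write a downscaled JPEG to stdout
        exec convert "$orig" -resize '800x800>' jpg:-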


  • Make Chrome always open PDFs itself

    - by jdm
    Hi, I'm looking for a way to make Google Chrome always open PDFs with its internal viewer when I click a link, as opposed to downloading it to the default location. It works with most URLs, but some servers set a special header to force the file to be downloaded ("Content-Disposition: attachment;", e.g. http://www.uni-goettingen.de/en/46260.html). What I want is the opposite of this question: Stop PDFs from displaying inside Google Chrome, or what is asked for here, but applied to Chrome: How to ignore “Content-Disposition: attachment” in Firefox Btw., I'm running Chrome 8.0.552.0 dev on Ubuntu 10.4.


  • What does the NTFS encryption protect against?

    - by Ray
    I have encrypted a folder via Properties > Advanced > "Encrypt contents to secure data". However, when I switch to another user profile, which is also an administrator, the folder seems to be accessible as if nothing happened. What exactly does this encryption protect against? I'm looking to encrypt folders so that no other user, no other OS, and no one who removes the HDD and plugs it into another device can access them. My OS is Windows 7 Ultimate. Any suggestions?


  • What does the red x icon mean next to a user in folder permissions (Windows 7)

    - by Scott Szretter
    In trying to debug various strange issues on a machine, I found something strange: when I go to C:\Users\administrator, open Properties, and look at the Security tab, it lists the users (the local admin account, SYSTEM, and "administrator", which is the domain administrator account). It all looks fine in terms of permissions (full control, etc.) compared to other machines. The one difference is a small red circle with an X to the left of the user icon/name. Additionally, there are various folders under there where it says access denied - for example, My Documents! Even logged in as the local machine administrator account (which is not named administrator), I am unable to change the permissions - it says access denied. Any ideas what this means and how to fix it? I even tried re-joining the machine to the domain.


  • Running JBoss 6 with Runit / daemontools or other process supervision framework

    - by Alex Recarey
    I'm trying to use runit to daemonize JBoss. I use the /opt/jboss-6.1.0.Final/bin/run.sh script to start the server. When I do so from the command line, JBoss does not detach (which is what we want) and shuts down when Ctrl+C is pressed - in theory a perfect candidate for runit. Everything works fine except when I try to get runit to shut down JBoss: when I issue the command sv stop jboss, nothing happens. Runit thinks the process is stopped, but JBoss continues to run normally. I'm not doing anything special with the run script. This is my runit run script:

        #!/bin/sh
        exec 2>&1
        exec /opt/jboss-6.1.0.Final/bin/run.sh -c standard -b 0.0.0.0

    Looking at the jboss_init_redhat.sh script, the start section does mention ./bin/run.sh, but the stop section has the following text:

        JBOSS_CMD_STOP=${JBOSS_CMD_STOP:-"java -classpath $JBOSSCP org.jboss.Shutdown --shutdown"}

    Any ideas of what I could try?
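
    A plausible reading of the symptom (an assumption, not confirmed in the post): run.sh starts the JVM as a child process, so runit's TERM signal only reaches the wrapper shell and the JVM lives on. Two hedged options: prefix the final java invocation inside run.sh with exec so the JVM replaces the wrapper, or give runit a custom control script that performs a clean shutdown instead of sending TERM, e.g.:

        #!/bin/sh
        # /etc/sv/jboss/control/t - hypothetical: runsv runs this instead of
        # sending TERM when the service is told to stop
        exec /opt/jboss-6.1.0.Final/bin/shutdown.sh --shutdown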


  • Apply a rewrite rule to everything except the files (recursively) in a subdirectory?

    - by user784637
    I have an .htaccess file in the root of the website that looks like this:

        RewriteRule ^some-blog-post-title/ http://website/read/flowers/a-new-title-for-this-post/ [R=301,L]
        RewriteRule ^some-blog-post-title2/ http://website/read/flowers/a-new-title-for-this-post2/ [R=301,L]
        <IfModule mod_rewrite.c>
        RewriteEngine On
        ## Redirects for all pages except for files in wp-content to website/read
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_URI} !/wp-content
        RewriteRule ^(.*)$ http://website/read/$1 [L,QSA]
        #RewriteRule ^http://website/read [R=301,L]
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>

    My intent is to redirect people to the new blog post location if they request one of those special blog posts. If that's not the case, they should be redirected to http://website.com/read. Nothing from http://website.com/wp-content/* should be redirected. So far, conditions 1 and 3 are being met. How can I meet condition 2?
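
    A guess at the failure mode (not verified against this site): the %{REQUEST_FILENAME} !-f / !-d conditions exempt every existing file and directory, not just wp-content, and RewriteCond lines only bind to the single RewriteRule that follows them. A sketch of a redirect block that excludes only /wp-content (and /read itself, to avoid a loop):

        RewriteCond %{REQUEST_URI} !^/wp-content/
        RewriteCond %{REQUEST_URI} !^/read/
        RewriteRule ^(.*)$ http://website/read/$1 [R=301,L,QSA]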


  • Enable group policy for everything but the SBS?

    - by Jerry Dodge
    I have created a new group policy to disable IPv6 on all machines. There is only the one default OU, no special configuration. However, this policy should not apply to the SBS itself (nor to the other DC, at another location on a different subnet), because those machines depend on IPv6; all the rest do not. I did see a recommendation to create a new OU and put those machines under it, but many other comments say that is extremely messy and not recommended - it makes maintenance harder when it comes to changing other group policies. How can I apply this single group policy to every machine except the domain controllers? PS - Yes, I understand IPv6 will soon be the new standard, but until then we have no intention of implementing it, and it is in fact causing us many issues when enabled.


  • Need the ability to set configuration options using a single method which will work across multiple server configurations.

    - by JMC Creative
    I'm trying to set post_max_size and upload_max_filesize in a specific directory for a web application I'm building. I've tried the following in a .htaccess file in the script directory (upload.php is the script that needs the special configuration):

        <Files upload.php>
            php_value upload_max_filesize 9998M
            php_value post_max_size 9999M
        </Files>

    That doesn't work at all. I've tried it without the script-name specificity, where the only thing in the .htaccess file is:

        php_value upload_max_filesize 9998M
        php_value post_max_size 9999M

    This works on my PC-based XAMPP server, but throws a "500 Misconfiguration Error" on my production server. I've also tried creating a php.ini file in the directory with:

        post_max_size = 9999M
        upload_max_filesize = 9998M

    But this also doesn't always work. And lastly, using the following in the PHP script doesn't work either, supposedly because the settings have already been compiled by the time the parser reaches the line (?):

        <?php
        ini_set('post_max_size','9999M');
        ini_set('upload_max_filesize','9998M');
        ?>
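
    A hedged note on why the results differ (standard PHP behavior, not taken from the post): php_value directives only work when PHP runs as an Apache module (mod_php) and trigger a 500 error otherwise; a per-directory php.ini is honored only by CGI/FastCGI setups; and ini_set() cannot change these two values at all, since they are PHP_INI_PERDIR settings applied before the script runs. Checking which SAPI each server uses shows which mechanism applies:

        php -i | grep 'Server API'   # CLI view; for the web SAPI, a phpinfo() page is more telling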


  • Sharing files from my Macbook to Windows Desktop

    - by Vahe
    What's the most dependable way to share files and folders from OS X to Windows (I'm using Windows 7)? I've followed this guide: http://support.apple.com/kb/HT1812. However, this method does not seem dependable (it works only some of the time); my MacBook does not always show up under Network on Windows. On certain occasions, turning off Windows file sharing completely (in the OS X preferences), restarting my MacBook, and turning it back on helps. Is there any reason why this is happening? Is there an alternative method?
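
    One common workaround, sketched with a hypothetical address, share, and user name: when browse-list discovery is flaky, connecting to the Mac by IP address tends to be far more dependable than waiting for it to appear under Network:

        net use Z: \\192.168.1.10\Shared /user:macusername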


  • How do I use a custom 503 error page with Nginx?

    - by Michael Gorsuch
    Hi. I have implemented rate limiting with Nginx (which works excellently, by the way) and would like to display a custom 503 error page. I have followed examples on the web without luck. I am running a simple configuration that looks something like this:

        listen x.x.x.x:80;
        server_name something.com;
        root /usr/local/www/something.com;

        error_page 503 /503.html;

        location / {
            limit_req zone=default burst=5 nodelay;
            proxy_pass http://mybackend;
        }

    The idea is that our rate-limited users would be shown a special page explaining what is going on. The rate limiting is working, but the built-in 503 page is rendering. Any ideas?
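
    A sketch of the usual fix, reusing the root from the config above: the request for /503.html itself matches location /, so it gets proxied (and rate-limited) instead of served from disk. Giving the error page its own internal location avoids that:

        error_page 503 /503.html;

        location = /503.html {
            root /usr/local/www/something.com;
            internal;   # reachable only via error_page, not directly
        }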


  • Redirect with iptables if the destination port is not being listened on

    - by PoltoS
    I have a server listening on port 10000, but this server runs only in a special case (when some third service is available); otherwise the port is not listened on. Is it possible to redirect the client to another port if 10000 is not listening? I see two solutions: 1) insert/remove iptables rules on server start/stop - but since the server may be killed, it may not insert the correct iptables redirect rule before dying; 2) make a permanent userspace rule that checks whether the port is being listened on and redirects the packet if not. How do I do 2)? Does someone have recipes for ipq? Maybe someone can suggest a better way? It is something like a fallback redirect: I'll have thousands of clients with different ports (10000-11000), and if their instance of the server is not running, they should be redirected to some page explaining why they don't have an instance connected.
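
    For approach 1, a watchdog that reconciles the rule with reality sidesteps the "killed before cleanup" problem, because the rule no longer depends on the server's own shutdown path. A minimal sketch, assuming iptables and ss are available; the fallback port 8080 is hypothetical:

        #!/bin/sh
        # run from cron or a loop: keep a fallback REDIRECT rule in sync
        # with whether anything is actually listening on port 10000
        RULE='-p tcp --dport 10000 -j REDIRECT --to-ports 8080'
        if ss -ltn | grep -q ':10000 '; then
            # server is up: drop the fallback rule if it is present
            iptables -t nat -D PREROUTING $RULE 2>/dev/null
        else
            # server is down: add the fallback rule unless it already exists
            iptables -t nat -C PREROUTING $RULE 2>/dev/null ||
                iptables -t nat -A PREROUTING $RULE
        fi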


  • Symlink to /Documents /Pictures etc. in OS X Home Directory?

    - by Larry O'Brien
    I have just purchased a 120GB SSD with the intent of making it my boot drive. I'd like to keep it as lean as possible since, y'know, it's so small (Heaven help me). I've read "Can I move my home folder in Mac OS X?" and "Moving Mac OS X user folders?", which discourage moving the entire home dir to a data drive. Is it possible, and less dangerous, to leave the home directory on the boot drive but move the big data directories to a slower drive and symlink to them? I have the same thoughts about the /Applications directory, but maybe I should make that a separate question?
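
    For illustration, a minimal sketch of the move-and-symlink step, assuming the data drive is mounted at /Volumes/Data (a hypothetical name) and that the folders are not in use while you move them:

        # move a large folder to the data drive, then link it back into place
        mv ~/Pictures /Volumes/Data/Pictures
        ln -s /Volumes/Data/Pictures ~/Pictures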



  • I installed XAMPP on Ubuntu, and it's running without any of its files.

    - by CDeanMartin
    I installed XAMPP on Ubuntu, and it's running without any of its files. Well, not really, but it sure seems that way. I followed the directions on the Apache Friends website to the letter. XAMPP works fine; its sample apps run like a charm. But I have no idea how it is running. The installation should have created an otf folder, or a XAMPP folder, or an htdocs folder, according to the tutorials, but there is no trace of any of these files or folders anywhere in the file browser. Are there any Linux utilities as good as Windows Explorer that can find my missing files? I particularly need to find the var/www folder to put my .php files in.
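
    For what it's worth (the standard XAMPP-for-Linux layout, assuming the usual installer was used): everything lands under /opt/lampp, and the document root - the equivalent of var/www - is /opt/lampp/htdocs. From a terminal:

        ls /opt/lampp/htdocs                        # XAMPP's document root
        find / -type d -name htdocs 2>/dev/null     # or search the whole filesystem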

