Search Results

Search found 25998 results on 1040 pages for 'home folder'.

Page 98/1040 | < Previous Page | 94 95 96 97 98 99 100 101 102 103 104 105  | Next Page >

  • Using my old PC as a web/file server?

    - by Garrett
    I have an old desktop computer that I've been trying to sell for AGES. I guess nobody is looking for computers, because it was advertised at a dirt-cheap price on Craigslist, local papers, etc. Anyway, I was wondering if it would be worth it to set it up as a home file server, a web dev server (I have a web host for actual production use), and maybe host a few server applications (e.g. Ventrilo). The computer is actually an old Dell that I cannibalized after the motherboard was destroyed by lightning, so it has fairly new parts in it. The specs are:
      - P4 3.4GHz w/ HT and Arctic Cooling Freezer 7
      - 3GB DDR2 533 RAM
      - 80GB HDD (will upgrade the hard drive if it's even worth using as a server)
      - basic DVD-ROM
      - 430 Watt Thermaltake PSU (it might be important to note that it is only 60% efficient)
      - ATI Radeon X600 256MB
      - Antec 300 case
    It's not a really beefy machine; I just can't see giving it away or putting it in the corner to collect dust. I have Windows Server 2008 R2 Standard, and I am confident in my skills in operating most Linux operating systems. I'd also be using it to tinker with when I learn new things in my server admin classes (I'm finishing my 2nd year in college at the moment, so I'm still learning). Also, my house is quite old and the electrical wiring is pretty poor (it MIGHT be up to code; then again, where I live most people don't even know what regulations are, let alone how to spell the word...). Would it be safe to leave it running all day, and is it going to run up my electric bill because of the PSU efficiency? I only have 5 Mbit cable internet, but I won't be running very bandwidth-intensive services on it, so that should be OK. I should elaborate on why I am concerned about the power: the circuits should be fine, but I'm more concerned about fire hazard. What is the likelihood that the server could cause an electrical fire? Again, thank you all for the feedback!

    Read the article

  • How to download all file content from a folder using wget and http

    - by user1526912
    I am trying to use wget over HTTP to download all contents from folderAA below to the directory /root/sstest:

      wget -r --directory-prefix="/root/sstest" -o /root/sstest2.log http://site.com/folder1/folder2/folderAA/

    When I submit the above command, nothing is downloaded. If I submit a wget request for a specific file from folderAA, the file is actually downloaded to /root/sstest:

      wget -r --directory-prefix="/root/sstest" -o /root/sstest2.log http://site.com/folder1/folder2/folderAA/file.txt

    Can someone tell me why I cannot download all file content from folderAA at once using the first wget request?
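
    Recursive wget can only follow links it finds in whatever page the server returns for that URL, so this usually comes down to the directory index: if http://site.com/folder1/folder2/folderAA/ does not serve an auto-generated listing (or robots.txt forbids crawling it), -r has nothing to recurse into. A sketch of the kind of invocation that typically works when a listing is available (standard wget options, URL taken from the question):

      wget -r -np -nH --cut-dirs=3 \
           -e robots=off \
           -R "index.html*" \
           --directory-prefix=/root/sstest \
           -o /root/sstest2.log \
           http://site.com/folder1/folder2/folderAA/

    Here -np stops wget from climbing back up to folder2, -nH/--cut-dirs drop the host and leading path components from the saved tree, and -R discards the listing pages themselves. If the folder URL returns a 403 or an empty index, no flag combination will help and the files have to be fetched by name.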

    Read the article

  • Set user group and permissions of FTP folder back to default

    - by OrangeTux
    I tried to create a new FTP user via the command line, but I did something wrong, and now I can access the server via FTP but I can't see any files. It doesn't matter which user I log in as.

      ls -la
      drwxr-xr-x 13 root ftp 4096 2012-03-30 09:47 .
      drwxr-xr-x  7 web6 ftp 4096 2012-03-26 09:28 ..
      drwxr-xr-x  4 web6 ftp 4096 2012-03-26 13:31 actions
      drwxr-xr-x  2 web6 ftp 4096 2012-03-26 11:46 bin
      -rwxr-xr-x  1 web6 ftp 1520 2012-03-24 23:32 changelog.txt
      drwxr-xr-x  2 web6 ftp 4096 2012-03-26 13:30 css
      drwxr-xr-x  8 web6 ftp 4096 2012-03-24 22:43 external
      -rwxr-xr-x  1 web6 ftp  333 2012-03-26 15:12 .htaccess
      drwxr-xr-x  3 web6 ftp 4096 2012-02-27 15:07 images
      -rwxr-xr-x  1 web6 ftp 1606 2012-03-26 21:25 index.php
      drwxr-xr-x  2 web6 ftp 4096 2012-02-18 13:20 js
      drwxr-xr-x  2 web6 ftp 4096 2012-02-03 00:34 layout
      drwxr-xr-x  2 web6 ftp 4096 2012-03-29 23:35 library
      drwxr-xr-x  2 web6 ftp 4096 2012-03-30 09:47 log
      -rwxr-xr-x  1 web6 ftp  396 2012-03-24 15:04 menu.php
      drwxr-xr-x  2 web6 ftp 4096 2012-03-30 12:01 python
      drwxr-xr-x  2 web6 ftp 4096 2012-03-23 10:51 todo

    I can't see any dirs and files because I changed the group owner, or the rights of the group owner, of the FTP dir. How can I set the ownership of the files back to default so I can access the files via FTP again?
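
    In the listing the directory itself (".") is owned by root while everything inside it still belongs to web6:ftp, which is exactly the kind of mismatch that makes a chrooted FTP login come up empty. A sketch of putting it back, assuming web6 is the site user the FTP account maps to and /var/www/clients/web6 stands in for the directory shown above (both the path and the 755/644 values are assumptions, not taken from the question):

      # give the directory itself back to the site user and group
      chown web6:ftp /var/www/clients/web6
      # if ownership deeper in the tree was also changed, recurse
      chown -R web6:ftp /var/www/clients/web6
      # conventional web permissions; note the listing shows 755 on both files and dirs
      find /var/www/clients/web6 -type d -exec chmod 755 {} +
      find /var/www/clients/web6 -type f -exec chmod 644 {} +

    Run these as root, then re-test the FTP login; if the listing is still empty, the FTP daemon's chroot/home settings for that user are the next thing to check.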

    Read the article

  • Windows 7 - Add folder to Explorer Favorites navigation pane from the Command Line

    - by nondescript1
    In Windows 7, is there a way to add a location to the Explorer Favorites navigation pane from the command line? I'm working with systems that are frequently re-imaged, and I would like to automate adding a number of favorite folders to Explorer. I imagine these favorites are also stored in the registry; if someone knows where, I could probably automate managing them through the reg command, although this is less than ideal. I've looked at a number of locations related to Explorer suggested here, but haven't found them yet. For information on customizing the favorites section of the navigation pane with Explorer, see http://www.howtogeek.com/howto/10357/add-your-own-folders-to-favorites-in-windows-7/

    Read the article

  • Problem Uninstalling Microsoft Internet Explorer

    - by Roger F. Gay
    On Windows Vista Home Premium (x64) I am trying to uninstall Microsoft Internet Explorer. The procedures explained all over the web involve going through the Control Panel to Programs and Features: if MSIE is listed there, uninstall it in the usual way; if it is not listed there, click "Turn Windows features on or off" and deactivate it there. But Internet Explorer is not listed in either place.
    Background: I initiated some process in MSIE a couple of months ago that caused all web pages to no longer save login information or remain logged in when requested. As you can tell from the way I described that, I don't remember what it was and have no way to simply reverse it. I had a few problems with .NET Framework as well. So, I've uninstalled all browsers except MSIE and uninstalled .NET Framework. I've reinstalled .NET Framework and all other browsers. I have not been able to uninstall MSIE.
    Have tried: I tried installing over the existing installation, but auto-update must be keeping it nicely up to date. The attempt simply produced an information window telling me that my current version is more up-to-date than the new version I tried to install.

    Read the article

  • Sync a specific folder of contacts to iPhone

    - by colemanm
    Is there a simple way to organize contacts in Outlook/Entourage and only have a subset of them synchronize with the iPhone over Exchange ActiveSync? Our CEO has thousands of contacts in his mailbox, but would prefer if only a small portion of them synched to his phone over the air... The iPhone's performance takes a huge hit keeping that massive dataset in order. If he could put some of the records in subfolders or something and only sync the top level, I think that would work for him. Does anyone know if this is possible?

    Read the article

  • Why can't I connect to my router's config page with Windows 7?

    - by user17940
    I've got a Belkin wireless router, and just bought a new Dell computer with Windows 7 pre-installed. I can connect to the Internet and my home network just fine, but when I try to visit my router's configuration page at http://192.168.2.1, I get a "Connection was reset" error. Nothing I do will make the router's configuration page come up in my web browser. More background information:
      - I could always get to the router's config page from my Windows XP machine. I never had any trouble prior to getting this Windows 7 computer.
      - I can ping 192.168.2.1 successfully from my Windows 7 computer.
      - My PC is connected to the router by a physical CAT5 cable, not via wireless.
      - Every device connected to my router, including the new computer, can get to the Internet with no problem.
    Here are some things that did not solve the problem:
      - I tried turning off IPv6 in Windows.
      - I tried turning off my firewall and antivirus software.
      - I tried using https instead of http.
      - I tried disabling and then enabling the network connection in Windows.
      - I tried reverting my network card driver back to an older version.
      - I have tried both Firefox and Internet Explorer web browsers.
    Has anyone experienced something like this before, and solved it? Thanks a lot for your help!

    Read the article

  • How to fix /etc/ folder on Mac OS X

    - by justinhj
    I was following a tutorial which had this command to create a launchd.conf file in /etc/:

      sudo echo "some command" > /etc/launchd.conf

    But it wouldn't work; I got permission denied after entering my admin password. So it seemed like the permissions for the link were wrong, so I did:

      sudo chmod 755 /etc/

    But now I can't load a terminal. I get the error:

      The administrator has set your shell to an illegal value

    If I try to sudo a command now I get:

      sudo: can't open /private/etc/sudoers: Permission denied
      sudo: no valid sudoers sources found, quitting
      Process tramp/sudo root@localhost exited abnormally with code 1

    This is what the link /etc looks like; what should it look like, and how do I restore it?

      lrwxr-xr-x   1 root wheel   11 Jul 21  2011 etc -> private/etc
      /private/etc ...
      drw-r--r-- 111 root wheel 3774 Mar 26 02:25 etc

    edit: I'm using Mac OS X 10.7.3
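
    chmod on /etc follows the symlink, so what actually lost its permissions is /private/etc; without the execute (search) bit nothing can look up files inside it, which is why both the shell check and the sudoers lookup fail. The usual way out is a root shell that doesn't depend on sudo: boot into single-user mode (hold Cmd-S at startup) and run something like the following (a sketch for 10.7; 755 and root:wheel are the stock values for /private/etc):

      /sbin/fsck -fy        # single-user mode starts with the disk read-only
      /sbin/mount -uw /
      chmod 755 /private/etc
      chown root:wheel /private/etc
      ls -ld /private/etc   # should now show drwxr-xr-x root wheel
      exit                  # continue booting normally

    If sudo still complains afterwards, check that /private/etc/sudoers is mode 440 and owned by root:wheel as well.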

    Read the article

  • root folder php scripts not running in nginx

    - by Thermionix
    nginx with php-fpm on an Ubuntu 12.04 server. Attempting to access /var/www/test.php (via https://example.net/test.php) downloads the script instead of executing it. If I place test.php in a subdirectory, i.e. /var/www/test/test.php, it executes.

    root.conf:

      root /var/www;
      include php-fpm.conf;

      location ~ /\. {
          access_log off;
          log_not_found off;
          deny all;
      }

    php-fpm.conf:

      location ~ \.php$ {
          try_files $uri =404;
          fastcgi_pass unix:/var/run/php5-fpm.socket;
          include fastcgi_params;
      }

    fastcgi_params:

      fastcgi_param QUERY_STRING $query_string;
      fastcgi_param REQUEST_METHOD $request_method;
      fastcgi_param CONTENT_TYPE $content_type;
      fastcgi_param CONTENT_LENGTH $content_length;
      fastcgi_index index.php;
      fastcgi_param HTTPS on;
      fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
      #fastcgi_param SCRIPT_FILENAME $request_filename;
      fastcgi_param SCRIPT_NAME $fastcgi_script_name;
      fastcgi_param REQUEST_URI $request_uri;
      fastcgi_param DOCUMENT_URI $document_uri;
      fastcgi_param DOCUMENT_ROOT $document_root;
      fastcgi_param SERVER_PROTOCOL $server_protocol;
      fastcgi_param GATEWAY_INTERFACE CGI/1.1;
      fastcgi_param SERVER_SOFTWARE nginx/$nginx_version;
      fastcgi_param REMOTE_ADDR $remote_addr;
      fastcgi_param REMOTE_PORT $remote_port;
      fastcgi_param SERVER_ADDR $server_addr;
      fastcgi_param SERVER_PORT $server_port;
      fastcgi_param SERVER_NAME $server_name;
      # PHP only, required if PHP was built with --enable-force-cgi-redirect
      fastcgi_param REDIRECT_STATUS 200;
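
    A script being offered as a download means the request never reached a location with a fastcgi_pass, so the first thing to pin down is which server block is actually answering https://example.net/test.php - the snippets above show no listen/server_name lines, so the HTTPS vhost may be a block that never includes php-fpm.conf. For comparison, a minimal server block that does execute PHP at the document root (paths and socket taken from the question; the listen/ssl lines are placeholders to adapt):

      server {
          listen 443 ssl;
          server_name example.net;
          ssl_certificate     /etc/nginx/ssl/example.net.crt;  # placeholder paths
          ssl_certificate_key /etc/nginx/ssl/example.net.key;
          root /var/www;
          index index.php;

          location / {
              try_files $uri $uri/ =404;
          }

          location ~ \.php$ {
              try_files $uri =404;
              include fastcgi_params;   # the file shown above already sets SCRIPT_FILENAME
              fastcgi_pass unix:/var/run/php5-fpm.socket;
          }
      }

    Running nginx -t and reading every file pulled in by the include chain for the HTTPS vhost usually shows where requests for /test.php are falling through to the static file handler.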

    Read the article

  • Alias WordPress folder from within another website

    - by Bretticus
    I have a little dilemma. I wrote a custom PHP MVC framework and built a CMS on top of it, and I decided to give nginx+fpm a whirl too - which is the root of my dilemma. I was asked to incorporate a WordPress blog into my website (yah). It has a lot of content, and it's not feasible in the short amount of time I have to bring all of it into my CMS. Having used Apache for years, I'm, admittedly, a little lost with nginx. My website has the file path /opt/directories/mysite/public/ and the WordPress files are located at /opt/directories/mysite/news/. I know I just need to set up location(s) that target /news[/*] and then force all matching URIs to the index.php within. Can someone point me in the right direction perhaps? My configuration is below:

      server {
          listen 80;
          server_name staging.mysite.com index index.php;
          root /opt/directories/mysite/public;
          access_log /var/log/nginx/mysite/access.log;
          error_log /var/log/nginx/mysite/error.log;
          add_header X-NodeName directory01;

          location = /favicon.ico { log_not_found off; access_log off; }
          location = /robots.txt { allow all; log_not_found off; access_log off; }

          location / {
              try_files $uri $uri/ /index.php?route=$uri&$args;
          }

          location ~ /news {
              try_files $uri $uri/ @news;
          }

          location @news {
              fastcgi_pass unix:/tmp/php-fpm.sock;
              fastcgi_split_path_info ^(/news)(/.*)$;
              fastcgi_param SCRIPT_FILENAME /opt/directories/mysite/news/index.php;
              fastcgi_param PATH_INFO $fastcgi_path_info;
          }

          include fastcgi_params;
          include php.conf;

          location ~* ^.+.(jpg|jpeg|gif|css|png|js|ico|xml)$ {
              access_log off;
              expires 30d;
          }

          ## Disable viewing .htaccess & .htpassword
          location ~ /\.ht {
              deny all;
          }
      }

    My php.conf file:

      location ~ \.php {
          fastcgi_param QUERY_STRING $query_string;
          fastcgi_param REQUEST_METHOD $request_method;
          fastcgi_param CONTENT_TYPE $content_type;
          fastcgi_param CONTENT_LENGTH $content_length;
          fastcgi_param SCRIPT_NAME $fastcgi_script_name;
          fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
          fastcgi_param REQUEST_URI $request_uri;
          fastcgi_param DOCUMENT_URI $document_uri;
          fastcgi_param DOCUMENT_ROOT $document_root;
          fastcgi_param SERVER_PROTOCOL $server_protocol;
          fastcgi_param GATEWAY_INTERFACE CGI/1.1;
          fastcgi_param SERVER_SOFTWARE nginx;
          fastcgi_param REMOTE_ADDR $remote_addr;
          fastcgi_param REMOTE_PORT $remote_port;
          fastcgi_param SERVER_ADDR $server_addr;
          fastcgi_param SERVER_PORT $server_port;
          fastcgi_param SERVER_NAME $server_name;
          fastcgi_pass unix:/tmp/php-fpm.sock;
          # If you must use PATH_INFO and PATH_TRANSLATED then add
          # the following within your location block above
          # (make sure $ does not exist after \.php or /index.php/some/path/ will not match):
          #fastcgi_split_path_info ^(.+\.php)(/.+)$;
          #fastcgi_param PATH_INFO $fastcgi_path_info;
          #fastcgi_param PATH_TRANSLATED $document_root$fastcgi_path_info;
      }

    My fastcgi_params file:

      fastcgi_connect_timeout 60;
      fastcgi_send_timeout 180;
      fastcgi_read_timeout 180;
      fastcgi_buffer_size 128k;
      fastcgi_buffers 4 256k;
      fastcgi_busy_buffers_size 256k;
      fastcgi_temp_file_write_size 256k;
      fastcgi_intercept_errors on;

    Thanks, in large part, to @Kromey, I have adjusted my /news location, but I am still not getting the desired result. I learned to tack a ~ onto my /news location when I discovered that my php location was being matched first. With this setup I now get a 200 status, but the page is blank. Any ideas?
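
    The blank page is typical of the request reaching PHP with the wrong SCRIPT_FILENAME. Because the WordPress tree sits next to the public root rather than inside it, one workable shape is to give /news its own root one level up, so the /news part of the URI supplies the directory, and to handle its PHP inside that block so the generic php.conf location never sees it. A sketch built from the paths and socket in the question (the ^~ modifier and the nested location are the parts to verify against the rest of the config):

      location ^~ /news {
          root /opt/directories/mysite;   # /news/... is appended, landing in the WordPress dir
          index index.php;
          try_files $uri $uri/ /news/index.php?$args;

          location ~ \.php$ {
              fastcgi_pass unix:/tmp/php-fpm.sock;
              include fastcgi_params;     # in this setup that file only holds buffers/timeouts
              # the CGI variables otherwise supplied by php.conf, with SCRIPT_FILENAME
              # resolved against this block's root:
              fastcgi_param QUERY_STRING    $query_string;
              fastcgi_param REQUEST_METHOD  $request_method;
              fastcgi_param CONTENT_TYPE    $content_type;
              fastcgi_param CONTENT_LENGTH  $content_length;
              fastcgi_param SCRIPT_NAME     $fastcgi_script_name;
              fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
              fastcgi_param REQUEST_URI     $request_uri;
              fastcgi_param DOCUMENT_ROOT   $document_root;
              fastcgi_param SERVER_PROTOCOL $server_protocol;
              fastcgi_param REMOTE_ADDR     $remote_addr;
              fastcgi_param SERVER_NAME     $server_name;
          }
      }

    ^~ makes the prefix match final, which sidesteps the "my php location was being matched first" problem described at the end of the question; the nested \.php$ block then builds SCRIPT_FILENAME from /opt/directories/mysite instead of the main site's public directory.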

    Read the article

  • Looking to get a small server – need web, PHP, PostgreSQL.

    - by Javawag
    Hi all! I'm looking to get a cheap (low-end) server to serve web pages (XHTML/PHP), but I also need to be able to set up PostgreSQL on the system too. Ideally the server would have low power consumption, run Linux (I prefer Mac OS X, but a Mac Mini, although the size I'm looking for, is too much money!) and be around £100 (~$160 US).
    EDIT: Just to make it clearer, I'm looking to purchase the server hardware myself - but I want something about Mac Mini sized. I don't want to pay for hosting!
    Also, quick question - if it's to serve web pages from my home (standard ISP connection, no static IP!), what do I need in place to get this working? I'm guessing I would sign up with some service like No-IP, and register a domain to point to my No-IP address (then install the No-IP software on the server to update that with the current IP). I know the idea of running a server behind a normal ISP connection isn't very elegant, but I'd prefer to have the server where I can see it than pay over the odds for a hosting service where I have little to no control over what happens. Also, I could write my own server software for apps/etc. to connect to as well. Anyway, I'm rambling! What do you guys think?! Javawag

    Read the article

  • Folder default ACLs not inherited when new file is created

    - by Flavien
    I'm a bit of a beginner with Unix systems, but I'm running Cygwin on my Windows Server, and I am trying to figure out something related to extended ACLs. I have a directory to which I set the following ACLs:

      Administrator@MyServer ~ $ setfacl -m d:u:Someuser:r-- somedir
      Administrator@MyServer ~ $ getfacl somedir/
      # file: somedir/
      # owner: Administrator
      # group: None
      user::rwx
      group::r-x
      mask:rwx
      other:r-x
      default:user::rwx
      default:user:Someuser:r--
      default:group::r-x
      default:mask:rwx
      default:other:r-x

    As you can see, most of the default ACLs have the x bit. Then when I create a file in it, it doesn't inherit the ACLs it is supposed to:

      Administrator@MyServer ~ $ touch somedir/somefile
      Administrator@MyServer ~ $ getfacl somedir/somefile
      # file: somedir/somefile
      # owner: Administrator
      # group: None
      user::rw-
      user:Someuser:r--
      group::r--
      mask:rwx
      other:r--

    It's basically missing the x bit everywhere. Any idea why?
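
    This is how POSIX default ACLs are meant to behave rather than a Cygwin quirk: the inherited entries are combined with the permission bits the creating program asks for, and touch (like almost every program that creates plain files) asks for 0666, so execute can never survive onto a brand-new file. Directories are requested with 0777, so they do keep the x bits. A quick way to see the difference, assuming the same somedir as above:

      # plain file: creation mode 0666 strips x from the inherited entries
      touch somedir/somefile && getfacl somedir/somefile
      # directory: creation mode 0777 leaves the inherited x bits intact
      mkdir somedir/somesubdir && getfacl somedir/somesubdir

    If some files genuinely need execute permission, it has to be added explicitly after creation (e.g. chmod +x or setfacl -m on the file).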

    Read the article

  • Samba PDC plus universal folder

    - by skids89
    I know how to configure Samba on my Ubuntu box to become a PDC; however, I need some select files to be accessible to multiple users, beyond their personal files. E.g. users A-C need to be able to access a schedule saved as a spreadsheet, but user D does not; and users B-D need to be able to access confidential employee info, but user A does not. How do I set this up on top of the PDC structure? Any video tutorials would be a plus. I'm new to Linux, so documentation is a confusing, slow slog to learn from. Thanks so much in advance!
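
    On top of the PDC configuration this is normally just additional file shares restricted with "valid users" (using Unix groups via @groupname scales better than listing accounts one by one). A sketch of the two shares described, with placeholder paths and user names to adapt:

      [schedule]
          path = /srv/shares/schedule
          valid users = usera, userb, userc
          read only = no
          create mask = 0660
          directory mask = 0770

      [hr]
          path = /srv/shares/hr
          valid users = userb, userc, userd
          read only = no
          create mask = 0660
          directory mask = 0770

    After editing /etc/samba/smb.conf, run testparm to check the syntax and restart smbd; the directories themselves also need Unix ownership and permissions that let those users write to them.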

    Read the article

  • Need Help Scoping a Server to use for study (MCITP Ent Admin + SharePoint 2010)

    - by AVFamily76
    I need to study for the MCITP, but I also need to study for SharePoint 2010. I have a PowerEdge 1850 with two single-core CPUs + two 73G drives - it kills me on electricity, so I don't want to use it, and it won't do VT, but it could be one of three boxes for a lab that's cheap but will cost a lot on electricity. I was thinking...
    OPTION #1: Opteron 4170 HE (50 watt chip), 6-core, only two bills ($200), but the boards are $250, so that's an $800 box; then get another box to dual-boot Win7/Hyper-V on the cheap...?
    OPTION #2: Used quad-core - but how many VMs that are really banging away could it run at the same time? (Server 2008 R2, SQL 2008 R2, Search Server)
    OPTION #3: Study from books and just get one box that can run two VMs at the same time, even if slowly.
    The last time I had and used a home lab was five years ago, when I had a DC, SQL, Exchange and business app box; that's where I got my server skills, just banging on it for four years, but I didn't read any books, so now I have to get certified and know the material, and I'm just not sure how much attention I should pay to the box I use versus the studying time and reading. Sorry it's a subjective question, and I'm obviously open to all sorts of abuse here, but I hope you can also tell me how many VMs I can run at the same time given what they'll be doing (SQL and SharePoint FAST Search Server are resource hungry). Thanks!

    Read the article

  • Redirecting pages from the root folder to a subfolder

    - by MarcoPRT
    I have a Joomla site in the root directory of my domain, and I have a forum in the /forum subdirectory. How can I redirect visitors from the main site to the forum, while still being able to reach the main site from a link on the forum? Example: http://example.com redirected to http://example.com/forum , but I can still access the main site via the link http://example.com/index.php
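
    Assuming the site runs on Apache and .htaccess overrides are enabled (the question doesn't say), the usual trick is to redirect only the bare root URL so every other Joomla path, including /index.php, stays reachable. In the .htaccess at the document root, placed before Joomla's own rewrite rules:

      RewriteEngine On
      # send only the exact root URL to the forum
      RewriteRule ^/?$ /forum/ [R=302,L]

    A link on the forum back to http://example.com/index.php keeps working because that URL never matches the rule; switch 302 to 301 once the behaviour is confirmed.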

    Read the article

  • Linux - File was deleted and then reappeared when folder was zipped

    - by davee9
    Hello, I am using BackTrack 4 Final, which is an Ubuntu-based Linux distro. I had a directory that contained around 5 files. I deleted one of the files, which sent it to the trash. I then zipped the directory up (now containing 4 files), using this command:

      zip -r directory.zip directory/

    When I then unzipped directory.zip, the file I deleted was in there again. I couldn't believe this, so I zipped up the directory again, and the file reappeared again, but this time it could not be opened because the operating system said it didn't exist or something. I don't remember the exact error, and I cannot make this happen again. Would anyone happen to know why a file that was deleted from a directory would reappear in that directory after it was zipped up? Thank you.
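
    One explanation that fits: if directory.zip already existed from an earlier run, zip -r updates the existing archive rather than replacing it, so entries from that earlier run - including a file since deleted from disk - are carried over into the "new" archive. A sketch of how to avoid it (removing the old archive first is the portable fix; -FS is a filesync option in newer Info-ZIP releases and may not exist in the zip that BackTrack 4 ships):

      # guarantee a fresh archive that reflects only what is on disk now
      rm -f directory.zip
      zip -r directory.zip directory/

      # or, with a zip that supports it, sync the archive to the directory contents
      zip -r -FS directory.zip directory/

    The half-broken copy seen on the second attempt is consistent with a stale entry whose data was never refreshed after the on-disk file went away.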

    Read the article

  • Too Many Files In Debian Linux Folder?

    - by Dave Potts
    I've been using an external USB drive on a Debian server for backup. The drive is formatted as NTFS and mounted with ntfsmount. This was working fine, but I was filling up a directory with lots of files. Eventually the backup failed. When I then tried to look at the directory using ls, it reported:

      ls: reading directory .: Numerical result out of range

    Looking in syslog, I also saw this:

      Sep 23 07:35:31 tosh ntfsmount[28040]: Failed to read index block: Numerical result out of range.

    Is this simply that I've reached the upper limit of the number of files in a directory? If so, is there any way to extend the number of allowed files?
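
    The error is coming from the userspace ntfsmount driver rather than from Debian or NTFS itself, so two things are worth trying: mounting the volume with ntfs-3g instead (a separate and generally more robust userspace NTFS driver) to see whether the directory becomes readable again, and restructuring the backup so no single directory accumulates that many entries. A sketch of the latter, assuming the directory can be enumerated once remounted:

      #!/bin/sh
      # run from inside the overfull backup directory:
      # distribute flat files into prefix buckets named bucket_XX
      for f in *; do
          [ -f "$f" ] || continue                       # skip subdirectories and specials
          bucket="bucket_$(printf '%s' "$f" | cut -c1-2)"
          mkdir -p "$bucket"
          mv -- "$f" "$bucket/"
      done

    Adjust the bucket rule (date, prefix, hash) to match however the backup names its files, and point the backup job at the new layout afterwards.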

    Read the article

  • Nginx deny doesn't work for folder files

    - by user195191
    I'm trying to restrict access to my site to allow only specific IPs, and I've got the following problem: when I access www.example.com, deny works perfectly, but when I try to access www.example.com/index.php it returns the "Access denied" page AND the php file is downloaded directly in the browser without processing. I do want to deny access to all the files on the website for all IPs but mine. How should I do that? Here's the config I have:

      server {
          listen 80;
          server_name example.com;
          root /var/www/example;

          location / {
              index index.html index.php;   ## Allow a static html file to be shown first
              try_files $uri $uri/ @handler; ## If missing pass the URI to front handler
              expires 30d;                   ## Assume all files are cachable
              allow my.public.ip;
              deny all;
          }

          location @handler { ## Common front handler
              rewrite / /index.php;
          }

          location ~ .php/ { ## Forward paths like /js/index.php/x.js to relevant handler
              rewrite ^(.*.php)/ $1 last;
          }

          location ~ .php$ { ## Execute PHP scripts
              if (!-e $request_filename) { rewrite / /index.php last; } ## Catch 404s that try_files miss
              expires off; ## Do not cache dynamic content
              fastcgi_pass 127.0.0.1:9001;
              fastcgi_param HTTPS $fastcgi_https;
              fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
              include fastcgi_params; ## See /etc/nginx/fastcgi_params
          }
      }
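
    The allow/deny pair only lives inside location /, but a request for /index.php is matched by the regex location ~ .php$ block (regex locations take precedence over the plain / prefix), so it is never filtered there. Access-control directives are inherited from the enclosing level, so the usual fix is to move them up to the server block, where every location picks them up unless it declares its own. A sketch, leaving the location blocks from the question unchanged:

      server {
          listen 80;
          server_name example.com;
          root /var/www/example;

          # inherited by every location below that does not set its own allow/deny
          allow my.public.ip;   # placeholder IP, as in the question
          deny  all;

          # ... the location blocks from the question go here unchanged ...
      }

    The "downloaded instead of processed" symptom is worth re-checking after this change; if it persists, it usually means the request is being answered by a server block (for example a default_server vhost) that has no fastcgi_pass at all.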

    Read the article

  • Replacement for public folder workflow, I'm confused as to how SharePoint does it

    - by RodH257
    For years Microsoft has been slowly phasing out public folders; perhaps Exchange 2010 really is the LAST TIME they'll be shipped... I've heard SharePoint is the replacement, but I don't fully understand how. Can someone give me an idea of how to replace this workflow?
    In our office we have projects, and they have a project number, e.g. 10353. Each job has a public folder, organized in a hierarchy like:
      Projects
        Year
          Folder
            Subfolders
    The main subfolder we use is for general correspondence. When an email is received that relates to a project, it is dragged and dropped (or right-click, move to) into a public folder. Adding public folder favourites for each user helps with this. When an email is sent, we have a custom email form, which is the default email form but with a project number field next to the subject line. When you enter the job number in there, it carbon-copies our filing system in, which reads the job number and puts the email in the public folder for you. If you need to refer to emails, you go to the public folder and find them there. This isn't the best with large jobs, but it works OK.
    Now, I have limited experience with SharePoint (well, WSS); we've used it to do some neat discussion boards/polls etc. as an intranet site, but I haven't seen much of its integration with Outlook. The great thing about our solution is how tightly it integrates with Outlook, which is exactly where the emails are. If you want to forward an old email, you go to the public folder and forward it - simple. Any solution that replaces it should be at least as easy as this. Improvements we would like are better searching of emails, better support in Exchange (i.e. future versions), and also, custom forms in Outlook (the VBA kind) are being phased out, so avoiding these would be good. Does SharePoint do this? Or what solutions do this kind of thing?

    Read the article

  • Windows 7 can't find Ubuntu computer by hostname

    - by endolith
    I got a new Windows 7 machine and was using VNC, SSH, etc. to connect to my Ubuntu machine, and it previously worked fine connecting to the Ubuntu computer's hostname. Now it doesn't work if I use the machine's hostname, but it does if I use the local IP or the DynDNS name. I can also access it from my Android phone using the local hostname over SSH. If I try to connect with SSH to the hostname, it says "Host does not exist". VNC says "Failed to get server address". NX says "no address associated with name", and I don't see it in Windows' "Network" folder. I've rebooted everything. I've turned off Windows Firewall. It was working fine a few days ago, but now it's not. How do I figure out what's blocking it?
    Aha: it probably has something to do with Samba. I reset the Samba configuration the other day, and apparently this can affect it. http://ubuntu-virginia.ubuntuforums.org/showthread.php?t=1558925 I tried commenting out "encrypt passwords = No" as described there, but it still doesn't work.
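
    Windows resolves bare machine names on a home LAN largely through NetBIOS broadcasts, which on the Ubuntu side are answered by Samba's nmbd - so a reset smb.conf is a very plausible culprit (the Android phone likely works because its SSH client cached the IP or uses a different resolution path). A sketch of what to check; the netbios name value is a placeholder for the Ubuntu machine's hostname, and the service names vary between Ubuntu releases (older ones use /etc/init.d/samba restart):

      # /etc/samba/smb.conf, in the [global] section:
      #   netbios name = UBUNTUBOX
      #   workgroup = WORKGROUP       # matching the Windows machine's workgroup helps browsing

      testparm                        # sanity-check the config
      sudo service smbd restart
      sudo service nmbd restart       # nmbd answers the NetBIOS name lookups Windows sends
      nmblookup UBUNTUBOX             # should return this machine's own IP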

    Read the article

  • Windows folder encryption

    - by Razor
    My situation: I know that BitLocker is meant to encrypt whole drives, but I have a hard drive that is already fully partitioned and containing data. I'd like to encrypt part of one partition, leaving the rest of the partition accessible. I would very much like to avoid programs like Norton PartitionMagic (which resize/split partitions), because every time I used them I had problems with the data stored.
    Question: Is there any way/built-in alternative/3rd-party app that integrates with Windows login to encrypt one subset of a partition?
    EDIT: I heard horror stories about EFS, which is why I don't want to use it, unless there have been improvements in reliability with Windows 8. Some highlights from that article:
      In fact I've only used EFS twice in the last ten years on my own computers and on both occasions I've lost files and documents. I therefore cannot recommend you ever encrypt your files with this Windows feature.
      Unfortunately, because of incompatibilities with some differing versions of EFS, files can end up scrambled and unrecoverable.

    Read the article
