Search Results

Search found 6907 results on 277 pages for 'smart folders'.

  • Static file serving only works if root is a subfolder under public

    - by lulalala
    I am trying to serve static cache files using nginx. There are index.html files under the rails_root/public/cache directory. I tried the following configuration first, which doesn't work:

        root <%= current_path %>/public;
        # $uri always contains one slash (the first slash but not the last)
        try_files /cache$uri/index.html /cache$uri.html @rails;

    This gives the error:

        [error] 4056#0: *13503414 directory index of "(...)current/public/" is forbidden, request: "GET / HTTP/1.1"

    I then tried:

        root <%= current_path %>/public/cache;
        # $uri always contains one slash (the first slash but not the last)
        try_files $uri/index.html $uri.html @rails;

    And to my surprise this works. Why does the latter work but not the former, since they point to the same location? The permissions of the folders are: 775 public, 755 cache, 644 index.html. The thing is that my favicon sitting under public/ is served correctly:

        # asset server
        server {
            listen 80;
            server_name assets.<%= server_name %>;
            expires max;
            add_header Cache-Control public;
            charset utf-8;
            root <%= current_path %>/public;
        }

    Read the article

  • Is it possible to re-cab an Administrative Install Point?

    - by Nathaniel Bannister
    We have Acrobat 8 Pro at work, and our media was painfully out of date. Rather than install all of the machines at 8.0.0 and then do the 6 or 7 consecutive reboots Adobe expects you to be OK with, I decided I'd integrate the .msp files into the installer. After reading up on it, I figured out the exact patch order that Adobe required, extracted my CD to an Administrative Install Point, and ran the patches against it:

        msiexec /a AcroPro.msi /p AcrobatUpd810_efgj_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd811_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd812_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd813_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd816_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd817_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd820_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd822_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd823_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd825_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"
        msiexec /a AcroPro.msi /p AcrobatUpd826_all_incr.msp TARGETDIR="C:\Acrobat8" /log "output.log"

    Now I have an AIP that is fully patched to 8.2.6 (tested working prior to attempting to cab it), but it is absolutely huge (1.2 GB). What I would like to do is take the folders within the AIP and put them back into a cab file for the sake of convenience in transferring the files around. I tried the command:

        cscript "C:\Program Files\Microsoft SDKs\Windows\v7.0\Samples\sysmgmt\msi\scripts\WiMakCab.vbs" AcroPro.msi Data1 /L /C /S

    Per the guide I was using, this did produce the cab file I wanted; however, the resulting MSI fails to install with an error 2602. It's been a while since I've done something like this, and it's probably a glaring oversight on my part, but any insight would be much appreciated.

    Read the article

  • RHEL 5.3 Kickstart - How to specify the location of an individual package in the Workstation folder?

    - by Ed
    I keep getting "package does not exist" errors during the install. I made a kickstart ISO to create an unattended install of a RHEL 5.3 build machine for C++ software releases. It pulls the kickstart config file from our internal web server. This is handy; it makes it easy to test and modify without having to make a new ISO, and I plan to check it in to version control if I can get it working. Anyway, the rpm packages are located in two folders on the disk: Client and Workstation. The packages that are physically located under the Client folder install fine. It cannot find those under the Workstation folder, such as doxygen and subversion, and complains that the packages do not exist. Is there a way to specify the individual package location?

        # -----------------------------------------------------------------------------
        # P A C K A G E S
        # -----------------------------------------------------------------------------
        %packages
        @gnome-desktop
        @core
        @base
        @base-x
        @printing
        @development-tools
        emacs
        kexec-tools
        fipscheck
        xorg-x11-server-Xnest
        xorg-x11-server-Xvfb
        # Packages located in Workstation folder *** install can not find any of these ??
        bison
        doxygen
        gcc-c++
        subversion
        zlib-devel
        freetype-devel
        libxml2-devel

    Thanks in advance, -Ed

    Read the article

  • Windows 7 - How to access my documents from Windows 8 (dual boot)

    - by msbg
    I am dual booting Windows 7 and Windows 8 on two different partitions of the same drive - Win7 on (C:), Win8 on (D:). I am trying to get access to my Win7 user folder (C:\Users\Mason) in order to access my Win7 documents folder (C:\Users\Mason\Documents) from Windows 8. When I try this on Windows 8, I get an error message saying "You don't have permission to access this folder. Click here to permanently get access to this folder". When I click it, the progress bar in Windows Explorer slowly moves to the maximum and disappears. When I try opening the folder, I get the same error message. When editing security permissions for the folder in Windows 8, Explorer freezes. I do not know how to remove the restrictions from Windows 7. I checked the Windows 8 user folder (D:\Users\Mason) and it had the group or user name "S-1-5-21-936898901-3363470404-1273668825-1001". I tried copying and pasting it into the Win7 user folder permissions, but got the error "An object with the following name cannot be found". How would I access my folders?

    Read the article

  • Windows 7 - mysteriously missing free HDD space

    - by sYnfo
    I have Windows 7 installed on a 50GB (oops, it should have been 45GB, sorry) partition, and every now and then it gets full and I have to resize that partition. I always thought that was quite normal. But it happened again today and this time I'm sure it is not normal, because since the last resizing (35GB to 45GB) I did not install any new apps or anything. Also, the sum of the sizes of all root folders and files, including hidden & system ones, is ~18GB, yet Windows is indicating that all 50GB are used up... Any idea what is going on? EDIT: Great tools everyone! (SourceForge appears to be offline at the moment, I'll check WinDirStat later.) Alas, none of them have solved my problem just yet... Screenshot from SpaceSniffer: on the right there is some kind of "Unknown Space", any idea what that could be? EDIT2: After those two apps failed to help much I didn't expect it, but WinDirStat actually helped. It showed that those missing 27GB are in my Temp folder (well, that should have been my first guess anyway). There I found hundreds of ~100MB files named like HTT????.tmp. After some googling it appears to be a problem with ESET NOD32 antivirus and its ThreatSense feature. Thank you all for the help! :)

    Read the article

  • Correcting owner/permissions on a damaged directory tree in Linux

    - by mcs130
    I inadvertently made a backup copy of a directory recursively and forgot the -a (--preserve) switch when doing so. This damaged my backup directory (which contains data we need to access). The directory and all of its child folders and files comprise an installation of an application, including Postgres DB and Solr files. The original copy was used for a failed re-config attempt. Now I need to use the backup copy to start over, only the ownership of the backup copy is now root across everything and it is no longer usable (processes won't run due to the ownership problems I created when I forgot the -a on the cp -r). I've re-installed a clean copy of the application into a third location now (which has the correct owner/perms) and need to copy the owner/perms from this good directory over onto the damaged directory. What is the best way (if it is even possible) to do this? (I've Googled and seen things from Perl scripting to setfacl/getfacl to do this, but am unfortunately still confused.) Apologies if this seems a dumb question. Thanks.
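
    A minimal sketch of one way to do this with plain GNU coreutils rather than ACL tooling, assuming the clean install and the damaged backup have the same relative layout; /opt/app-clean and /opt/app-backup are placeholder paths, and the commands need root:

        #!/bin/bash
        # Walk the clean reference tree and copy owner/group and mode bits onto
        # the matching path in the damaged tree. Files that exist only in the
        # backup (e.g. database data files) are skipped and must be fixed by hand.
        REF=/opt/app-clean     # freshly installed copy with correct owner/perms
        DMG=/opt/app-backup    # backup copy whose ownership was clobbered by cp -r

        cd "$REF" || exit 1
        find . -print0 | while IFS= read -r -d '' path; do
            if [ -e "$DMG/$path" ]; then
                chown --reference="$REF/$path" "$DMG/$path"   # copy owner:group
                chmod --reference="$REF/$path" "$DMG/$path"   # copy mode bits
            fi
        done

    Anything present only in the backup (for example the Postgres data directory contents) would still need its owner set explicitly, e.g. a chown -R on that subtree.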

    Read the article

  • Mac Share Points automatically authenticate with matching Windows AD credentials from Windows

    - by Ron L
    I recently started administering an OS X server (10.8) that is on the same network as our AD domain. While setting up Mac Share Points, I encountered some odd behavior that I hope someone can explain. For the purposes of this example assume the following:

        1) Local user on OS X server: frank, password: Help.2012
        2) AD domain user: frank, password: Help.2012
        3) AD domain: mycompany
        4) OS X server hostname: macserver (not bound to AD, not running OD)

    When joined to the domain on a Win 7 computer, logged in as frank and accessing the shares at \\macserver, it automatically authenticates using frank's OS X credentials (because they are the same). However, if I change frank's OS X password, the standard Windows authentication dialog pops up, preset to use frank's AD domain (mycompany\frank). However, after entering the new OS X password, it will not authenticate without changing the domain to local (.\frank). Basically, if a user in AD has the same user name and password in OS X, it will authenticate automatically regardless of the domain. If the passwords differ, authenticating to the OS X shares must be done with the local machine as the domain. (And slightly off topic - how come an OS X administrator can access the root drives on the Mac server from Windows when accessing the Mac shares, even when they aren't shared? In other words, it will show all the shared folders from "File Sharing" plus whatever drives are mounted in OS X.)

    Read the article

  • Can't resolve offline file conflicts

    - by Bryan
    We use roaming profiles on our Server 2008 R2 domain, with folder redirection for 'desktop', 'my documents' and 'application data'. But as our network is split across two sites, we have one file server at each site, and they are configured to use domain-based DFS namespaces and DFS replication to keep things in sync. The DFS path for the replication folder is as follows:

        \\domain\folderredirection$\<username>\<redirected-folder-name>

    The real paths are:

        \\site-1-server\folderredirection$\<username>\<redirected-folder-name>
        \\site-2-server\folderredirection$\<username>\<redirected-folder-name>

    As our users all switch between sites (sometimes several times per day), our folder redirection policy has to redirect to the DFS roots rather than being hardcoded to a specific server. Both DFS and DFS-R have been proven to be working perfectly. On our laptops we use offline files for the redirected folders, and this also works fine; however, the problem is as follows: when conflicts occur in offline files, it is impossible to resolve them. I'm given the usual conflict resolution options (i.e. 'Ignore', 'Keep Both', 'Keep network' and 'Keep local'); however, not one of these options will resolve any conflict, yet no error is produced. We only use offline files on laptops, which have either Windows XP Professional or Windows 7 Professional installed. The problem is not specific to any one laptop; it affects every laptop and every conflicting file in exactly the same way. I would have thought the setup we have is common for companies that have multiple sites, so I'm hoping someone has seen this before.

    Read the article

  • Archive software for big files and fast index

    - by AkiRoss
    I'm currently using tar for archiving some files. Problem is: the archives are pretty big, contain a lot of data, and tar is very slow when listing and extracting. I often need to extract single files or folders from the archive, but I don't currently have an external index of files. So, is there an alternative for Linux that allows me to build uncompressed archive files, preserving the file attributes AND having a fast access list table? I'm talking about archives of 10 to 100 GB, and it's pretty impractical to wait several minutes to access a single file. Anyway, any trick to solve this problem is welcome (but a single archive file is a requirement, so no rsync or similar). Thanks in advance! EDIT: I'm not compressing the archives, and using tar I think they are too slow. To be precise about "slow", I'd like that:

        - listing the archive content should take time linear in the file count inside the archive, but with a very small constant (e.g. if a list of all the files is included at the head of the archive, it could be very fast);
        - extraction of a target file/directory should (filesystem permitting) take time linear in the target size (e.g. if I'm extracting a 2MB PDF file in a 40GB directory, I'd really like it to take less than a few minutes... if not seconds).

    Of course, this is just my idea and not a requirement. I guess such performance could be achievable if the archive contained an index of all the files with their respective offsets, and such an index were well organized (e.g. tree structure).
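
    A low-tech workaround, sketched here on the assumption that sticking with plain GNU tar is acceptable: keep the archive uncompressed and generate a separate listing file next to it, so lookups hit the small index instead of scanning 100 GB. Extraction of a single member still requires tar to read through the archive, so this only addresses the listing half of the wish list; all paths below are made up:

        #!/bin/bash
        SRC=/data/photos            # directory tree to archive
        ARCHIVE=/backup/photos.tar  # uncompressed archive, attributes kept with -p
        INDEX=/backup/photos.idx    # external table of contents

        # Create the archive, then the index.
        tar -cpf "$ARCHIVE" -C "$(dirname "$SRC")" "$(basename "$SRC")"
        tar -tvf "$ARCHIVE" > "$INDEX"

        # Later: look a file up in the index (fast, no archive scan)...
        grep 'holiday-2011/IMG_0042.JPG' "$INDEX"

        # ...and extract just that member.
        tar -xpf "$ARCHIVE" photos/holiday-2011/IMG_0042.JPG

    Formats that carry their own central directory (zip with -0 for store-only, for instance) get closer to constant-time listing, at the cost of weaker handling of Unix ownership.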

    Read the article

  • Thunderbird alerts when expected email does not arrive

    - by user871199
    I am on Ubuntu 12.04 using Thunderbird as my email client. Both are up to date in terms of updates. I have a bunch of nightly jobs that do the work and send a status mail. It gets tedious if you keep getting the same/similar mails every day, so I ended up writing a mail filter rule which causes the emails to end up in their respective folders automatically. If things are going OK, I really don't need to read the emails. Failure emails are sent to a different alias - if the job runs. We recently discovered that one of the jobs had not run for a few days because someone accidentally disabled it. In order to avoid such problems in future, I would like to set up Thunderbird in such a way that if I don't get an email from a given address within a given duration, it should alert me. My dream solution is to be able to set the expected frequency - some jobs do run every 4 hours. Is this possible? Can I set up Thunderbird (preferred) or another email client to remind me when an expected email does not show up? Based on the comments and answer I received, here are the reasons why I would like to use Thunderbird:

        - We are already using Thunderbird. It has calendar support via a plugin, so I suppose something is already watching the clock to remind us about events. Maybe this is just another type of event.
        - An additional job is one more failure point, and may complicate life if it has to monitor multiple hosts.
        - Additional tools - same thing, one more failure point.
        - Thunderbird runs across all the platforms we are using - Windows and Ubuntu. It sort of becomes a platform-independent solution.
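
    Thunderbird itself has no "expected mail is missing" alarm that I know of, so if an external check is acceptable despite the preference above, a small cron job on the Ubuntu box can watch for the status mails instead. A hedged sketch, assuming the account is also delivered to a local Maildir and that a working mail command exists; every path and address here is a placeholder:

        #!/bin/bash
        # Alert if no status mail from a given sender has arrived recently.
        MAILDIR="$HOME/Maildir/.status-reports/cur"
        SENDER="nightly-job@example.com"
        MAX_AGE_MIN=$((26 * 60))   # expect one mail per day, allow a little slack

        recent=$(find "$MAILDIR" -type f -mmin -"$MAX_AGE_MIN" \
                  -exec grep -l -i "^From:.*$SENDER" {} + 2>/dev/null)

        if [ -z "$recent" ]; then
            echo "No status mail from $SENDER in the last $MAX_AGE_MIN minutes" \
              | mail -s "Missing status mail: $SENDER" me@example.com
        fi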

    Read the article

  • Dual pane file manager for Mac OS

    - by Alex Kaushovik
    Is there a good customizable dual-pane file manager for Mac like Total Commander / Far Manager in Windows, or like Krusader / Midnight Commander in Linux? I used to work on Windows for quite a while and mostly used Far Manager and sometimes Total Commander; then I switched to Ubuntu Linux and used Krusader; now I have switched to Mac OS (Snow Leopard) and I'm having a hard time trying to find a good file manager... Many of the existing applications are trying to replace the Finder with "multimedia capabilities nobody cares about in a file manager - IMHO" (Path Finder, ForkLift); some of them are almost good dual-pane file managers (I can't remember examples), but none of them worked for me, mostly for one reason: I couldn't integrate my file/folder comparison utility (Araxis Merge for Mac) with them... The way it worked for me in Windows and Linux is that I would set the cursor on one file in the left pane, set the right-pane cursor on another file in the right pane, then press a hotkey that launched Araxis Merge with the comparison of those two files/folders. It was very easy to set up in Far Manager (Windows) and Krusader (Linux; actually in Linux I used "Meld" instead of Araxis Merge...) The tool I'm looking for doesn't necessarily have to be free... Thank you!

    Read the article

  • Emails sent to outlook.com not being delivered

    - by imukcedup
    I'm having an issue that is a little strange. I have a cPanel webserver that I own and have root on. I was testing out emailing and noticed some issues. When I send an email to an outlook.com address the email sends OK, but nothing is received in the outlook.com mailbox. I also don't get an 'email delivery failure notification' in any mailbox.

        2014-06-12 09:53:47 SMTP connection from [127.0.0.1]:45334 (TCP/IP connection count = 1)
        2014-06-12 09:53:47 1Wv5Rr-0003rA-2K <= [email protected] H=localhost (ourdomain.com) [127.0.0.1]:45334 P=esmtpa A=dovecot_login:joe S=667 [email protected] T="This is a test message" for [email protected]
        2014-06-12 09:53:47 SMTP connection from localhost (ourdomain.com) [127.0.0.1]:45334 closed by QUIT
        2014-06-12 09:53:50 cwd=/var/spool/MailScanner/incoming/1029481 5 args: /usr/sbin/exim -C /etc/exim_outgoing.conf -Mc 1Wv5Rr-0003rA-2K
        2014-06-12 09:53:50 1Wv5Rr-0003rA-2K SMTP connection outbound 1402581230 1Wv5Rr-0003rA-2K ourdomain.com [email protected]
        2014-06-12 09:53:50 1Wv5Rr-0003rA-2K => Test Account <[email protected]> R=archive_outgoing_email T=archiver_outgoing
        2014-06-12 09:53:52 1Wv5Rr-0003rA-2K => [email protected] R=dkim_lookuphost T=dkim_remote_smtp H=mx1.hotmail.com [65.54.188.110] X=UNKNOWN:AES128-SHA256:128 C="250 <[email protected]> Queued mail for delivery"
        2014-06-12 09:53:52 1Wv5Rr-0003rA-2K Completed

    I have checked the outlook.com spam folders and it's not in there either. This is a new IP address allocation from our ISP, and there was a block on Gmail addresses, so we know the IP was previously used for spam. But with Gmail we got a notification of failure, and I know outlook/Microsoft also send out notifications. Does anyone know what could be happening here? Thanks
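
    Independent of the mail server itself, a few DNS checks are usually the first thing to look at when a freshly reassigned IP can send but nothing arrives. A hedged sketch; the domain and IP below are placeholders:

        #!/bin/bash
        DOMAIN=ourdomain.com
        IP=203.0.113.25

        # 1) Reverse DNS: the PTR record should resolve and match the HELO name.
        dig +short -x "$IP"

        # 2) SPF record: should list this server as a permitted sender.
        dig +short TXT "$DOMAIN" | grep -i spf

        # 3) Common DNS blocklists: any answer (rather than NXDOMAIN) means the IP is listed.
        REV=$(echo "$IP" | awk -F. '{print $4"."$3"."$2"."$1}')
        for bl in zen.spamhaus.org bl.spamcop.net; do
            echo -n "$bl: "
            dig +short "$REV.$bl"
        done

    Note that the "250 Queued mail for delivery" reply from mx1.hotmail.com does not guarantee delivery; outlook.com is widely reported to silently discard mail from IPs with a poor reputation, which would fit a recently reassigned address.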

    Read the article

  • Use shared 404 page for virtual hosts in Nginx

    - by Choy
    I'd like to have a shared 404 page to use across my virtual hosts. The following is my setup. Two sites, each with their own config file in /sites-available/ and /sites-enabled/:

        www.foo.com
        bar.foo.com

    The www directory is set up as:

        www/
        foo.com/
        foo.com/index.html
        bar.foo.com/
        bar.foo.com/index.html
        shared/
        shared/404.html

    Both config files in /sites-available are the same except for the root and server name:

        root /var/www/bar.foo.com;
        index index.html index.htm index.php;
        server_name bar.foo.com;
        location / {
            try_files $uri $uri/ /index.php;
        }
        error_page 404 /404.html;
        location = /404.html {
            root /var/www/shared;
        }

    I've tried the above code and also tried setting error_page 404 /var/www/shared/404.html (without the following location block). I've also double checked to make sure my permissions are set to 775 for all folders and files in www. When I try to access a non-existent page, Nginx serves the respective index.php of the virtual host I'm trying to access. Can anyone point out what I'm doing wrong? Thanks!
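
    As a debugging aid (not a fix), a couple of curl probes can show whether a missing URL actually comes back with a 404 status or is answered by index.php, which would suggest the try_files fallback is swallowing the request before error_page ever applies; the test path below is made up:

        # Status code only: 404 means the error_page chain ran, 200 usually means index.php answered.
        curl -s -o /dev/null -w '%{http_code}\n' http://bar.foo.com/no-such-page-12345

        # Headers plus the start of the body, to see which page was actually served.
        curl -si http://bar.foo.com/no-such-page-12345 | head -n 20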

    Read the article

  • Mac OS X software always orders files alphabetically rather than by type

    - by george
    I have noticed many Mac applications sort files alphabetically rather than by type. A good example would be Coda by panic.com; the files in its file menu are organized alphabetically. I requested that they add a feature to organize files by type, and they said that it's a Finder thing. So I looked at other applications to see if they were organizing by type. I noticed Dreamweaver CS4 had this same problem, and now Dreamweaver CS5 does too. There has to be something in the Mac that does this and that I can modify. I played with Spotlight and it now displays its files by type (thinking that's what I could do), but it didn't take effect in other applications. What library are these applications using to display a file menu for their files? Here is an example - the file menu layout of Coda by panic.com (I couldn't post another link because it wouldn't let me). Can you see how everything is organised alphabetically rather than by folder? I just want the file menu to show all folders first, then all the files. 1) http://www.iaddesign.com/coda.png There must be a way to modify the Mac to let me do this.

    Read the article

  • Batch deletion of smaller files from group of files via unix command line

    - by artlung
    I have a large number (more than 400) of directories full of photos. What I want to do is to keep the larger sizes of these photos. Each directory has 31 to 66 files in it. Each directory has thumbnails and larger versions, plus a file called example.jpg. I dispatched the example.jpg files easily with: rm */example.jpg I initially thought that it would be easy to delete the thumbnails, but the problem is they are not consistently named. The typical pattern is photo1.jpg and photo1s.jpg. I did rm */photo*s.jpg but it turned out some of the files named photoXs.jpg were actually the larger versions, not the smaller ones. Argh. So what I want to do is scan each directory by file size and delete (or move) the thumbnails. I initially thought I'd just ls -R every file, extract the size of each file and save those under a threshold. The problem? In one directory the large version will be 1.1 MB and the thumb 200k. In another the large is 200k and the small 30k. Even worse, the files really are mostly named photo1.jpg - so simply putting them all in the same folder, sorting by size, and deleting in groups would not work without renaming them first, and if possible I'd prefer to keep them in their folders. I was almost resolved to just doing this all manually, but then thought I'd ask here. How would you do this task?
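
    Since the thumbnails are paired with their originals by name (photo1.jpg vs photo1s.jpg), one approach is to compare each pair inside its own directory and move (rather than delete) whichever of the two is smaller. A sketch under that assumption, using GNU stat syntax (on BSD/macOS the size call would be stat -f %z instead); the thumbs/ destination is arbitrary:

        #!/bin/bash
        # For every "<name>s.jpg" with a "<name>.jpg" sibling, move the smaller
        # of the pair into a thumbs/ subfolder so nothing is lost irreversibly.
        for dir in */; do
            mkdir -p "$dir"thumbs
            for small in "$dir"*s.jpg; do
                [ -e "$small" ] || continue
                big="${small%s.jpg}.jpg"        # photo1s.jpg -> photo1.jpg
                [ -e "$big" ] || continue       # no sibling, leave it alone
                if [ "$(stat -c %s "$small")" -lt "$(stat -c %s "$big")" ]; then
                    mv "$small" "$dir"thumbs/   # the s-file is the thumbnail
                else
                    mv "$big" "$dir"thumbs/     # naming was reversed in this pair
                fi
            done
        done

    Once the thumbs/ folders have been spot-checked, removing them all is a single rm -r */thumbs.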

    Read the article

  • How to configure VirtualBox server for performance at home

    - by BluJai
    I currently have two physical Ubuntu Server 10.10 servers at home: one serves as our firewall/router/DHCP/VPN server and the other performs double-duty as a file server and a VirtualBox host for an Ubuntu Desktop 10.10 machine which I use from remote connections (via NoMachine) for many thin-client purposes which are irrelevant to my question. What I'd like to accomplish is to consolidate the two physical machines into one which is a dedicated VirtualBox host (most likely running Ubuntu Server 10.10). Note that I'd like to stick with VirtualBox (if possible) because I'm most comfortable with it and use it on a daily basis at both home and work. Specifically, I plan to have one VM set up as file server, another as the firewall/router/DHCP/VPN (or possibly split those a bit) and a third, which is the only current VM (already VirtualBox), which is the thin-client host. My question comes down to performance and/or recommendations about the file server VM. The file server hosts about 6 terabytes of data across 4 drives. What I'd like to do is use raw disk access from the VM directly to the existing disks. However, I'm curious what performance advantage/disadvantage that would have as compared to using shared folders from the VM host and basically just have the whole drive served as a shared folder to the VM which would then serve it to the other machines on the network. I don't know if virtual disks would even work in this scenario and I certainly wouldn't want a drive to be filled with just a single file which is 1.5 TB (disk image). To add understanding of context, but not to get additional advice, I want to virtualize these machines because I intend to regularly use the snapshot capabilities of VirtualBox for the system disks (which will be virtual drives) of the VMs and I have some physical space/power needs to address (as I mentioned, this is at home).
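
    For the raw-disk part of the question, this is roughly how a physical disk gets handed to a VirtualBox guest. It is only a sketch: /dev/sdb, the VM name, the controller name "SATA" and the paths are all assumptions, and the user running VirtualBox needs read/write access to the block device:

        #!/bin/bash
        # Create a VMDK wrapper that points at a whole physical disk.
        sudo VBoxManage internalcommands createrawvmdk \
            -filename /home/vbox/disks/datadisk.vmdk -rawdisk /dev/sdb

        # Attach it to the file-server VM on its SATA controller, port 1.
        VBoxManage storageattach "fileserver" --storagectl "SATA" \
            --port 1 --device 0 --type hdd --medium /home/vbox/disks/datadisk.vmdk

        # Give the VM user access to the raw device (group name varies by distro).
        sudo usermod -aG disk vbox

    Shared folders, by contrast, go through the vboxsf emulation layer, which generally costs more than either raw disks or ordinary virtual disk images for heavy file-serving workloads.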

    Read the article

  • Recovering Data from a Linkstation LS-WXL/R1

    - by kingkool68
    I've been running a Buffalo Linkstation LS-WXL/R1 in RAID1 for a few weeks. Two nights ago we had a brief power outage. Yesterday when I tried to mount the disk it couldn't be found. I logged in to the web admin and it couldn't see any storage attached, and no arrays were available. Well, this sucks. I ended up taking the disks out and attaching them to an Ubuntu virtual machine. They show up as an array, but I can't start the array, to the best of my limited knowledge. I can still see the 6 partitions, so I'm confident the data is there. I can use PhotoRec (http://www.cgsecurity.org/wiki/PhotoRec) to recover most of the files I want; it just takes some time. Could I make an image of the data partition and mount that in Ubuntu, so I can get at the data I want through the file system? 95% of the data on my Linkstation LS-WXL/R1 is backup. I just put a few folders of images on there that I need to get back. I'm already preparing to format the disks when I'm done and rebuild the RAID1 array from scratch. Any advice would be appreciated.
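
    Imaging the partition and loop-mounting the image is a reasonable way to go, sketched below with placeholder device and path names; everything is done read-only on the source so the disks stay untouched for a later PhotoRec pass if needed:

        #!/bin/bash
        # Image the data partition, then mount the image through a loop device.
        sudo dd if=/dev/sdb6 of=/recovery/data-partition.img bs=4M conv=noerror,sync

        sudo mkdir -p /mnt/recovery
        sudo mount -o loop,ro /recovery/data-partition.img /mnt/recovery
        # Buffalo NAS data partitions are commonly XFS; install xfsprogs if the
        # mount complains about an unknown filesystem type.

        # If the partition is really an md RAID1 member rather than a bare
        # filesystem, assemble it degraded from one disk and mount that instead:
        #   sudo mdadm --assemble --run /dev/md0 /dev/sdb6
        #   sudo mount -o ro /dev/md0 /mnt/recovery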

    Read the article

  • Windows Share authentication from Active Directory Linux login

    - by Kenny
    I'm using Active Directory to log into RHEL. To do this, I followed the steps outlined here: http://www.markwilson.co.uk/blog/2007/05/using-active-directory-to-authenticate-users-on-a-linux-computer.htm I'd like to be able to read data from Windows servers' shared folders without being prompted for a password. On Windows I log into an AD domain, and when I access Windows file shares on a server on the LAN (also part of the AD domain) I can just access them with no authentication step. I've used smbclient on Linux to access these shares, but it asks for my password. I would like to be able to script access to the data on the shares, but I can't if there's a password prompt in the way. Well, I could, but it's not how I want to do it. Now, since I'm logged in using my Active Directory username & password, can't I just access the shares without jumping through that extra hoop? I know I can mount the share using something like:

        //192.168.0.5/share /mnt/windows cifs auto,username=steve,password=secret,rw 0 0

    but access will depend on who is logged in... each user logging in should have their own unique AD access privileges. Thanks for reading!
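
    Since the RHEL box already authenticates against AD, the usual way to skip the password prompt is to reuse the Kerberos ticket from that login rather than embedding credentials in fstab. A hedged sketch, assuming the AD integration leaves a ticket cache for the logged-in user (check with klist) and that cifs-utils is installed; server, share and realm names are placeholders:

        #!/bin/bash
        klist                              # confirm there is a TGT for the AD user
        # If there isn't one, obtain it manually:
        # kinit steve@MYCOMPANY.LOCAL

        # Scriptable, prompt-free access using the ticket:
        smbclient -k //fileserver.mycompany.local/share -c 'ls'

        # Or a mount where each accessing user authenticates with their own ticket
        # (multiuser relies on cifs.upcall being configured in /etc/request-key.conf):
        sudo mount -t cifs //fileserver.mycompany.local/share /mnt/windows \
            -o sec=krb5,multiuser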

    Read the article

  • Windows 7 home backup solution, with offsite provision

    - by Richard E
    I am looking for a home backup solution for my single Windows 7 (Home Premium) PC. I have about 500GB of data to backup. I would like to spend less than GBP 300 on the solution. I don't see the need to backup the whole PC, rather specific folder branches (iTunes, photos, documents, Outlook files, user folders such as desktop, favorites etc). I would like a solution that enables me to maintain backups in two separate physical locations (e.g. home and work). To facilitate this I am imagining a storage unit with slots for two removable drives, along with three separate drives. At any one time two of the drives will be being backed up to in the storage unit. The third will be located at my work. Periodically I will take one of the drives into work and leave it there, then bring the drive that was there back home, and plug it into the storage unit. It will then be backed up along with the other drive that was left in the storage unit. This approach should cover scenarios such as virus attack and fire or theft from one location. Thoughts and comments on the sanity of this approach please...

    Read the article

  • SharePoint Records Center Submitted E-mail Records not picked up

    - by Kenneth Verburg
    We have set up a new SharePoint 2007 site with a Records Repository. We're using Exchange 2007 Managed Folders to route e-mails to this repository based on the 'label' attached to the e-mail, as set in the Exchange 2007 journaling options. E-mails added to a Managed Folder get sent to SharePoint and end up in the "Submitted E-mail Records" list of the Records Repository. That's according to plan, but the e-mails are not routed to the respective document library as defined by the label. Instead an error appears in the event viewer for every e-mail listed in the Submitted E-mail Records list, on every interval of the records repository schedule (set to every two minutes for testing purposes): Value cannot be null, parameter name: g. Sending a document from the SharePoint site itself to the Records Repository via the Send To... link works fine, but e-mails get stuck in the list... We have set up document libraries in the Repository with and without content types (with names matching the label and the Record Routing rule). Any ideas what could be wrong? This is in the event log - every two minutes the following error appears in the Application Log:

        Source: Office SharePoint Server
        Category: Records Center
        Type: Error
        Event ID: 4975
        User: N/A
        Computer: SPS2007
        Description: Value cannot be null. Parameter name: g
        For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

    Read the article

  • How is Apache still working?

    - by PJ
    Recently, I decided to set up a local development environment for my work projects. I'm a PHP developer, with just enough knowledge of Linux and Apache to break things mightily. To get the local environment looking like my work environment, I had to upgrade PHP. When I did, Apache wouldn't restart. I decided I wanted to start fresh (this is where things went wrong) and that I'd reinstall Apache and PHP using MacPorts. So, I went through and tried to delete all the Apache files. Yup. I ran locate apache2 and deleted any folders that looked important. (I know, I know.) Then I ran /usr/libexec/locate.updatedb to make sure everything was up to date. I even restarted my machine, just to make sure. The issue is, http://localhost still works. As does an alias I set up, http://butler. Shouldn't they not work? Now that I'm this far in, are there any tips for how to completely remove Apache so I can start over? Worst case, I have a Time Machine backup, so I can always just restore that... Thanks in advance.
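
    Before deleting anything else, it may help to confirm what is actually answering on port 80. The sketch below uses stock OS X paths: Snow Leopard ships its own Apache under launchd control, and an already-running httpd keeps serving even after some of its files on disk are deleted, which may explain the behaviour:

        #!/bin/bash
        # Which process is listening on port 80?
        sudo lsof -iTCP:80 -sTCP:LISTEN

        # Is it the bundled Apache?
        ps aux | grep -i httpd | grep -v grep
        /usr/sbin/httpd -V 2>/dev/null | head -n 5   # version and built-in config path

        # Stop and unregister the bundled Apache before reinstalling via MacPorts:
        sudo launchctl unload -w /System/Library/LaunchDaemons/org.apache.httpd.plist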

    Read the article
