Search Results

Search found 5490 results on 220 pages for 'shadow folders'.

Page 136 of 220

  • Exchange 2007 to 2010 public folder replication error 1129

    - by Keith
    I am currently upgrading from Exchange Server 2007 to 2010. I have moved all mailboxes and the OAB, but I am having issues replicating the public folders. This is the error I'm getting in the event log on the 2007 box: Error 1129 occurred while processing a replication event. Folder: (6-11ED8367F0C) IPM_SUBTREE\Marketing\Marketing. I have looked online, and everything about this error seems to relate to an old 2003 server; we never had a 2003 server. I'm really not sure what to do at this point. Any help?
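
    As a first troubleshooting step, a minimal Exchange Management Shell sketch for checking the replica list of the affected folder and re-triggering its replication; the folder path comes from the error above, while the server name EX2007 is a placeholder and the script mention is an assumption to adapt:

        # Check which public folder databases hold replicas of the problem folder
        Get-PublicFolder -Identity "\Marketing\Marketing" -Server EX2007 | Format-List Name,Replicas

        # If the Exchange 2010 public folder database is missing from Replicas, add it
        # (AddReplicaToPFRecursive.ps1 ships in the Exchange Scripts folder), then force a sync:
        Update-PublicFolder -Identity "\Marketing\Marketing" -Server EX2007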

    Read the article

  • Windows 7 Recovery Console File Access Denied

    - by Ty Rozak
    Recently my computer crashed and was stuck in a boot loop, so I created a Windows Recovery CD and booted from that. When I use the command prompt in the recovery console, I cannot see any of my personal files or folders (such as my Users folder with My Documents). Is there a way to access these files? The only reason I need to fix the computer is to get these files off of it and onto an external hard drive. Any other fix suggestions would be greatly appreciated. I have tried both System Repair and System Restore from the Recovery Console, but neither seems to work. Thanks.
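
    If the main goal is rescuing the data, a rough sketch of doing it from the recovery command prompt; drive letters are often shuffled inside WinRE, so the C: and E: below and the YourName profile folder are placeholders to verify first:

        rem Find out which letter holds the Windows installation and which is the external drive
        diskpart
          list volume
          exit

        rem Confirm the profiles are visible, then copy one off to the external drive
        dir C:\Users
        xcopy "C:\Users\YourName" "E:\Rescue" /E /H /C /I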

    Read the article

  • Installation of Microsoft SQL Server 2008 R2 Developer Edition fails

    - by Yustme
    I'm having a problem installing MS SQL Server 2008 Developer Edition on a Vista Ultimate 64-bit machine. No matter what I try: I uninstalled the previous installation; I deleted all folders that were installed and had to do with SQL Server 2008; I cleared my registry using CCleaner; I tried Microsoft's 'Fix it' utility to uninstall leftovers. It just keeps failing while installing the setup support files, with this error message: SQL Server Setup failure. SQL Server Setup has encountered the following error: Unknown property. [OK] I'm totally out of ideas. Does anyone have a suggestion for what to look at?

    Read the article

  • ubuntu automount: only mounting drives as root?

    - by glisignoli
    I'm sharing the /mount dir with SMB so users on my network can access and use drives added to my Linux box. Users are able to read files but not write, modify or delete files or directories. I'm using Ubuntu 10.04 Server Edition with halevt installed for USB auto-mounting. As far as I know, halevt is automounting the drives to /media/, but the drives show up as:

        drwxrwxr-x 1 root root 20480 2010-12-29 20:40 disk
        drwxrwxr-x 1 root root 24576 2010-12-21 17:20 Sparta

    mount gives me:

        /dev/sda1 on /boot type ext2 (rw)
        /dev/sdb1 on /media/disk type fuseblk (rw,nosuid,nodev,sync,allow_other,blksize=4096,default_permissions)
        /dev/sdc1 on /media/Sparta type fuseblk (rw,nosuid,nodev,sync,allow_other,blksize=4096,default_permissions)

    When I umount the drives, the folders /media/disk and /media/Sparta are both removed. I tried changing the ownership with chown to nobody:nogroup, but it doesn't work (which I assume is because they are NTFS drives).
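
    Note that for NTFS volumes the ownership and permissions cannot be changed with chown after mounting; ntfs-3g derives them from the mount options. A sketch of an /etc/fstab entry that would give group write access, bypassing halevt for a drive that is always attached; the device name, mount point and numeric uid/gid (check them with the id command) are assumptions:

        # /etc/fstab: mount the NTFS disk with group write access (rwxrwxr-x)
        /dev/sdb1  /media/disk  ntfs-3g  defaults,uid=1000,gid=1000,umask=002  0  0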

    Read the article

  • Finder Sidebar Icons - How do I duplicate?

    - by Wilco
    I've noticed that some system directories, when dragged to the Finder's sidebar, utilize special small-scale icons not visible in any other place. Even when looking at one of these folders in a Finder window using the smallest possible icon size, these "special" icons don't appear (so it's not just the small version of the folder's icon). So my question is, where is this information stored? If I wanted to duplicate this behavior for an arbitrary folder, where would I need to look? I like to replace my home directory with a symlink to a location on another partition, but when I do this, I lose this sidebar icon behavior. I would love to get this back if I can.

    Read the article

  • Keeping local windows folder in sync with remote ftp folder in real time

    - by bobo
    I know it has been asked before, but I would like it to happen in real time and transparently (without the need to open a separate FTP client such as FileZilla). For example, if I edit a text file in the local folder and then save it, the change should immediately be detected and pushed to the remote folder. It can be unidirectional (changes made in the local folder have to be pushed to the remote folder, but the reverse is not necessary). I should also be able to specify some excluded files/folders which do not need to be in sync. Is there such an application that you know of?
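
    One hedged possibility, assuming a scripted tool is acceptable as long as nothing has to be operated by hand: WinSCP's scripting mode has a keepuptodate command that watches a local folder and pushes changes to the remote side as they happen. A rough sketch of a batch file, with the host, credentials, paths and the excluded *.tmp mask all as placeholders:

        rem sync-up.bat: one-way, local to remote, keeps running until closed
        winscp.com /command "open ftp://user:password@ftp.example.com/" "keepuptodate -filemask=|*.tmp C:\local\site /public_html" "exit"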

    Read the article

  • Time Machine (OSX) doesn't back up files in Mount Point or Disk Image File

    - by Chris
    Hi all, I found this Q&A (http://superuser.com/questions/148849/backup-mounted-drive-of-an-image-in-time-machine) and it prompted me to ask the following question: I have two disk images which are scripted to be mounted on login. They are always mounted to the same location, and they are encrypted TrueCrypt volumes. Time Machine (TM) will only back up the disk images the first time they are mounted, but not after that. As I modify documents within the volumes throughout the day, the modified timestamps are adjusted properly; however, TM does not back them up. TM also never backs up the mount points, which are two folders within my home directory. Any ideas as to why neither the mount points nor the image files are backed up? Do the image files have to be closed (unmounted) after being modified for TM to back them up? Thanks, Chris

    Read the article

  • Two Apache Server Root

    - by Sithu Kyaw
    I am using Apache Friends (XAMPP), installed under the C: drive at C:\xampp\. Its default document root is C:\xampp\htdocs\, so all applications need to reside in C:\xampp\htdocs\ for me to run http://localhost/myapp/. phpMyAdmin comes along with XAMPP, but it resides in C:\xampp\ and can be run from /localhost/phpMyAdmin/. When my application is moved to C:\xampp\, I cannot run it as /localhost/myapp. I would like to have two server roots, C:\xampp\ and C:\xampp\htdocs\, so that I can separate my private apps and public apps into different folders, and both can be run from http://localhost/, e.g. /localhost/myprivateapp/ and /localhost/mypublicapp/. How can I do that? I'm on Windows XP.
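
    Apache only allows one DocumentRoot per virtual host, but an Alias can map an extra URL path onto a folder outside it, which is how XAMPP exposes phpMyAdmin itself. A sketch for httpd.conf using Apache 2.2 syntax; the C:/xampp/private folder name is an assumption:

        # Map http://localhost/myprivateapp/ to a folder outside htdocs
        Alias /myprivateapp "C:/xampp/private/myprivateapp"
        <Directory "C:/xampp/private/myprivateapp">
            Options Indexes FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>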

    Read the article

  • osx 10.6.3 how does apache config work?

    - by w-
    Hi, I just got a 15" MacBook Pro, so I'm unfamiliar with how the filesystem is laid out. I noticed that I've got a few paths specifying httpd.conf:

        /etc/apache2/httpd.conf
        /opt/local/apache2/conf/httpd.conf
        /private/etc/apache2/httpd.conf

    The confs are different in lots of ways, e.g. user, group, server root, modules that are loaded, etc. The apache2 folders themselves also differ greatly. It seems that the one getting used is either /etc/apache2/httpd.conf or /private/etc/apache2/httpd.conf. I'm wondering if I might have messed up my system after installing some packages (php5, django, etc.) via MacPorts and ended up with two Apache2 instances. My questions are hence: which httpd.conf is the one being used, and what are the other files for? Thanks
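
    Two quick checks usually settle this: see which httpd binaries are actually running, and ask each binary which config it was compiled to read. Note that on OS X /etc is a symlink to /private/etc, so two of the three paths above are the same file; the MacPorts copy under /opt/local is a separate install.

        # Which Apache binaries are running right now?
        ps aux | grep [h]ttpd

        # Which config file does each binary default to?
        /usr/sbin/httpd -V | grep -E 'HTTPD_ROOT|SERVER_CONFIG_FILE'
        /opt/local/apache2/bin/httpd -V | grep -E 'HTTPD_ROOT|SERVER_CONFIG_FILE'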

    Read the article

  • local cache for NAS or network folder

    - by HugoRune
    I am planning to build a network attached storage (NAS) server. Is there a way to automatically cache frequently accessed files from the remote storage on the local PC? (I am not looking for a way to sync whole folders like rsync, but rather something that automatically and transparently caches the last 50 GB of accessed files.) Ideally I am searching for something that caches writes as well as reads, since only one PC will be accessing the server (and one day of lost changes, if the local cache is damaged, would be acceptable). I looked into Windows Offline Files, but as far as I could tell it requires manual interaction to disconnect the server or go into offline mode in order to use the cache. The server would probably be running Linux or FreeNAS; the PC runs Windows XP, but could be upgraded to 7 if required.

    Read the article

  • UAC-account-users can't see their mounted network-drives

    - by Daniel
    I wrote a few logon batch scripts in Group Policy Management which map specified network drives for specified user groups. The batch scripts work as they should as long as UAC is disabled. My problem is that users of UAC-protected accounts can't see their mapped network drives, because the logon scripts run in an elevated context. I tried to fix the problem with PsExec (-l) so that the network folders are mapped with limited user rights, but that doesn't seem to work. (PsExec is already installed on all computers, so it can run locally.) Does anyone have an idea how to fix this? I have spent a long time trying, but I have not found any solution to this specific problem.
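
    For background: with UAC an administrator logon has two tokens, and drive mappings created under the elevated token are not visible to the standard one. A commonly cited workaround, offered here only as a hedged suggestion to test rather than an official fix, is the EnableLinkedConnections registry value, which makes mapped drives visible to both tokens (reboot afterwards):

        reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v EnableLinkedConnections /t REG_DWORD /d 1 /f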

    Read the article

  • Which IMAP flags are reliably supported across most mail servers?

    - by Ben Butler-Cole
    I am writing an application which reacts to emails sent to a mailbox. It retrieves the emails via IMAP. It will be deployed to a number of systems where I do not control the mail server configuration. I would like to use IMAP flags to indicate which messages have been handled. Are the system flags sufficiently widely supported that I can reasonably depend on them in my application? Are user-defined flags sufficiently widely supported? (If the answer is "ha ha, not a chance", then I shall use folders instead.) Thanks -Ben
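
    For reference, this can also be probed per server at runtime: the untagged OK [PERMANENTFLAGS ...] response to SELECT says which flag changes will persist, and a trailing \* in that list means the server accepts arbitrary user-defined keywords (RFC 3501). A sample exchange, where MyAppHandled is just a made-up keyword:

        a1 SELECT INBOX
        * FLAGS (\Answered \Flagged \Deleted \Seen \Draft)
        * OK [PERMANENTFLAGS (\Answered \Flagged \Deleted \Seen \Draft \*)] Flags permitted.
        a1 OK [READ-WRITE] SELECT completed.
        a2 STORE 1 +FLAGS (MyAppHandled)
        * 1 FETCH (FLAGS (\Seen MyAppHandled))
        a2 OK STORE completed.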

    Read the article

  • Notepad++ doesn't launch (notepad++.exe present in the task manager)

    - by Dsandre
    Yesterday Notepad++ would run, but today it won't. Although notepad++.exe shows up in Task Manager (13 processes), the Notepad++ window never appears. I tried uninstalling and reinstalling the software (and the first launch after that works); notepad.exe is present in both the C:\Windows and C:\Windows\System32 folders. The last file opened with Notepad++ was an XML file which other software can't read because of the following error message: The '<' character cannot be used in an attribute value. What can I do, please?
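
    Given that the last file opened was a broken XML file, a plausible culprit is Notepad++ trying to restore its previous session at every start and choking on that file. A hedged sketch of resetting the saved session, assuming a standard per-user installation where the session file lives under %APPDATA%:

        taskkill /F /IM notepad++.exe
        ren "%APPDATA%\Notepad++\session.xml" session.xml.bak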

    Read the article

  • Script# reference web service / strongly typed results model

    - by user175528
    With Script# (scriptsharp), is it possible to get strong typing when calling a service defined in my web app? The only ways I can see are to:

    1 - use linked/shared files to shadow-copy my results classes / domain models across into my Script# lib
    2 - replicate my model in the Script# lib and use AutoMapper to validate
    3 - use some .tt file to code-gen

    Also, even if I can do this, how do I get around the automatic camel-casing Script# does, when my service result (asmx) won't do this? (So my JSON response will come back as UserMessage, but Script# will have changed that to userMessage.) Basically, what I am looking to achieve with Script# is better compile-time support against our domain model when calling and processing services in JavaScript, so something like this:

    Scriptlet:

        public static class MyScriptlet
        {
            public static void Main()
            {
                MyService.Service1("hello", ProcessResponse);
            }

            public static void ProcessResponse(MyService.Service1ResponseData resp)
            {
                jQuery.Select("#Message").Text(resp.UserMessage);
                jQuery.Select("#Detail").Text(resp.UserDetail);
            }
        }

    Service (in our web app):

        public class MyService
        {
            public class Service1ResponseData
            {
                public string UserMessage { get; set; }
                public string UserDetail { get; set; }
            }

            public Service1ResponseData Service1(string user)
            {
                return new Service1ResponseData() { UserMessage = "hi", UserDetail = "some text" };
            }
        }

    Read the article

  • Apache2 - setting PERL5LIB via SetEnv under CGI

    - by j0nes
    Hi, my setup is as follows: I have one Apache2 webserver running different vhosts; one vhost is for the production website, the other is for a staging/preview system. Both vhosts have different DocumentRoots and also different (Perl) CGI folders. The Perl modules used by each of these vhosts should come from different directories, so I did the following:

        <VirtualHost...>
            ServerName production
            SetEnv PERL5LIB /home/production/modules
        </VirtualHost>

        <VirtualHost...>
            ServerName staging
            SetEnv PERL5LIB /home/staging/modules
        </VirtualHost>

    However, I just noticed that in my Perl CGI scripts both paths end up in @INC, so I cannot separate the staging modules from the production modules; i.e. the SetEnv directive does not seem limited to a single virtual host, but appears to work globally. How can I solve this? Thanks! Jonas
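
    If the PERL5LIB values really do leak between hosts, one defensive workaround (an assumption to try, not a documented Apache fix) is to stop relying on PERL5LIB altogether: give each vhost its own variable, e.g. SetEnv MYAPP_LIB /home/staging/modules, and have the scripts add only that path to @INC themselves:

        # at the top of each CGI script; MYAPP_LIB is a made-up variable name set via SetEnv per vhost
        BEGIN {
            unshift @INC, $ENV{MYAPP_LIB} if $ENV{MYAPP_LIB};
        }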

    Read the article

  • Permission denied when copying on a fileshare in Finder, but copying via command line works

    - by smokris
    I'm trying to copy files on an SMB fileshare. When I attempt to copy the files in Finder, I get the following error: The operation can’t be completed because you don’t have permission to access some of the items. Copying via Terminal.app (using a simple cp command) works just fine. Permissions on the folders (as seen from the computer attached to the fileshare) are as follows:

    Source:

        dr-xr-x---    2 smokris  staff    16384 Oct 13 10:55 .
        dr-xr-x---@  61 smokris  staff    16384 Oct 13 10:56 ..
        -r--r-----    1 smokris  staff    53970 Oct 13 10:55 ._IMG_3823.JPG
        -r--r-----@   1 smokris  staff  3135600 Oct 13 10:55 IMG_3823.JPG

    Destination:

        drwxrwx---    2 smokris  staff    16384 Apr  9 10:17 .
        drwxrwx---    3 smokris  staff    16384 Apr  9 10:15 ..

    Any ideas?

    Read the article

  • Folder redirection save times are awful slow

    - by wbmeu
    I recently set up folder redirection for Documents on Server 2008, but it's painfully slow at the moment. My users are all using Visual Studio 2010, and a save takes 20-30 seconds (whereas it used to take 2 seconds locally). I understand this is because the files are being saved to the server, and that takes time (though I did think it would be faster over a gigabit link, with the servers on the same network). I enabled Offline Files on the share, set the option to All files or folders, and enabled Optimize for performance. I thought this would pull all the files down locally (which I think it did), allow local editing of those files, and synchronize them quietly in the background from time to time, but it does not do the last part; it saves right to the share. Is there any way I can speed this process up a bit? Any other tweaks I can do?

    Read the article

  • Vista machine hardwired cannot see xp laptop on wireless

    - by Kahega
    I have a laptop with Windows XP connected to my wireless router. I am trying to view the shared files on the laptop from my Vista desktop computer, which is hardwired to the same router. I can see the desktop PC fine from the XP laptop and can view its shared folders, but I cannot see the XP laptop from the desktop PC. I have tried installing the link update for XP; that is not the issue. I have also tried turning off firewalls, no dice. I have scoured Google for this issue and do not see any resolutions. Any suggestions would be appreciated.

    Read the article

  • Who should own the root folder of a drive?

    - by Gaia
    All partitions are NTFS. The system is Windows 7 Pro and it does not belong to a domain. I do use shared folders occasionally (both via the HomeGroup and old-school sharing). Should I set the owner to be Administrators or SYSTEM for A) a fixed drive? B) a removable drive? C) Is it OK to make every object on the drive inherit the new ownership? I just realized that I had some messy settings, because I turned UAC on for the first time in years and I am now getting some undesirable prompts. I already have permissions set properly; I am only concerned with ownership.

    Read the article

  • Nginx user subdomains, should I proxy_pass?

    - by Kevin L.
    I am trying to set up user subdomains, serving content from specific folders: www.example.com/username served as username.example.com (just like GitHub Pages). I've looked at Nginx rewrites, but I don't want the browser to redirect; I want the domain to stay username.example.com. Anyway, a comment on this question says that I cannot rewrite the host, only proxy to it. I tried to set up a proxy_pass, but all of the documentation and examples show it being used to (obviously) proxy to a service on another host or port, whereas in my case I want to proxy to another location on the same host and port. Is this the appropriate way to tackle this problem, and if so, what is the right Nginx config syntax?
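
    A sketch of the kind of server block that can do this, capturing the username from the Host header and proxying back to the same server on the path form of the URL. The example.com domain and the loopback address are placeholders, and it assumes the main www.example.com site sits in its own server block (exact server_name matches take priority over this regex, which avoids a proxy loop):

        server {
            listen 80;
            # capture "username" from username.example.com
            server_name ~^(?<username>[a-z0-9-]+)\.example\.com$;

            location / {
                # fetch www.example.com/username/... internally; the browser never sees a redirect
                proxy_pass http://127.0.0.1/$username$request_uri;
                proxy_set_header Host www.example.com;
            }
        }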

    Read the article

  • Connecting Snow Leopard 10.6.4 to a Linux shared folder using Samba

    - by Vittorio Vittori
    Hi, I'm trying to connect to a web server running CentOS 5.5 where I've shared a folder, using a Snow Leopard 10.6.4 client, without success. On CentOS I started the Samba service, created a Samba user with a password, and then tried to connect to the server with smb://10.0.0.7 (the IP of the machine), entering the username and password I had previously created. The server returns the list of shared folders in the Finder browser, but when I click the folder I want, it returns this error (translated from Italian): Connection failed. There was an error connecting to "smb://10.0.0.7". Please verify the name or the IP address of the server, and try again. How can I solve this connection problem?
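
    For comparison, a minimal smb.conf share definition on the CentOS side for a user created with smbpasswd; the share name, path and username are placeholders. The POSIX permissions on the path and the SELinux context (e.g. samba_share_t) also have to allow access, which is a common cause of being able to list shares but not open them:

        # /etc/samba/smb.conf
        [webfiles]
            path = /var/www/shared
            valid users = sambauser
            read only = no
            browseable = yes

        # then: smbpasswd -a sambauser && service smb restart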

    Read the article

  • IIS 7.5 401.3 Access Denied

    - by Jeffrey
    I am having a weird issue with IIS 7.5 on Windows 2008 R2 x64. I created a site in IIS, manually created a test file index.html, and everything worked. When I try to do a deployment, I copy all the files from my local PC to the IIS server, try to access index.html (this is the properly deployed file) and get a 401.3 Access Denied error. I then manually recreate index.html, copy the content into this newly created file, and the page is accessible again... I just can't figure this out. So the issue is that IIS 7.5 can't serve files that have been copied from other PCs. I tried to reset and re-apply permission settings on the copied folders/files, but nothing has worked. Please help. Thanks! By the way, the files I copied are just some HTML cut-ups, i.e. generic HTML, CSS and image files, nothing special.
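
    A 401.3 on copied content but not on locally created files usually points at NTFS ACLs that don't grant read access to the IIS identities. A hedged sketch for resetting the ACLs to what the site root would hand down, or granting read/execute explicitly; the site path is a placeholder:

        rem Re-apply inherited permissions from the site folder to everything copied in
        icacls "C:\inetpub\wwwroot\mysite" /reset /T

        rem Or grant read/execute explicitly to the IIS identities
        icacls "C:\inetpub\wwwroot\mysite" /grant "IIS_IUSRS:(OI)(CI)RX" /T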

    Read the article

  • Automatically Snapshoting AWS instances (or other back up strategy)

    - by user1172468
    I just realized that my AWS instance count has risen into the double digits. I'm currently backing up portions of my folders and databases by moving them off to a backup instance. What I think I should be doing instead is taking snapshots of the instances automatically and persisting them on S3, so I have a rolling 7-day collection of daily backups. There is a question asking the same thing here; however, the answers don't go into depth. The closest answer seems to be: use a cron job to snapshot the instance. So do I run the cron job on the instance itself, or do I use a micro instance to run these snapshots? Could I get an example script or command for, say, a Linux flavor? What software must I have installed to get this to run? Thanks.
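
    One hedged sketch, using the AWS CLI rather than the older EC2 API tools: EBS snapshots are taken per volume and AWS stores them in S3 behind the scenes, so the cron job can run anywhere that has credentials, whether a small management instance or the instance itself. The volume ID, region and retention cleanup are assumptions to fill in:

        # /etc/cron.d/ebs-snapshot : snapshot one volume daily at 03:00 (needs awscli and credentials configured)
        0 3 * * * root /usr/bin/aws ec2 create-snapshot --region us-east-1 --volume-id vol-0abc1234 --description "daily backup $(date +\%F)"

    Pruning snapshots older than seven days would be a second scheduled job (describe-snapshots plus delete-snapshot), and for consistent database snapshots the volume should ideally be quiesced or frozen first.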

    Read the article

  • Unable to copy files previously extracted from archives created on a Mac, even after claiming ownership

    - by Maxim Zaslavsky
    I reinstalled Windows on my computer today, and backed up my music to a USB drive. Now I'm trying to copy the files onto my fresh Windows partition, but I'm unable to copy files that I had obtained within my previous Windows installation from zip archives created on Macs. When I try to copy those previously extracted files, I get an error saying that I need permission from S-1-5-21-...-1000 (a bizarre long ID). The first thing I tried was to take ownership of the files by setting my new user account as the owner, but that resulted in errors saying that I need permission from myself! Some Googling suggested adding antivirus exclusions, so I excluded the relevant folders from Microsoft Security Essentials, but the issue persists. For what it's worth, it seems that some program (so far I've only installed Chrome, Microsoft Security Essentials, and the latest Windows updates) created an empty folder named 601c8c7f0e0c03f725 at the root of my external USB hard drive. What gives?
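
    The S-1-5-21-...-1000 SID is the orphaned user account from the previous Windows installation, which still sits in the files' ACLs. A hedged sketch of taking ownership and replacing those stale ACLs from an elevated command prompt; the D:\Music path is a placeholder for wherever the files are:

        takeown /F "D:\Music" /R /D Y
        icacls "D:\Music" /reset /T
        icacls "D:\Music" /grant "%USERNAME%:(OI)(CI)F" /T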

    Read the article

  • highcharts correct json input

    - by Linus
    I am trying to make a basic column chart. I have looked at the examples, but I'm not sure why I don't see any graph (lines). I can see the title and subtitle appear, and there are no JavaScript errors in Firebug. Any help please?

        $(function () {
            var chart;
            $(document).ready(function() {
                chart = new Highcharts.Chart({
                    chart: {
                        renderTo: 'container',
                        type: 'column',
                        events: { load: requestData }
                    },
                    title: { text: 'Some title' },
                    subtitle: { text: 'subtitle' },
                    xAxis: { categories: [], title: { text: null } },
                    yAxis: { min: 0, title: { text: 'y-Axis', align: 'high' } },
                    tooltip: {
                        formatter: function() {
                            return ''+ this.series.name +': '+ this.y +' ';
                        }
                    },
                    plotOptions: { bar: { dataLabels: { enabled: true } } },
                    legend: {
                        layout: 'vertical', align: 'right', verticalAlign: 'top',
                        x: -100, y: 100, floating: true, borderWidth: 1,
                        backgroundColor: '#FFFFFF', shadow: true
                    },
                    credits: { enabled: false },
                    series: []
                });
            });

            function requestData() {
                $.ajax({
                    url: 'test.json',
                    success: function(data) {
                        options.series[0].push(data);
                        chart.redraw();
                    },
                    cache: false
                });
            }
        });

    My JSON input file is below:

        [
            { name: 'name1', y: [32.6,16.6,1.5] },
            { name: 'name2', y: [6.7,0.2,0.6] },
            { name: 'name3', y: [1,3.7,0.7] },
            { name: 'name4', y: [20.3,8.8,9.5] },
            { name: 'name5', y: [21.5,10,7.2] },
            { name: 'name6', y: [1.4,1.8,3.7] },
            { name: 'name7', y: [8.1,0,0] },
            { name: 'name8', y: [28.9,8.9,6.6] }
        ]

    Read the article
