Search Results

Search found 15049 results on 602 pages for 'folder shortcut'.


  • Crashplan Is Offering Free Yearly Plans [Black Friday]

    - by Jason Fitzpatrick
    Even if you’re eschewing Black Friday and all the shopping that goes with it, we’ve got a deal for you that’s too good to pass up: a free year of remote backup from CrashPlan. CrashPlan is running a fantastic Black Friday promotion. Starting at 6 AM EST they’re offering all their plans at 100% off; we just picked up a family plan, normally $119 a year, for $0. Every two hours they’ll incrementally decrease the discount until the sale ends Monday evening (even if you miss the early part of the sale, the discount will still be 42% off come Monday). This is an absolutely fantastic deal on a service everyone can use. If you followed along with our guide to using CrashPlan to back up your data at a friend’s house for free, this is the perfect time to add a year of hosted backup as another layer in your backup routine.

    Read the article

  • Platform Builder: Removing the Version Information from the Desktop

    - by Bruce Eitman
    The question of how to remove the version information from the desktop has been around for a long time, and it came up again today. The question is about the string displayed on the desktop that looks like one of these, depending on the OS version: Windows Embedded CE v6.00 (Build xxxx on xxxx), Microsoft Windows CE v5.00 (Build xxxx on xxxx), or Microsoft Windows CE .NET v4.20 (Build xxxx on xxxx). I have looked into this in the past, but never really had a definitive answer. I have an answer now. The short answer is that the version information is displayed if the code is built without SHIP_BUILD defined. I have to be honest: I have given this answer in the newsgroups in the past, but I still had questions. My questions came from different build machines giving different results. I have noticed that some engineers’ workstations would display the version information, while others did not. I always stopped short of spending time investigating further because our release build machines never displayed the version information. But we do not typically define SHIP_BUILD for our releases, because our customers want or need the debug output. So today I dug further into the question. The answer is actually quite simple. Microsoft builds the retail shell libraries with SHIP_BUILD defined and releases those libraries with Platform Builder. Normally the source code does not need to be rebuilt during Sysgen, so the libraries that Microsoft delivered are linked to create the Explorer shell. So typically the Explorer shell displays the version information for debug builds, but not for retail builds. The trouble comes when the source code is forced to be rebuilt for a retail build. This might happen if an engineer uses “Build and Sysgen” or builds the Public\Shell folder from the command line with the clean flag. I am not sure whether Build and Sysgen will cause the problem or not – I have never used Build and Sysgen and I strongly advise against using it (see Platform Builder: Don’t use Build and Sysgen). Copyright © 2010 – Bruce Eitman All Rights Reserved

    Read the article

  • .NET Reflector & .NET Reflector Pro 6.1 have been released

    - by Bart Read
    .NET Reflector 6.1 and .NET Reflector Pro 6.1 have been released. You can download them from: http://www.red-gate.com/products/reflector/index.htm .NET Reflector is a class browser and disassembler for .NET assemblies. .NET Reflector Pro is a Visual Studio debugging extension that allows you to step through third-party and framework assemblies as if they were built from your own source code. This release fixes several problems that were present in the 6.0 release:
    - Support for using a copy of Reflector.cfg stored alongside Reflector.exe has been re-enabled, so users upgrading from 5.x releases will not lose their settings.
    - Fixed an unhandled exception on exit of Visual Studio when the .NET Reflector add-in is used in conjunction with the TestDriven.NET add-in.
    - Added better support for dealing with framework assemblies, which only contain metadata, in the "Referenced Assemblies" folder.
    - Fixed a problem where attempted decompilation with the CppCliLanguage add-in would lead to display of a page on the Red Gate website.
    - Added an option to activate .NET Reflector Pro to the .NET Reflector menu in Visual Studio, after feedback from a number of users that it was hard to figure out how to activate the product.
    For more details about the products please visit: http://www.red-gate.com/products/reflector/index.htm

    Read the article

  • SMTP server on Win2008, SPF etc.

    - by Ronnie
    I want to be able to send outgoing email from my website, so I want to set up the Windows Server 2008 SMTP service to send it while respecting all the anti-spam rules. My checklist is:
    - The SMTP service should accept outgoing email only from internal sites: I will limit relaying to 127.0.0.1. Is this correct?
    - It should have SPF, DKIM and every other policy available set up so that mail is not marked as spam: how would you configure that for the built-in SMTP service? Should I use another kind of server, like hMailServer?
    - I should be able to send around 30 emails from each user session on the website without making the user wait until the email is actually sent (I thought about saving them to a folder and then using a batch job to send them asynchronously). What are my options?
    What other steps would you add to be sure that the outgoing email is not marked as spam?

    Read the article

  • Can Windows-Security-SPP block execution of .exe?

    - by Kirk Marple
    We're seeing a strange situation, where some executables won't run from a Windows command prompt (running as admin). Just running the command (say, filename.exe) gives no response on the console. No errors, no output, nothing. If we copy over the same Windows .exe from a different folder, it "magically" starts working, and we see the default console output. (Happens both on Win7 x64, and Win2008R2 x64. Application is running as 32-bit process.) At the time when it accesses the .exe, I can see events in the application and system logs regarding Windows-Security-SPP, and it makes me believe that the .exe is being blocked from execution. Does this sound familiar?

    Read the article

  • Backup to Synology NAS using rsync or NFS and hardlinks

    - by danilo
    I want to back up data from a Windows (Vista) computer to a Synology NAS (210j). The NAS supports FTP, SMB, NFS and also allows an rsync daemon to be set up. I want to back up different folders to the NAS, but I'd prefer to use the hard-link method to save disk space (like this script does). With this method, a new folder is created for every backup, but if the file already exists on the target, only a hard link is created. The filesystem on the Synology device is ext3, so I probably can't use rsyncbackup, as it is made for NTFS. Is there another way to do this backup with hard-link support?
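    Since ext3 supports hard links, the --link-dest approach itself will work against the NAS; only the Windows side needs an rsync client (cwRsync or rsync under Cygwin, for example). A minimal sketch of one run, where the "backup" module name, the host name and the paths are all assumptions rather than settings from the question:

        #!/bin/bash
        # Each run writes a new dated folder; files that already exist in the
        # previous folder are hard-linked on the NAS instead of copied again.
        PREV=2013-06-01                    # folder written by the previous run
        TODAY=$(date +%Y-%m-%d)
        rsync -av \
            --link-dest="../$PREV" \
            /cygdrive/c/Users/danilo/Documents/ \
            "rsync://synology-nas/backup/$TODAY/"

    A relative --link-dest path is resolved against the destination directory on the receiving side, so unchanged files end up as extra directory entries pointing at the same data blocks rather than as new copies.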

    Read the article

  • What's consuming HDD space

    - by Umair Mustafa
    I have a single partition of 92 GB in which I installed Ubuntu 12.04, and for some unknown reason a message pops up saying that I only have 1 GB of HDD space left. I ran sudo du -hscx * on / and on /home. /home gave me this result:
        4.0K    C:\nppdf32Log\debuglog.txt
        0       convertedvideo.avi
        176M    Desktop
        16K     Documents
        169M    Downloads
        4.0K    examples.desktop
        17M     file.txt
        4.0K    Music
        984K    Pictures
        4.0K    Public
        320K    Red Hat 6.iso
        2.5M    syslog-ng_3.3.6.tar.gz
        4.0K    Templates
        8.0K    terminal.png
        1.2M    Thunderbird Attachments
        698M    ubuntu10.04LTS.iso
        16K     Ubuntu One
        4.0K    Untitled Folder
        4.0K    Videos
        21G     VirtualBox VMs
        22G     total
    And / gave me this result:
        81G     home
        0       initrd.img
        0       initrd.img.old
        833M    lib
        16K     lost+found
        68K     media
        4.0K    mnt
        260M    opt
        du: cannot access `proc/8339/task/8339/fd/4': No such file or directory
        du: cannot access `proc/8339/task/8339/fdinfo/4': No such file or directory
        du: cannot access `proc/8339/fd/4': No such file or directory
        du: cannot access `proc/8339/fdinfo/4': No such file or directory
        0       proc
        640K    root
        908K    run
        8.6M    sbin
        4.0K    selinux
        4.0K    srv
        0       sys
        148K    tmp
        3.3G    usr
        436M    var
        0       vmlinuz
        0       vmlinuz.old
        86G     total
    If you look at the result returned for /, it shows that /home is consuming 81 GB, but on the other hand /home itself reports only 22 GB. I can't figure out what is consuming the HDD; I have not installed anything except virtual machines. Update: perpetrator found using Disk Usage Analyzer.
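    The usual reason for this kind of gap (81 GB for /home from the top-level scan versus 22 GB from inside it) is that the shell glob * skips hidden dot files and dot directories, so caches, Trash and similar folders never appear in the per-folder listing. A hedged check that includes them; the user name in the path is an assumption:

        # Summarize visible *and* hidden entries in the home directory,
        # staying on one filesystem, with the largest entries last
        sudo du -xsch /home/umair/.[!.]* /home/umair/* 2>/dev/null | sort -h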

    Read the article

  • Failed to load viewstate.The control tree into which viewstate is being loaded...etc

    - by alaa9jo
    Two days ago a colleague of mine tried to publish an ASP.NET website (built in VS2008 against framework 3.5) to our server. He configured everything in IIS (he made sure that the selected ASP.NET version was 2.0) and launched the website. At first it was working great, but when he clicked on a specific TreeView... BOOM: "Failed to load viewstate. The control tree into which viewstate is being loaded must match the control tree that was used to save viewstate during the previous request. For example, when adding controls dynamically, the controls added during a post-back must match the type and position of the controls added during the initial request." That page contains two controls, a TreeView and a PlaceHolder; when the user selects any node, its controls are created dynamically inside the placeholder. The first selection works fine, but when the user selects another node the error appears. He called me to help him with this issue. For me it was the first time seeing such an issue, so I scratched my head and decided to eliminate the possible causes one by one: on the development machine it works perfectly; he published the website to his local IIS and again it worked perfectly; I took a copy of the website and published it on my laptop with no issues at all. So this means it is not an issue in the code. Therefore something was missing or wrong on our server [it runs Windows Server 2003]. We went to the server and checked the web.config and the IIS configuration... nothing wrong so far, so I decided to check whether framework 3.5 was installed or not, and the answer: it wasn't installed. Of course he had assumed it was installed, and there was nothing in the "ASP.NET version" setting in IIS to tell him otherwise, because frameworks 3.0 and 3.5 are not listed there [2.0 is listed instead]; the only ways to check are to look for the framework under [WINDOWS Folder]\Microsoft.NET\Framework or to check Add or Remove Programs. The obvious solution for his case: we installed Framework 3.5 SP1 on our server, restarted the machine, and it worked! If anyone faced the same issue and solved it with the same solution or a different one, please post it here to share the experience.

    Read the article

  • Safari corrupting downloads?

    - by Kaji
    First off, a bit of background: I had to do an erase and install about 2-3 weeks ago, so this is a fresh, up-to-date installation of Snow Leopard we're dealing with. That said, I decided recently to branch out from simply programming PHP in a text editor and explore some of the other technologies I keep hearing about, and picked up Drupal, Joomla, and the Zend Framework from their respective official sites. Latest complete, stable builds for all 3. Drupal and Joomla downloaded without a problem, but when I put them in my /~username/Sites folder, XAMPP pretends they're not there, even if I restart Apache or the laptop itself. Zend's archive won't open at all. Is Safari corrupting the downloads, or are there other issues in play that can be investigated?

    Read the article

  • how to bypass internal DNS?

    - by fabjoa
    This is about Ubuntu, but it should be pretty much the same on all Linux flavors. Let's say I add an entry to my /etc/hosts such as:
        127.0.1.12 facebook.com
    and an Apache virtual host such as:
        <VirtualHost 127.0.1.12>
            ServerName facebook.com
            DocumentRoot /var/www
        </VirtualHost>
    When I open my browser and send a GET request to facebook.com, Firefox will browse my /var/www folder. Question: how could I fetch the real facebook.com domain (e.g. using wget in bash) without erasing the entry in /etc/hosts or my Apache VirtualHost? In other words, how could I bypass the internal DNS override?
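    One possible approach, sketched below: ask an external resolver for the real address (the /etc/hosts entry only affects name lookups on the local machine) and then fetch from that IP while sending the original Host header. The resolver 8.8.8.8 and the output file name are just examples:

        #!/bin/bash
        # Resolve facebook.com against a public DNS server instead of /etc/hosts
        REAL_IP=$(dig +short facebook.com @8.8.8.8 | head -n 1)
        # Fetch the page from that IP, presenting the expected Host header
        wget --header="Host: facebook.com" "http://$REAL_IP/" -O facebook.html
        # Newer curl versions offer the same idea more directly:
        # curl --resolve facebook.com:80:$REAL_IP http://facebook.com/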

    Read the article

  • Automatically Log In & Start a Windows Program on Amazon's EC2 Service

    - by darkAsPitch
    How can I automatically start a program on Amazon's EC2 Windows 2008 web servers? For example, if I wanted to test the "Digg effect" on a web page of mine, how could I start 100 Windows 2008 servers at once, each loading one (or two?) instances of the Firefox web browser? I have placed a sample batch file in the Windows startup folder that echoes the time it was called, but it is only run when I actually log in remotely via the Remote Desktop Protocol. I don't want to have to log in to 100 servers to get my software to run :P What can I do? I am using this Windows 2008 Datacenter, Amazon-supplied AMI specifically: ami-a2698bcb

    Read the article

  • Error "fileid changed" when accessing files over NFS

    - by Roman Prikhodchenko
    I have an nfs-kernel-server configured and running on Ubuntu 10.04 Server. Its exports are:
        /export      THIRD_SERVER_IP(rw,fsid=0,insecure,no_subtree_check,async) SECOND_SERVER_IP(rw,fsid=0,insecure,no_subtree_check,async)
        /export/ebs  THIRD_SERVER_IP(rw,fsid=0,insecure,no_subtree_check,async) SECOND_SERVER_IP(rw,nohide,insecure,no_subtree_check,async)
    I mounted the exported folder on the second server:
        mount -t nfs4 -o proto=tcp,port=2049 NFS_SERVER_IP_HERE:/ebs /ebs
    and it works just fine. I mounted it on the third server, but I cannot access files from it:
        ls -l /ebs
        ls: reading directory /ebs: Stale NFS file handle
        total 0
    The syslog on the third server says:
        kernel: [11575.483720] NFS: server NFS_SERVER_IP_HERE error: fileid changed
        kernel: [11575.483722] fsid 0:14: expected fileid 0x2, got 0x6e001
    Some info:
        uname -r  ->  2.6.32-312-ec2
        uname -m  ->  i686
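    One detail worth a second look, although it is only a guess from the quoted exports: for the third server both /export and /export/ebs carry fsid=0, so two different directories claim to be the NFSv4 pseudo-root for that client, which fits the "expected fileid 0x2" complaint. A sketch of /etc/exports keeping a single root for both clients (re-export with exportfs -ra afterwards):

        /export      THIRD_SERVER_IP(rw,fsid=0,insecure,no_subtree_check,async) SECOND_SERVER_IP(rw,fsid=0,insecure,no_subtree_check,async)
        /export/ebs  THIRD_SERVER_IP(rw,nohide,insecure,no_subtree_check,async) SECOND_SERVER_IP(rw,nohide,insecure,no_subtree_check,async)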

    Read the article

  • Collectd on ubuntu with perl plugin support

    - by Roman
    For days I have been struggling to enable Perl plugin support for collectd. I have installed collectd 5.4.0 on an AWS Ubuntu 13.04 instance, configured and compiled it, and I have even installed libperl-dev. But when I run ./configure from the collectd source tree, it still says "perl . . . (needs libperl)". Enabling the Perl plugin in collectd.conf didn't help either; in the logs I see:
        plugin_load: Could not find plugin "perl" in /opt/collectd/lib/collectd
    and indeed there is no perl.so in that folder. Can someone help me out with this?
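    A hedged checklist, assuming the stock autotools build and collectd's usual --enable-<plugin> configure switches; the /opt/collectd prefix is taken from the log line above:

        # Embedded-Perl development files must be visible to configure
        sudo apt-get install libperl-dev

        # Reconfigure from a clean tree and ask for the plugin explicitly
        make distclean
        ./configure --enable-perl
        grep -i perl config.log | tail -n 20     # shows why detection failed, if it still does

        make && sudo make install
        ls /opt/collectd/lib/collectd/perl.so    # should now exist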

    Read the article

  • Mac OS X file recovery

    - by Daniel
    I thought that all operating systems would merge folder contents when a folder is moved onto another folder of the same name. Imagine my surprise when that didn't happen, and I now have hundreds, if not thousands, of files that have gone missing and are nowhere to be found. Because they were not "deleted", they are not in the trash bin. I've tried to do some recovery using a program called Stellar Phoenix, but after about a 24-hour scan it didn't recognize any of the raw files (.dng, .arw) as image files, so I couldn't see whether they could be recovered. It also didn't show the directory structure, which would be handy. I tried a quick scan, but all it showed was files that were still on the HD; I'm not sure what the point of that is. I've used Recover 2000 on Windows and it does a good job. Does anyone know of anything that works quickly and reliably for this kind of file recovery? (I don't think I should have to do a sector-by-sector scan for this kind of file loss.)

    Read the article

  • Mimic the behavior of a machine added to a domain

    - by Ian
    Hello, for some reason the IT department at our company does not want to add Windows 7 and Windows Vista machines to the domain controller. I hate having to provide my network credentials every time I access a shared folder on a machine that is joined to the domain. I also hate having to provide my password whenever I launch Outlook or Visual Studio (Team Explorer). Is there a way to mimic the behavior of a machine that is added to a domain without actually joining the machine to the domain? For shares, I could create a batch file that runs NET USE against the different file servers we use here, but that is a huge security risk, as my password would sit in the file in plain text. Thanks!

    Read the article

  • How can I rewrite / redirect URL's in Glassfish V3?

    - by Jin Liew
    Hi, I'd like to simplify the URLs used to access a Glassfish V3 application by removing file extensions and otherwise shortening them. I've already set my application as the default application, so there is no need to include the context root in the URL. I'd like to:
    - Remove file extensions
    - Shorten the URLs of files deep in the folder structure
    I'd like to do this using pattern matching rather than on a per-file basis (the site is small at the moment but will change frequently and grow). Some examples of what I'd like to do:
    - foo.com/bar.html -> foo.com/bar
    - foo.com/folder1/folder2/bar2.html -> foo.com/bar2
    Any help would be greatly appreciated. Thanks. Cheers, Jin

    Read the article

  • sSMTP Unable to send message using external mail server SMTP

    - by OrangeGrover
    I'm trying to finish up my Nagios install by having it email me. It was emailing me using /bin/mail, so the messages always got sent to my spam folder. I installed sSMTP so that outgoing mail is relayed through my work's email server as an authenticated user. Here is my /etc/ssmtp/ssmtp.conf file:
        mailhub=10.200.120.148:25
        UseTLS=NO
        AuthUser=[email protected]
        AuthPass=PASSWORD
    So far I've been using the following command, and the message still arrives in my inbox as root@localhost, which causes it to go to my spam folder (with the exception of one email provider I have):
        cat message | ssmtp [email protected]
    I've looked at a few examples online, and they all seem to do pretty much the same as me. Does anybody see any mistakes that I'm making? Just to clarify, [email protected] is a user on the mail server that my work uses.
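    A hedged suggestion for the root@localhost sender (the addresses below reuse the question's placeholders): ssmtp can rewrite the sender either through /etc/ssmtp/revaliases or by honouring a From: header in the message when FromLineOverride is enabled.

        # Additions to /etc/ssmtp/ssmtp.conf
        rewriteDomain=company.com
        FromLineOverride=YES

        # /etc/ssmtp/revaliases -- map local accounts to a real sender and mailhub
        root:[email protected]:10.200.120.148:25
        nagios:[email protected]:10.200.120.148:25

        # Send with explicit headers so the From: line is used
        printf 'To: [email protected]\nFrom: Nagios <[email protected]>\nSubject: test\n\nbody text\n' \
            | ssmtp [email protected]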

    Read the article

  • Help with a simple incremental backup script

    - by Evan
    I'd like to run the following incomplete script weekly as a cron job to back up my home directory to an external drive mounted as /mnt/backups:
        #!/bin/bash
        #
        TIMEDATE=$(date +%b-%d-%Y-%k:%M)
        LASTBACKUP=pathToDirWithLastBackup
        rsync -avr --numeric-ids --link-dest=$LASTBACKUP /home/myfiles /mnt/backups/myfiles$TIMEDATE
    My first question is: how do I correctly set LASTBACKUP to the most recently created directory in /mnt/backups? Secondly, I'm under the impression that using --link-dest means that files in previous backups will not be copied again in later backups if they still exist, but will instead link back to the originally copied files? However, I don't want to retain old files forever. What would be the best way to remove all the backups before a certain date without losing files that may be linked into those backups by current backups? Basically, I'm looking to merge all the backups before a certain date into one, if that makes more sense than the way I initially framed the question :). Can --link-dest create hard links, and if so, would just deleting previous directories not actually remove the linked files? Finally, I'd like to add a line to my script that compresses each newly created backup folder (/mnt/backups/myfiles$TIMEDATE). Based on reading this question, I was wondering if I could just run gzip --rsyncable /backups/myfiles$TIMEDATE after rsync, so that sequential rsync --link-dest executions would find the already copied and compressed files? I know that's a lot, so many thanks in advance for your help!!
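    For the first question, one hedged way to pick the newest backup as the link reference (same layout as the script above; skipping --link-dest on the very first run is an assumption about the desired behaviour):

        #!/bin/bash
        TIMEDATE=$(date +%b-%d-%Y-%k:%M)
        # Most recently modified myfiles* directory under /mnt/backups
        LASTBACKUP=$(ls -1dt /mnt/backups/myfiles* 2>/dev/null | head -n 1)
        if [ -n "$LASTBACKUP" ]; then
            rsync -avr --numeric-ids --link-dest="$LASTBACKUP" /home/myfiles "/mnt/backups/myfiles$TIMEDATE"
        else
            rsync -avr --numeric-ids /home/myfiles "/mnt/backups/myfiles$TIMEDATE"
        fi

    On the other questions: --link-dest does create hard links, so deleting an older backup directory only removes its directory entries; any file still referenced from a newer backup keeps its data. And gzip compresses individual files rather than directories, so compressing a whole backup folder would need tar, which would also break the hard-link sharing between runs.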

    Read the article

  • Problems with mailenable when sending to yahoo mail

    - by Mee
    I'm testing sending emails from MailEnable webmail. I have no problems sending mail to Gmail or Hotmail; both work fine. But Yahoo Mail sends my messages to the spam folder and shows the attachment icon for the message, even though the message doesn't contain any attachments; it's just plain text. It only includes a reply to a previous message, like this:
        message text
        ----- Original Message -----
        original message text
    I copied the message content and sent it from Gmail to Yahoo, and the attachment icon didn't show, which makes me believe it's something to do with MailEnable. What could possibly be wrong? Also, is there a whitelist for Yahoo Mail that I can join? And for other popular webmail providers? I'm going to use this on a production website (site visitors use the contact-us form to send messages to the site; the MailEnable server runs on the same machine as the web server; I then check the messages using MailEnable webmail and reply to them). This is really important to me; your help would be really appreciated.
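    One hedged thing to verify, since Yahoo weighs sender reputation heavily: whether the sending domain publishes an SPF record that covers the server's IP. The domain and address below are placeholders, not values from the question:

        # From any machine with dig installed, check what SPF (TXT) records exist
        dig +short TXT example.com

        # A minimal record authorizing the web/mail server's own IP would look like:
        #   example.com.  IN TXT  "v=spf1 a mx ip4:203.0.113.10 ~all"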

    Read the article

  • Nautilus crashes after Ubuntu Tweak Package Cleaner [fixed]

    - by Ka7anax
    A few days ago I started having problems with Nautilus. Basically, when I try to open a folder it crashes. It doesn't happen every time, but in about 85% of cases it does... Sometimes after the crash all my desktop icons are gone as well. The only thing that I think could have caused this is Ubuntu Tweak; I'm not sure, but the issues started after I ran the Package Cleaner in Ubuntu Tweak... Any ideas?
    EDIT 2 (IMPORTANT): It seems I fixed this problem by doing the following: 1) I uninstalled this Nautilus script: http://mundogeek.net/nautilus-scripts/#nautilus-send-gmail 2) I installed nautilus-elementary. So far it is back to normal... If anything bad happens again I will come back!
    EDIT 1: The first time, after running the command (nautilus --quit; nautilus --no-desktop) three times, the whole system crashed (except the mouse; I could still move the pointer). After a restart I ran it again and got this:
        Initializing nautilus-gdu extension
        Initializing nautilus-dropbox 0.6.7
        (nautilus:2966): GConf-CRITICAL **: gconf_value_free: assertion `value != NULL' failed
        (nautilus:2966): GConf-CRITICAL **: gconf_value_free: assertion `value != NULL' failed
        Nautilus-Share-Message: Called "net usershare info" but it failed: 'net usershare' returned error 255: net usershare: cannot open usershare directory /var/lib/samba/usershares. Error No such file or directory
        Please ask your system administrator to enable user sharing.
    and then this:
        cristi@cris-laptop:~$ nautilus --quit; nautilus --no-desktop
        (nautilus:3810): Unique-DBus-WARNING **: Error while sending message: Did not receive a reply. Possible causes include: the remote application did not send a reply, the message bus security policy blocked the reply, the reply timeout expired, or the network connection was broken.

    Read the article

  • Unattended Windows XP Install Stops at Deleting Previous Installation

    - by maik
    I'm not sure if I'm just not asking Google properly or what, but I can't come up with a good answer to this problem. We have MDT 2010 set up, with a task sequence for refreshing Windows XP machines. It doesn't seem to happen all the time, but frequently when we start a refresh it goes through the normal motions and then, in the first part of Windows XP setup (the blue screen), it stops, telling me a Windows installation already exists at that location and that I can press L to continue, erasing everything and using that folder. I've pored over the unattend file and can't find an option that will just delete the old files and keep going, so I'm at a loss. Any ideas?

    Read the article

  • Microsoft VirtualPC installation

    - by Sergey Osypchuk
    I am trying to run an old Win16 application. I am running Windows 7 x64 SP1. I downloaded Virtual PC from http://www.microsoft.com/windows/virtual-pc/download.aspx (Step 2 and Step 3). During installation of Windows Virtual PC I get an error in the event log: {Cannot install Windows update because of error} 2149842967 "" (Command Line: ""C:\Windows\system32\wusa.exe" "C:\Users\Sergey\Downloads\Windows6.1-KB958559-x64-RefreshPkg (1).msu" The text in { } is an approximate translation from Russian to English. When I try to run "Windows XP Mode", it says: Cannot launch main process Windows Virtual PC. When I click "Windows Virtual PC" it shows an empty folder. Any ideas?

    Read the article

  • .htaccess - permissions forbidden

    - by user1732521
    I have an error with a new virtual host that I can't figure out: Apache cannot read my .htaccess and returns a 403.
        [Thu Oct 31 17:51:01 2013] [crit] [client ] (13)Permission denied: /srv/data_disk/www/site.dev/.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable
    I have set the permissions on the complete htdocs folder to 755, owned by my regular user with the group www-data. I have other vhosts set up with the same user and lesser permissions (rw-rw----) on the .htaccess, and those virtual hosts are configured the same way, as far as I can tell. Thanks!
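    A hedged way to narrow this down: every directory leading to the file, not only the vhost root, must be executable (searchable) by the Apache user, and the .htaccess itself must be readable by it. The paths come from the error above; the chmod lines are only an example fix:

        # List owner and permissions of every component on the path
        namei -l /srv/data_disk/www/site.dev/.htaccess

        # If a parent directory turns out to be the blocker:
        sudo chmod o+x /srv/data_disk /srv/data_disk/www
        sudo chmod 644 /srv/data_disk/www/site.dev/.htaccess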

    Read the article

  • Unable to SSH to EC2

    - by Walker
    I downloaded the cert-xxx.pem and pk-xxx.pem files, and also the keypair.pem, and moved them all to the .ssh folder on my Ubuntu client machine. This is what I get when I try to SSH with -v at the end:
        debug1: SSH2_MSG_SERVICE_ACCEPT received
        debug1: Authentications that can continue: publickey
        debug1: Next authentication method: publickey
        debug1: Trying private key: /root/.ssh/identity
        debug1: Trying private key: /root/.ssh/id_rsa
        debug1: Trying private key: /root/.ssh/id_dsa
        debug1: No more authentication methods to try.
        Permission denied (publickey).
    I am new to administering servers and I want to know if I should be trying to convert the pem files to id_rsa and id_dsa. I am not really sure whether that is possible, but I don't know how else to get id_rsa and id_dsa from those pem files, or whether there is a workaround. I managed to get access to EC2 the first time; this is my second try and I am unsuccessful so far. Any help is appreciated. Regards, Walker
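    Worth noting: the cert-xxx.pem / pk-xxx.pem pair are EC2 API credentials and play no role in SSH; only the key-pair .pem matters here, and ssh has to be told about it explicitly since it is not named id_rsa. A sketch, where the key file name, login user and host are placeholders (Ubuntu AMIs usually expect the "ubuntu" user, others "ec2-user" or "root"):

        chmod 400 ~/.ssh/keypair.pem
        ssh -v -i ~/.ssh/keypair.pem ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com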

    Read the article

  • Limit vsftpd upload to a given set of file names

    - by Chen Levy
    I need to configure an anonymous FTP server with upload. Given this requirement, I am trying to lock the server down to the bare minimum. One of the restrictions I wish to impose is to allow uploads only for a given set of file names. I tried removing write permission from the upload folder and placing in it some empty, writable files:
        /var/ftp/              [root.root] [drwxr-xr-x]
        |-- upload/            [root.root] [drwxr-xr-x]
        |   |-- upfile1        [ftp.ftp]   [--w-------]
        |   `-- upfile2        [ftp.ftp]   [--w-------]
        `-- download/          [root.root] [drwxr-xr-x]
            `-- ...
    But this approach didn't work: when I tried to upload upfile1, the upload tried to delete the existing file and create a new one in its place, and there is no permission for that. Is there a way to make this work, or perhaps a different approach, like abusing the deny_file option?

    Read the article
