Search Results

Search found 11618 results on 465 pages for 'shared storage'.


  • ubuntu input/output error

    - by rplevy
    I'm having a problem with Ubuntu that I'm finding hard to troubleshoot, for reasons that will become clear:

        reboot
        -bash: /sbin/reboot: Input/output error
        dmesg
        -bash: /bin/dmesg: Input/output error
        ps -e
        ps: error while loading shared libraries: /lib/libproc-3.2.8.so: cannot read file data: Input/output error
        lsof
        -bash: /usr/bin/lsof: Input/output error
        fsck
        -bash: /sbin/fsck: Input/output error
        badblocks
        -bash: /sbin/badblocks: Input/output error

    So I can't see what is going on, and I can't remotely reboot. What can I do to get to the bottom of this? Interestingly:

        init 0
        Segmentation fault

    I can cat /var/syslog but not /var/log/messages or several other important files. less and more don't work, and neither do tail, head, etc.
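
    Since bash builtins such as echo still work even when binaries on the failing disk can't be read, one last-resort way to reboot remotely is the magic SysRq interface. A minimal sketch, assuming /proc is mounted, SysRq support is compiled into the kernel, and you have a root shell:

        # echo is a bash builtin, so nothing needs to be read from the bad disk
        echo 1 > /proc/sys/kernel/sysrq
        # s = sync what can still be synced, b = reboot immediately
        echo s > /proc/sysrq-trigger
        echo b > /proc/sysrq-trigger

    Symptoms like these (every binary unreadable, builtins fine) usually mean the disk or controller is failing, so the post-reboot plan should be fsck from rescue media and a look at the drive's SMART status.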


  • Power User - archive Outlook mail items into SQL Server

    - by marc_s
    I am looking for (and so far not finding) a solution to archive e-mail items from my Outlook into SQL Server. My PST is beginning to get really, really big, and I'd love to extract my older e-mail into SQL Server in a way that lets me still easily find mails if needed. I would prefer SQL Server as the storage medium since I'm familiar with it, and it's rock solid - I don't want to have a collection of PST files or CHM files or anything like that. Does anyone know of such a solution? I'm a power/home user - I can't afford $5,000 enterprise licenses - I need a sub-$100 solution for private use.
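
    I haven't seen a packaged product in that price range, but Outlook's COM automation interface makes a home-grown archiver fairly small. A rough sketch in Python, assuming the pywin32 and pyodbc packages, a local SQL Server database called MailArchive, and a pre-created table mail_archive(received, sender, subject, body) - all names here are illustrative:

        import win32com.client
        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={SQL Server};SERVER=localhost;"
            "DATABASE=MailArchive;Trusted_Connection=yes")
        cur = conn.cursor()

        outlook = win32com.client.Dispatch("Outlook.Application")
        inbox = outlook.GetNamespace("MAPI").GetDefaultFolder(6)  # 6 = olFolderInbox

        for msg in inbox.Items:
            # meeting requests etc. lack mail properties, so keep only mail items
            if getattr(msg, "Class", None) != 43:  # 43 = olMail
                continue
            cur.execute(
                "INSERT INTO mail_archive (received, sender, subject, body) "
                "VALUES (?, ?, ?, ?)",
                str(msg.ReceivedTime), msg.SenderName, msg.Subject, msg.Body)

        conn.commit()

    Deleting the archived items afterwards is what actually shrinks the PST, so it's worth running a read-only pass like this first.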


  • Windows 7 can't make a backup

    - by J. Pablo Fernández
    I've been trying to get Windows 7 to make a backup for a week or so. I'm backing up to a local NAS and I'm getting this error:

        Windows Backup: Troubleshooting Options
        Check your backup
        Windows Backup could not create a zip file. This could be because the drive that Windows is installed on does not have enough space or it could be a temporary error. Make sure you have at least 400 MB of free space and try again.
        Backup time: 2009-09-07 14:48
        Backup location: \\VANGELIS\Shared\Backup\lennon\
        Error code: 0x81000015

    My local hard disk has 290GB free and my NAS has 200GB free. Any ideas what might be wrong?
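
    The 400 MB warning refers to space Windows Backup needs on a local volume for staging and shadow copies, and that check can fail on a small hidden volume (e.g. a System Reserved partition) even when C: has plenty of room. A couple of hedged diagnostic commands from an elevated prompt:

        vssadmin list volumes
        vssadmin list shadowstorage

    If a tiny volume shows up nearly full, resizing its shadow storage (vssadmin resize shadowstorage) is a common fix.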


  • SQL 2008 R2: Data\Log partitions

    - by Reese Hirth
    I have a SQL Server setup that a previous IT person built with a 2TB data partition and a 1TB log partition. The OS partition is 244GB, and SQL is installed on a separate 1TB partition. We have an additional 8TB of storage that I would like the new IT staff to bring online. He wants to create 4 new 2TB data partitions. I see this as confusing. Can't we just back up the current data partition, blow it away, and create a new 10TB data partition? I'm responsible for administering the data on the server but am not allowed to do the setup myself. This is a GIS server running ArcGIS Server with around 60 geodatabases ranging from 20GB to a couple that may grow to over a TB. So: five 2TB data partitions, or one 10TB partition? Thanks for the advice.
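
    Either layout can work, because SQL Server can spread one database over several files and stripe across them; with four 2TB partitions you would simply add one data file per partition to the filegroup, which also avoids putting all 10TB behind a single volume. A hedged T-SQL sketch, with database, file, and path names purely illustrative:

        ALTER DATABASE MyGisDb
        ADD FILE (NAME = MyGisDb_data2,
                  FILENAME = 'F:\SQLData\MyGisDb_data2.ndf',
                  SIZE = 100GB, FILEGROWTH = 10GB)
        TO FILEGROUP [PRIMARY];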


  • Unable to connect to another computer from the Task Scheduler on Windows 7

    - by Clem
    I am getting the following error when trying to connect to another computer from the Task Scheduler on Windows 7: "The remote computer was not found." The computer that I am trying to connect to is definitely on the network, as I can ping it and browse its shared folders in Windows Explorer. Note that I get the same error message when trying to perform the same operation from Performance Monitor. This suggests that I need to do something to enable remote connections to the Task Scheduler. I am not very experienced with Windows administration and I am not sure where to look. To give a bit more context, I want to use the Task Scheduler to automatically start Perfmon on a few machines at my company, and I'd like to set up the Task Scheduler remotely. Does anyone know what I need to do?
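
    Remote Task Scheduler (and Performance Monitor) management rides on RPC, which Windows Firewall blocks by default on the target machine, so the usual first step is enabling the corresponding inbound rule group there. A sketch, assuming Windows 7 / Server 2008 R2 and admin rights on the remote box (the task name and command are illustrative):

        :: on each remote machine, from an elevated prompt
        netsh advfirewall firewall set rule group="Remote Scheduled Tasks Management" new enable=yes

        :: tasks can then also be created remotely without the GUI
        schtasks /create /s REMOTEPC /u DOMAIN\admin /p * /tn "StartPerfMon" /tr "logman start MyCounterSet" /sc onstart

    For Performance Monitor there is a matching "Performance Logs and Alerts" firewall rule group.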


  • A network share folder is invisible to users

    - by Myrddin Emrys
    I have a network share folder whose permissions I was recently cleaning up. I removed the four individual names from the access permissions on the folder, added a new security group (Universal) with standard Read/Write permissions to that folder, and then added those 4 people to the group. However... now nobody can see the folder. The users can see the other 9 folders in that shared drive, but the 10th is missing. I cannot see any security permission in the parent folder or in the folder itself that would cause it to be invisible to anyone, regardless of whether they have permission to open it or edit files within.
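
    Two things commonly combine to cause exactly this: with Access-Based Enumeration enabled on the share, a folder is hidden from anyone whose token doesn't grant read access, and group membership only enters a user's token at logon - so the four users won't "have" the new group until they log off and back on. A hedged way to inspect and re-grant from the file server, with an illustrative path and group name:

        :: show the effective ACL on the hidden folder
        icacls "D:\Shares\Dept\Folder10"

        :: grant the new group read/execute, inherited by subfolders and files
        icacls "D:\Shares\Dept\Folder10" /grant "DOMAIN\FolderUsers:(OI)(CI)RX"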


  • Which AMI should I use as a base for a Django application?

    - by Edan Maor
    I'm starting development of a Django application on Amazon Web Services, and I'm looking to build an instance that will serve the Django projects. I don't have much experience with such things, having only used a shared host (WebFaction) before. So I'm wondering: which AMI should I use as a base? I'm assuming I want an Ubuntu AMI, possibly with certain things like Apache pre-installed? One minor point: I'm planning to serve several different Django projects from the same instance. I use virtualenv on my dev machine right now to separate the different projects, and I'm assuming I'll do the same on EC2. Thanks!
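
    A plain official Ubuntu AMI is usually the safest base: installing Apache and friends yourself takes minutes and leaves you knowing exactly what's on the box. A minimal provisioning sketch, assuming an Ubuntu instance of that era and mod_wsgi-based serving (package names may differ on newer releases):

        sudo apt-get update
        sudo apt-get install -y apache2 libapache2-mod-wsgi python-virtualenv
        # one virtualenv per project, mirroring the dev-machine layout
        virtualenv /srv/project1/env
        /srv/project1/env/bin/pip install django

    Each project then gets its own WSGIScriptAlias (or its own VirtualHost) pointing at its own virtualenv.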


  • Why does DBAN crash on my HDDs?

    - by John Watson
    I am using DBAN to erase an HDD. DBAN is loaded from a CD, and the BIOS boot order has been set to favour the CD drive. On starting the laptop, the system boots from the CD and the DBAN interface can be seen. DBAN detects two storage devices, the HDD and the SD card. My HDD is 320GB, but DBAN says 298GB. It erases the SD card, but when I try to erase the HDD, it gives the following error:

        DBAN finished with non-fatal errors.
        *ERROR /dev/sdb (process crash)
        *ERROR /dev/sda (process crash)
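
    The 320GB vs. 298GB discrepancy is just decimal gigabytes vs. binary gibibytes, not a fault. If DBAN's wipe process keeps crashing, one fallback is booting any Linux live CD and overwriting the disk directly; a sketch, assuming the internal disk really is /dev/sda - verify the device name first, as this is irreversibly destructive:

        # confirm which device is the internal HDD (size, model)
        sudo fdisk -l
        # single zero-fill pass over the whole disk
        sudo dd if=/dev/zero of=/dev/sda bs=1M
        # or multiple random passes with shred
        sudo shred -v -n 3 /dev/sda

    If dd also aborts with I/O errors partway through, the drive itself is likely failing.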


  • Starting with administering a linux box

    - by Josh K
    I'm rather new to this, and I'd like to play with administering a Linux box. Things I need to know how to do:

        - Set up subdomains
        - Set up FTP accounts
        - Set up full domains / add domains
        - MySQL setup / install / management
        - LAMP setup / install / management

    This is probably going on a CentOS distro. I'd like links, or a breakdown of how I can learn to do this. I am comfortable with a command line, but I'm trying to move from shared hosting to a VPS and would like to have some idea of how deep the water is before I do.
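
    For the LAMP and MySQL items, the base install on CentOS is short; domains and subdomains are then just Apache VirtualHost blocks, and FTP accounts are system users under a server like vsftpd. A starting sketch, assuming CentOS 5/6 package names:

        sudo yum install -y httpd mysql-server php php-mysql vsftpd
        sudo service httpd start
        sudo service mysqld start
        # secure the fresh MySQL install (root password, remove test db)
        sudo mysql_secure_installation
        # each (sub)domain: a <VirtualHost> file dropped into /etc/httpd/conf.d/

    A control panel such as Webmin can also cover most of this list, if you'd rather learn the concepts before the raw config files.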


  • Authorization error when testing FTP to UNC

    - by user64204
    We have a Windows Server 2008 R2 machine with Active Directory (hereafter called DC) running as a domain controller, on which we have IIS and an FTP site installed. We have a second Server 2008 machine (hereafter called SHARE) which is joined to that domain and has a disk shared as a network share (\\share\Office). That network share is used as the FTP site's physical path on DC. We've tested the FTP from the IIS FTP configuration panel, by clicking on Basic Settings... then Test Settings.... When setting Administrator as the username with the Connect as... option, everything is fine. When no user is provided, we get an authorization error instead. Q1: Could someone explain in more understandable terms what is written in the Details text area of that error?


  • I've created a software RAID on SUSE EL 10 and now I need to monitor it

    - by Thomas B.
    I have created a software RAID using the yast2 GUI on SUSE ES 10/11. The RAID works great, and it's a RAID 5. I have 5 drives; they are cheap 2TB cases that each hold two 1TB drives (Serial ATA), and I connect them via eSATA to the motherboard. The problem I have, as this is "cheap" storage, is that when one of the 5 drives goes out on the RAID, I seem to have no logs of any issues, and it gets harder and harder to write to it until it dies. I use Samba to share the 4TB partition with the PCs in my home on a gigabit network. My question is this: are there any good (free) tools in Linux to monitor a RAID, or the drives in the RAID, to detect any problems? I haven't found any yet and was just wondering if some exist.
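
    Assuming YaST built the array on Linux md (which it does for software RAID), mdadm itself has a monitor mode that can alert on failed or degraded events, and smartmontools watches the individual drives. A sketch, with /dev/md0 and the mail address as illustrative values:

        # array health at a glance
        cat /proc/mdstat
        mdadm --detail /dev/md0
        # daemonized monitor that mails when a drive drops out
        mdadm --monitor --scan --daemonise --mail=root@localhost
        # per-drive SMART status (smartmontools package)
        smartctl -H /dev/sda

    The silent degradation you describe is exactly what --monitor is for; without it, md quietly keeps running on the remaining drives.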


  • Fabric and Cygwin don't work with Windows UNC paths

    - by tcoopman
    I have some strange problems with Fabric deployment to Windows Server 2008 R2. The thing I'm trying to accomplish is to copy some files to a shared folder with a Fabric script (the script does a lot of other things too, but only this step gives me problems). This is the problem: when I try to access a UNC (Universal Naming Convention) path, I always get "access denied" kinds of answers if I run the script through Fabric. When I run the same command in an SSH prompt (same user), it works fine. Examples:

        cmd: robocopy f:/.... //share
        result: in ssh this works fine; in fabric I get "Logon failure: the user has not been granted the requested logon type at this computer."

        cmd: cd //share
        result: in ssh this works fine; in fabric I get "//share: Not a directory"

    Further information: uname -a and whoami return exactly the same thing in Fabric and SSH. I also tried things like mount and net use, but these commands all have the same kind of problem.
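
    Even with the same user, the Windows logon session behind Fabric's SSH channel may carry no network credentials to present to the second machine (the classic double-hop problem), while your interactive session does. One workaround is authenticating to the share explicitly inside the task; a sketch assuming Fabric 1.x, with illustrative server, share, and account names:

        from fabric.api import run, task

        @task
        def push_files():
            # establish an authenticated session to the share first
            run(r'net use \\server\share Secret123 /user:DOMAIN\deploy')
            run(r'robocopy C:\build \\server\share\drop /E')
            # drop the session again
            run(r'net use \\server\share /delete')

    Checking the "Access this computer from the network" user right on the file server is the other direction to investigate, given the "requested logon type" wording.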


  • File upload drops for no reason

    - by sufoid
    Hello, I want to make a file upload. The script should take the image, resize it, and upload it. But it seems there is an error in the upload that I can't pin down. Here is the code:

        define ("MAX_SIZE","2000"); // maximum size for uploaded images
        define ("WIDTH","107");     // width of thumbnail
        define ("HEIGHT","107");    // alternative height of thumbnail (portrait 107x80)
        define ("WIDTH2","600");    // width of (compressed) photo
        define ("HEIGHT2","600");   // alternative height of (compressed) photo (portrait 600x450)

        if (isset($_POST['Submit'])) {
            // iterate through all upload fields
            foreach ($_FILES as $key => $value) {
                // read name of user-file
                $image = $_FILES[$key]['name'];
                // if it is not empty
                if ($image) {
                    // get original name of file from client's machine
                    $filename = stripslashes($_FILES[$key]['name']);
                    // get extension of file in lower case format
                    $extension = getExtension($filename);
                    $extension = strtolower($extension);
                    // if extension not known, output error; otherwise continue
                    if (($extension != "jpg") && ($extension != "jpeg") && ($extension != "png") && ($extension != "gif")) {
                        echo '<div class="failure">Fehler bei Datei '. $_FILES[$key]['name'] .': Unbekannter Dateityp: Es können nur Dateien vom Typ .gif, .jpg oder .png hochgeladen werden.</div>';
                    } else {
                        // get size of image in bytes
                        // $_FILES[$key]['tmp_name'] is the temporary filename under which
                        // the uploaded file was stored on the server
                        $size = getimagesize($_FILES[$key]['tmp_name']);
                        $sizekb = filesize($_FILES[$key]['tmp_name']);
                        // if image size exceeds defined maximum size, output error; otherwise continue
                        if ($sizekb > MAX_SIZE*1024) {
                            echo '<div class="failure">Fehler bei Datei '. $_FILES[$key]['name'] .': Die Datei konnte nicht hochgeladen werden: die Dateigröße überschreitet das Limit von 2MB.</div>';
                        } else {
                            $rand = md5(rand() * time());              // create random file name
                            $image_name = $rand.'.'.$extension;        // unique name (random number)
                            // new name contains full path of storage location (images folder)
                            $consname = "photos/".$image_name;         // path to big image
                            $consname2 = "photos/thumbs/".$image_name; // path to thumbnail
                            $copied = copy($_FILES[$key]['tmp_name'], $consname);
                            $copied = copy($_FILES[$key]['tmp_name'], $consname2);
                            $sql = "INSERT INTO photos (galery_id, photo, thumb) VALUES (". $id .", '$consname', '$consname2')";
                            $query = mysql_query($sql) or die(mysql_error());
                            // if image hasn't been uploaded successfully, output error; otherwise continue
                            if (!$copied) {
                                echo '<div class="failure">Fehler bei Datei '. $_FILES[$key]['name'] .': Die Datei konnte nicht hochgeladen werden.</div>';
                            } else {
                                // path for thumbnail creation & storage
                                $thumb_name = $consname2;
                                // call to function: create thumbnail
                                // parameters: image name, thumbnail name, specified width and height
                                $thumb = make_thumb($consname,$thumb_name,WIDTH,HEIGHT);
                                $thumb = make_thumb($consname,$consname,WIDTH2,HEIGHT2);
                            }
                        }
                    }
                }
            }
            // current image could be uploaded successfully
            echo '<div class="success">'. $success .' Foto(s) erfolgreich hochgeladen!</div>';
            showForm(); // call to function: create upload form
        }
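
    If the upload "drops" silently for some files, the most common cause is PHP's own limits (upload_max_filesize, post_max_size) rejecting the file before this script ever sees it; the per-file error code says why. A hedged addition near the top of the loop:

        // surface PHP-level upload failures (size limits, partial uploads, ...)
        if ($_FILES[$key]['error'] !== UPLOAD_ERR_OK) {
            echo '<div class="failure">Upload error code: '
               . $_FILES[$key]['error'] . '</div>';
            continue;
        }

    Also note that $id and $success are never set in the code shown, so the INSERT and the success message can't work as written; presumably they're defined elsewhere.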


  • my sweet old VHS collection

    - by microspino
    What are the best procedure and digital format to resurrect my old VHS library in a way that I can watch it on my LCD TV? I have a not-so-big collection of about 100 VHS tapes. I have plenty of storage. I have a network media tank (an A-110 Popcorn Hour, but I can also purchase a new media center if needed). I have an old working VCR (but again, I can pick up a specific one new if you think that's better for preserving quality). The VHS cassette collection seems to have retained good quality over the years. Of course I have computers (both Mac and PC) to do the processing. Which software do I need, or am I missing? Please give me some advice.
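
    Whichever capture device you use, the usual pattern is to capture once at a high bitrate and then encode to H.264, which the Popcorn Hour plays natively in MP4/MKV containers. A hedged encode sketch with ffmpeg, filenames illustrative:

        ffmpeg -i capture.avi -c:v libx264 -preset slow -crf 20 -c:a aac -b:a 192k movie.mkv

    For VHS-resolution material there is little point going above DVD-ish resolution, so a CRF around 18-20 keeps files small without visible extra loss.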


  • Setting up NIS/NFS on Mac OS 10.6

    - by evan
    We have an Ubuntu NIS/NFS server at work, and we recently got a few new iMacs. Is there a way to set them up so they can use the Linux user accounts and mount the shared NFS files? Are there any guides on how to do this? I've been googling with no success. I tried getting NFS to work by connecting to the server via Disk Utility, but after I run 'sudo automount' from the command line and ls the directory I tried to mount it to (/Volumes/nfs), it gives a permissions error. If there isn't a way to do this, does anyone know of any not-too-complicated ways to share user accounts and files between Mac and Linux computers (and even, hypothetically, a Windows computer one day)? I know it's kind of a huge question, but I'll greatly appreciate any advice on the topic. Thanks!
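
    For the NFS half, Snow Leopard can mount a Linux export directly from the command line; the resvport option is often required because Linux NFS servers typically refuse clients that don't connect from a reserved source port. A sketch with an illustrative server name and export path:

        sudo mkdir -p /Volumes/nfs
        sudo mount -t nfs -o resvport nfsserver:/export/home /Volumes/nfs

    For the accounts half, Directory Utility in 10.6 can bind the Mac to a NIS domain (the NIS options live under its BSD flat file services pane), which would make the Linux users known to the iMacs.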


  • use network drives as mount points during installation?

    - by ajsie
    Is it possible to use network storage locations as mount points during installation? I want to separate the system (Ubuntu) from the data (personal files). E.g., if I have 5 computers, I don't want to recreate /home/david 5 times. So I want to mount networkdrive/home to /home on the local Ubuntu server, so ALL users' home folders could be used - and maybe also networkdrive/projects to /projects. That way it's OK if I accidentally repartition the local Ubuntu server, because the data is not on that server but on the data server. Is separating "data" from "logic" good in this case, and is it possible? What protocol should I use for the mapping over the Internet? (Maybe the server is in Sweden, and the data is in Norway.) Thanks.
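
    The installer itself generally can't place /home on network storage, but the standard pattern gives the same result after install: an NFS export mounted via /etc/fstab on every client machine. A sketch, with illustrative host and export names:

        # /etc/fstab on each Ubuntu machine
        dataserver:/export/home      /home      nfs  defaults  0  0
        dataserver:/export/projects  /projects  nfs  defaults  0  0

    On a LAN, NFS is the usual choice; across the Internet (Sweden to Norway), plain NFS is slow and unencrypted, so sshfs or NFS inside a VPN tunnel is the safer mapping.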


  • Permission forbidden on localhost with apache2

    - by N Alex
    Here is what I am trying to do: I tried to add another folder to Apache, and I get the following error when accessing testing/index.html. The idea is to have, for every customer, a folder like /home/neagoe/Work/InterWebs/Projects/[PROJECT NAME]/CustomerProjects/website/dist.

        Forbidden
        You don't have permission to access /index.html on this server.
        Apache/2.2.22 (Ubuntu) Server at testing Port 80

    Here are the steps that I followed:

    Step 1:

        sudo chmod a+x /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist

    Step 2:

        sudo chown -R www-data:www-data /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist
        sudo chmod -R 775 /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist

    Step 3:

        sudo adduser $USER www-data

    Step 4:

        sudo a2enmod userdir

    Step 5:

        sudo cp /etc/apache/sites-available/default /etc/apache/sites-available/testing

    I edited the file /etc/apache/sites-available/testing so it looks like this:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            ServerName testing
            DocumentRoot /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist
            <Directory />
                Options FollowSymLinks
                AllowOverride None
            </Directory>
            <Directory /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog ${APACHE_LOG_DIR}/error.log
            # Possible values include: debug, info, notice, warn, error, crit, alert, emerg.
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/access.log combined
        </VirtualHost>

    Step 6: I edited hosts ("/etc/hosts") so it looks like this:

        127.0.0.1 localhost
        127.0.0.1 testing
        # The following lines are desirable for IPv6 capable hosts
        ::1     ip6-localhost ip6-loopback
        fe00::0 ip6-localnet
        ff00::0 ip6-mcastprefix
        ff02::1 ip6-allnodes
        ff02::2 ip6-allrouters

    Step 7:

        sudo a2ensite testing
        sudo service apache2 restart

    I searched for about 2 hours on the Internet, but I can't figure out what went wrong; all the pages that I found follow the same steps described above. I know there are similar questions out there, but the usual answer is to change permissions on the directory, which I did in Step 2. I am sorry if this is really a duplicate, but I couldn't find the right answer. Thank you!

    PS: I asked this on AskUbuntu too but didn't get any answers, so I'm trying my luck here.

    Edit: There isn't much in the error log or the access log. In access.log:

        ::1 - - [10/Aug/2013:11:23:28 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)"
        ::1 - - [10/Aug/2013:11:23:29 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)"
        (the same internal dummy connection line repeats several more times)
        127.0.0.1 - - [10/Aug/2013:11:23:23 +0300] "POST /wordpress-testing/wp-cron.php?doing_wp_cron=1376123003.7026669979095458984375 HTTP/1.0" 200 705 "-" "WordPress/3.6; http://localhost/wordpress-testing"
        127.0.0.1 - - [10/Aug/2013:11:31:32 +0300] "GET /index.html HTTP/1.1" 200 485 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:23.0) Gecko/20100101 Firefox/23.0"

    The last line repeats for about 200 rows. In error.log:

    1. These lines repeat from time to time:

        PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20100525/msql.so' - /usr/lib/php5/20100525/msql.so: cannot open shared object file: No such file or directory in Unknown on line 0
        [Sat Aug 10 13:06:42 2013] [notice] Apache/2.2.22 (Ubuntu) PHP/5.4.9-4ubuntu2.2 configured -- resuming normal operations
        [Sat Aug 10 13:07:36 2013] [notice] caught SIGTERM, shutting down

    2. And this is the predominant error (hundreds of lines):

        [Sat Aug 10 13:07:40 2013] [error] [client 127.0.0.1] (13)Permission denied: access to /index.html denied
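
    The (13)Permission denied error usually points at a parent directory rather than dist itself: the www-data user needs execute (search) permission on every directory along the path, and home directories are often 750 or 700. A hedged check and fix:

        # show the permission bits for every component of the path
        namei -m /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist
        # grant traverse rights wherever they're missing, e.g. the home dir
        chmod o+x /home/neagoe

    (Also note that the group membership added in Step 3 only takes effect after logging out and back in.)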


  • Routing traffic to specific web sites through Ethernet, rest via wifi on Mac OS X 10.6?

    - by user32448
    Hi, I have two separate Internet connections connected to a Mac, and I'd like one of them (via Ethernet, eth0, gateway 192.168.2.1) to be used only for backing up to a remote online storage service, and the other (via AirPort, en1, gateway 192.168.1.1) for all other Internet traffic. I tried using "route" from the terminal as follows:

        sudo route add -host 98.207.226.113 -interface eth0

    (just to test against the site www.whatismyip.org, whose IP is 98.207.226.113, to see through which gateway the traffic is routed). I can see using netstat that the route is added:

        $ netstat -rn -f inet
        Routing tables

        Internet:
        Destination      Gateway        Flags   Refs   Use   Netif  Expire
        default          192.168.1.1    UGSc    49     0     en1
        98.207.226.113   192.168.2.1    UGSc    0      0     eth0

    However, the traffic in this case does NOT get routed properly through Ethernet, as if the routing definition I made is ignored. Any ideas? Thanks!
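
    One thing worth trying is specifying the next-hop gateway instead of the interface; -interface routes can misbehave when source-address selection still prefers the other link. A sketch (the IP is the test address from the question and may have changed since):

        sudo route delete -host 98.207.226.113
        sudo route add -host 98.207.226.113 192.168.2.1

    Also check the service order under System Preferences > Network: if AirPort sits above Ethernet, connections without an explicit route will keep going out en1 regardless.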


  • Windows Server 2008: Terminal Services / VDI

    - by JohnyD
    I have a Dell R710 with 72GB of memory running Hyper-V. Within Hyper-V I have a Windows 2008 (32-bit) VM running Terminal Services. How do I allocate memory so that any user who connects to this Terminal Server (from their thin client) is allocated 2GB (or whatever amount I choose) of memory? Currently I have provisioned the TS with 2GB of memory, but it seems that this is shared among all who connect. Please let me know if there is further information I can provide. Thank you. Update 1: What I'm looking to accomplish with this server is setting up a VDI to allow users to connect from thin clients within our network. They will also have to connect from outside our network via VPN, which is already in place. Am I able to set this up using Windows Server 2008 (not R2)? I ask because I have a 16-bit application which needs to be supported; unfortunately it's not a candidate as a RemoteApp.


  • Symmetrix gatekeepers on Solaris 10

    - by Milner
    I have some Solaris machines that are connected to EMC Symmetrix for SAN storage. Apparently the Symm has a gatekeeper device that is used with the symmetrix CLI. We don't need the CLI, but I have these gatekeeper devices that constantly fill /var/adm/messages and the like with corrupt label errors. Is there anything I can do (short of deleting the devices on machine start) to get rid of them? Or should I just try to get our SAN guy to get the installer for the CLI? These things are getting annoying, and the devfsadmd daemon keeps rediscovering them on boot.


  • Can I make the proxy settings invisible when I share my internet connection via wifi?

    - by Neil
    This is probably a long shot... I have an HTC Desire and frustratingly found out after I got it that it doesn't support network proxy settings. We have a wireless network at my office that uses a proxy. My desktop at work runs Ubuntu. I was wondering if the following setup would work:

        1. Plug a USB wireless adapter into the desktop, which has a working Internet connection through the proxy.
        2. Set up the wireless adapter as an ad-hoc network.
        3. Share the Internet connection over the ad-hoc network.
        4. Make the use of the proxy invisible to users of the shared network connection.
        5. Connect the Android phone to the ad-hoc wireless network and use the Internet connection.

    My question is this: is this possible, or should I give up now and not even try? I think I can handle steps 1, 2, 3 and 5; I just have no idea if step 4 even makes sense, let alone is possible. Thanks
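
    Step 4 is feasible by running a small intercepting proxy on the desktop that forwards everything upstream to the office proxy, so clients never need proxy settings at all. A hedged sketch with Squid (host names, ports, and the wireless interface name are illustrative; syntax is from the Squid 2.x era):

        # /etc/squid/squid.conf - additions
        http_port 3128 transparent
        cache_peer officeproxy.example.com parent 8080 0 no-query default
        never_direct allow all

        # redirect HTTP arriving from the ad-hoc network into Squid
        sudo iptables -t nat -A PREROUTING -i wlan0 -p tcp --dport 80 -j REDIRECT --to-port 3128

    This only intercepts plain HTTP cleanly; HTTPS and other protocols would still need explicit handling, which may matter for phone apps.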


  • Apache mpm-itk Performance

    - by Matt Beckman
    I manage a bunch of VPSs with memory ranging from 1GB to 8GB. Most of the websites are Joomla sites, and the servers must support multiple sites/users/SFTP. I use mpm-itk almost exclusively (mostly due to its convenience in these shared environments). However, I'm aware it isn't known for performance, so I need some advice on making it faster. Due to the lack of documentation when I first went the way of mpm-itk, I included only one setting in the config, and that was to limit each user to 50 clients (the rest I left at the defaults):

        <IfModule mpm_itk_module>
            MaxClientsVHost 50
        </IfModule>

    Are there any better alternatives available? Are there any settings supported in mpm-prefork or mpm-worker that are also supported in mpm-itk? Thanks!
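
    mpm-itk is built on top of prefork, so the standard prefork directives still apply to it, and they matter more than usual because each request is served by a full re-forked process. A starting point, with values that are purely illustrative and should be sized to each VPS's memory:

        # prefork settings govern the pool that mpm-itk forks from
        StartServers          5
        MinSpareServers       5
        MaxSpareServers      10
        MaxClients           75
        MaxRequestsPerChild 500

    Keeping KeepAlive short (or off) also tends to help, since each kept-alive connection pins one whole process for its duration.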


  • ERROR: 6553609 You are not authorized to perform the current operation

    - by Tim
    I'm trying to back up a subsite in my portal using smigrate. When logged into the server and running the following command:

        D:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\60\BIN\smigrate -w -f C:\CH_Test_04_backup.fwp -u domain\user -pw **

        20 Nov 2009 13:57:06 -0000 Site collection opened
        20 Nov 2009 13:57:07 -0000 Authenticating against the web server.
        20 Nov 2009 13:57:07 -0000 Already authenticated against the web server.
        20 Nov 2009 13:57:07 -0000 ERROR (possibly fatal): ERROR: 6553609 You are not authorized to perform the current operation.
        20 Nov 2009 13:57:07 -0000 Site collection closed

    I get an error (ERROR: 6553609 You are not authorized to perform the current operation) and the site does not get backed up. I've tried the solutions outlined in various KB articles but had no luck with them. Can anyone help? Regards, Tim


  • Backup Exec backup-to-disk folder creation - Access denied

    - by ewwhite
    I'm having a difficult time creating a backup-to-disk folder in Symantec Backup Exec 12.5 and Backup Exec 2010. The backend storage is a Nexenta/ZFS-based NAS filer sharing the volume via CIFS. I've also seen the issue on other *nix-based NAS devices. I've attempted mapping the drive, providing the full paths to the folder, etc. I can browse to the share just fine from within Windows, but Backup Exec fails to create the B2D folder with different variants of an "Unable to create new backup folder. Access denied." error. I've attempted creating service accounts in Backup Exec to handle the authentication, but nothing seems to work. What's the key to making this work?
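
    Backup Exec reaches B2D paths with the credentials of its Windows services rather than those of the interactively logged-in user, so browsing the share in Explorer proves little. A hedged first check is verifying the share with the account the services actually run under, then running those services as an account the NAS recognizes (server, share, and account names illustrative):

        :: test the share using the Backup Exec service account's credentials
        net use \\nexenta\b2d /user:DOMAIN\besvc *

    On the Nexenta side, mapping that same account to a user with full permissions on the CIFS share (or temporarily opening the share wide) helps isolate whether it's an authentication problem or an ACL problem.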


  • Why not install Msvcr71.dll into system32?

    - by hillu
    While looking for an authoritative source for the missing Msvcr71.dll that is needed by a few old applications, I stumbled across the MSDN article "Redistribution of the shared C runtime component in Visual C++". The advice given to developers is to drop the DLL into the application's directory instead of system32, since DLLs in this directory are considered before the system paths. What can or will go wrong if I (as an administrator, not a developer) take the lazy path and install Msvcr71.dll (and Msvcp71.dll while I'm at it) into the system32 directory of 32-bit Windows XP or Windows 7 systems, instead of putting a copy in each application's directory? Is there another good solution for providing the applications with the needed DLLs that doesn't involve copying stuff to the application directories?
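
    The main risk of the system32 route is that every application missing a local copy will bind to that single shared DLL, so one later upgrade, deletion, or malware-scanner quarantine breaks them all at once - exactly the DLL-hell situation the per-application advice avoids. To see at any time which processes have the DLL loaded:

        tasklist /m msvcr71.dll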

