Search Results

Search found 19157 results on 767 pages for 'shared folder'.

Page 368/767 | < Previous Page | 364 365 366 367 368 369 370 371 372 373 374 375  | Next Page >

  • Two way sync with rsync

    - by mwm
    I have a local folder a/ and a remote folder A/. I run something like this from a Makefile:

        get-music:
            rsync -avzru server:/media/10001/music/ /media/Incoming/music/
        put-music:
            rsync -avzru /media/Incoming/music/ server:/media/10001/music/
        sync-music: get-music put-music

    When I run "make sync-music" it first pulls all the diffs from the server to the local folder and then does the opposite, pushing all the local diffs to the server. This works very well as long as there are only updates or new files. If there are deletions, it doesn't do anything about them. rsync has --delete and --delete-after options that help accomplish what I want, but they don't work for a two-way sync. Deleting files on the server during a sync when the local copies have been removed works; but when, for some reason (explained below), I have files that exist locally but are no longer on the server because they were deleted elsewhere, I want them removed locally rather than copied back to the server (which is what happens now). The thing is, I have three machines in play: desktop, notebook and home server. So sometimes the server will have files that were deleted during a sync from the notebook, for example, and then, when I run a sync from my desktop (where those deleted files still exist), I want them to be deleted locally and not copied back up to the server. I guess this is only possible with a database and a record of operations :P Any simple solutions? Thank you.
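    For reference, a one-directional mirror that also propagates deletions (using the paths from the question) would look roughly like the line below. This is only a sketch: --delete removes files on the receiving side that are missing from the sending side, so each run still covers a single direction, which is exactly the limitation described in the question.

        rsync -avzru --delete server:/media/10001/music/ /media/Incoming/music/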

    Read the article

  • Write permission when mounting Windows shares from Ubuntu

    - by Ola Tuvesson
    I think I'm close to having my dev environment set up exactly the way I want, but one final snag remains. I'm running VirtualBox on a Windows 7 64-bit host, with my dev environment inside an Ubuntu 12.04 guest. I want to keep the files for my projects on the host filesystem - partly so I can access them when the Ubuntu guest is not running, but also so I can use Tortoise and other Windows-based tools (cough Photoshop), and it also eases my backup scheme somewhat. So I've got a folder "Rails" on my NTFS drive, which I've shared from the host with a user created specifically for the Ubuntu guest. The mount point has been set up and an entry added to fstab (cifs), using a credentials file and the options iocharset=utf8,file_mode=0777,dir_mode=0777. This mounts fine and my Ubuntu user has both read and write permissions to the contents, but when I try to start my Rails app I get permission errors on any files the app needs to write to (e.g. the log file). What gives?
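    For context, an fstab entry of the kind described usually looks something like the line below (the share address, mount point and credentials path are placeholders). Adding uid and gid maps ownership of the mounted files to the guest user, which some applications check in addition to the mode bits.

        //winhost/Rails  /home/devuser/Rails  cifs  credentials=/home/devuser/.smbcredentials,iocharset=utf8,uid=1000,gid=1000,file_mode=0777,dir_mode=0777  0  0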

    Read the article

  • VirtualBox won't use any kind of hardware acceleration

    - by burnersk
    I'm seeing a problem with VirtualBox's hardware acceleration functionality. My system configuration:

        MSI PH67A-C43 (B3) (BIOS: 2.70)
        Intel Core i5-2400
        Windows 7 Professional SP1 64-bit
        Oracle VirtualBox 4.1.10
        Oracle VirtualBox Extension Pack 4.1.10

    I can select the individual acceleration options such as PAE/NX, VT-x/AMD-V or Nested Paging in the VM settings, but when I start the VM the accelerations are shown as disabled in the CPU tooltip (right next to shared folders). Each acceleration reads "Disabled". Does this sound familiar to anybody? How can I solve it?

    Read the article

  • How to securely store and update backup on remote server via ssh/rsync

    - by Sergey P. aka azure
    I have about 200 GB of pictures (roughly 1 MB per file, 200k files) on my desktop. I have access (including root access) to a remote Linux server, and I want to keep an updateable backup of my pictures on that server. rsync seems to be the right tool for this kind of job. But other people also have access (including root access) to this server and I want to keep my pictures private. So the question is: what is the best way to keep private files on a remote "shared" Linux server securely?
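    A sketch of the plain rsync-over-ssh transfer being described (host, account and paths are placeholders). Note that ordinary permissions only keep other non-root users out; they do nothing against another root user on the shared machine, so genuinely private data on that box would need client-side encryption on top.

        rsync -az -e ssh ~/Pictures/ sergey@server:backups/pictures/
        ssh sergey@server 'chmod -R go-rwx backups/pictures'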

    Read the article

  • Sharepoint database connection issue after upgrade to SQL Server 2008 R2

    - by Neil Hoff
    I took a backup of all our SharePoint WSS 3.0 databases and restored them to a new Windows 2008 R2 server. The new SQL Server has the same name and IP address as the old one; the only difference between the two is that the new one runs SQL Server 2008 R2 and the old one ran SQL Server 2005. When I navigate to the SharePoint URL I get this error: "Cannot connect to the configuration database." I checked the logs at "%commonprogramfiles%/Microsoft Shared/web server extensions/12/Logs" and found this error: System.Data.SqlClient.SqlException: Login failed. The login is from an untrusted domain and cannot be used with Windows authentication. Any ideas?

    Read the article

  • ubuntu input/output error

    - by rplevy
    I'm having a problem with Ubuntu that I'm finding hard to troubleshoot, for reasons that will become clear:

        reboot
        -bash: /sbin/reboot: Input/output error
        dmesg
        -bash: /bin/dmesg: Input/output error
        ps -e
        ps: error while loading shared libraries: /lib/libproc-3.2.8.so: cannot read file data: Input/output error
        lsof
        -bash: /usr/bin/lsof: Input/output error
        fsck
        -bash: /sbin/fsck: Input/output error
        badblocks
        -bash: /sbin/badblocks: Input/output error

    So I can't see what is going on, and I can't remotely reboot. What can I do to get to the bottom of this? Interestingly:

        init 0
        Segmentation fault

    I can cat /var/syslog but not /var/log/messages or several other important files. less and more don't work, and neither do tail, head, etc.
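    Symptoms like these usually point at the root filesystem or the disk underneath it failing, so most of the real diagnosis tends to happen offline. A typical first pass from a rescue or live environment, assuming the root filesystem lives on /dev/sda1, might be:

        fsck -f /dev/sda1      # only with the filesystem unmounted
        smartctl -a /dev/sda   # SMART health report, from the smartmontools package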

    Read the article

  • Sharing folders with VirtualBox, Win7 Host and Ubuntu 9.10 Guest

    - by unknown (google)
    I have a development setup from this tutorial: http://www.sitepoint.com/blogs/2009/10/27/build-your-own-dev-server-with-virtualbox/ But what I can't figure out is how to share a folder on my Ubuntu virtualized machine with the host Win7. I want to use a Windows text editor to edit code that's on my Ubuntu server. I've tried using the Shared Folders setting, adding "/var/www" but it says that the path is not absolute. When I click on "other", it only allows me to browse folders on my Win7 host. Both the host and guest are 64-bit OS. Thanks in advance!
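    Worth noting as context: VirtualBox Shared Folders expose a host directory to the guest, not the other way around, which is why the dialog only offers Win7 folders to browse. A guest-side mount of a host share (hypothetical share name "www", Guest Additions assumed installed) typically looks like the line below; going the other direction, editing the guest's /var/www from Windows, generally means serving it out of the guest instead, e.g. via Samba.

        sudo mount -t vboxsf www /mnt/www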

    Read the article

  • wmpnetwork.exe service clogs CPU usage

    - by Brenton Taylor
    We have remote locations, each with two ASUS media extenders that stream from a computer with a shared Media Player library. Lately, several of these locations have had the "wmpnetwork.exe" service pegging the CPU at 100% usage. Killing the service only results in it starting back up, and so far the only temporary solution is to uninstall Media Player. A lot of these computers are also about 3-5 years old. Could it just be a case of outdated hardware not being able to do everything we ask of it? Edit: all are running Windows XP and Windows Media Player 11.

    Read the article

  • Global resources can't be resolved after publishing Website in VS2008

    - by Scoregraphic
    Hi there. I have a web project running in VS 2008. We have some global resource files (*.resx) in the App_GlobalResources folder for internationalisation. All this works like a charm on my local IIS installation out of VS. But when I publish the web project to the local filesystem and/or another server, none of the resources can be found any more, so I guess the pre-compilation is somehow corrupting things. When I call the pre-compiled web app, I get an error that the resource object with key xyz cannot be found, although it could be found before. I checked with .NET Reflector whether the resource data made it into the DLLs; all the identifiers are there (bin/Web.dll, bin/<culture>/Web.resources.dll). The identifiers are referenced like this:

        <asp:MenuItem NavigateUrl="~/OrderNew.aspx" Text="<%$ Resources:MyProject, MenuNewOrder %>" Value="NewOrder">

    The resource files are called MyProject.resx and MyProject.<culture>.resx, where <culture> corresponds to the specific culture (i.e. MyProject.de-DE.resx). Any ideas how to solve this? I really appreciate any help. Thanks. Edit: If I copy the App_GlobalResources folder manually to the output, the resources are loaded normally. So I really wonder what this pre-compilation is all about. I'm still interested in solving the issue "the right way".

    Read the article

  • NSString writeToFile operation couldn't be completed

    - by Chonch
    Hey, I have an XML file in my application bundle. I want to copy it to the Documents folder at installation and then, every time the app launches, get the newest version of this file from the Internet. I use this code:

        // Check if the file exists in the documents folder
        NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        if (![[NSFileManager defaultManager] fileExistsAtPath:[documentsPath stringByAppendingPathComponent:@"fileName.xml"]])
            // If not, copy it there (from the bundle)
            [[NSFileManager defaultManager] copyItemAtPath:[[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"OriginalFile.xml"]
                                                    toPath:[documentsPath stringByAppendingPathComponent:@"fileName.xml"]
                                                     error:nil];

        // Get the newest version of the file from the server
        NSURL *url = [[NSURL alloc] initWithString:@"http://www.sitename.com/webservice.asmx/webserviceName"];
        NSString *results = [[NSString alloc] initWithContentsOfURL:url];

        // Replace the current version with the newest one, only if it is valid
        if (results != nil)
            [results writeToFile:[documentsPath stringByAppendingPathComponent:@"fileName.xml"]
                      atomically:NO
                        encoding:NSStringEncodingConversionAllowLossy
                           error:nil];

    The problem is that the writeToFile command always returns NO and the file's contents remain identical to the original file I included in my app bundle. I checked the value of results and it's correct. I also made sure that the app does reach the writeToFile call, but still, it always returns NO. Can anybody tell me what I'm doing wrong? Thanks,

    Read the article

  • Bug: files uploaded via desktop or web client have hidden tag when listed via API

    - by Jon Webb
    Files uploaded to Google Drive sometimes incorrectly have a hidden tag when listed via the Document List v3 REST API:

        <category scheme='http://schemas.google.com/g/2005/labels' term='http://schemas.google.com/g/2005/labels#hidden' label='hidden'/>

    This happens if:
    - a subfolder is created via the Google Drive desktop client and files are copied in, or
    - a folder is uploaded via the Google Drive web client. The folder does not have the hidden tag, but the files that were uploaded do.

    The files do not have this tag if:
    - they are individually uploaded via the Google Drive web client to the subfolder, or
    - they are uploaded via the REST API to the subfolder, or
    - they are uploaded via the desktop client to the My Drive root.

    The files and folders show up in Google Drive whether they have the hidden tag or not. We're using the API with the following scope:

        https://docs.google.com/feeds/
        https://spreadsheets.google.com/feeds/
        https://docs.googleusercontent.com/

    I have verified and can recreate this with the OAuth 2.0 playground. Google Drive desktop client version 1.3.3209.2600 on Win7 32-bit. I guess these must be bugs in the API...

    Read the article

  • Unable to connect to another computer from the Task Scheduler on Windows 7

    - by Clem
    I am getting the following error when trying to connect to another computer from the Task Scheduler on Windows 7: "The remote computer was not found." The computer that I am trying to connect to is definitely on the network, as I can ping it and browse its shared folders in Windows Explorer. Note that I get the same error message when trying to perform the same operation from Performance Monitor. This suggests that I need to do something to enable remote connections to the Task Scheduler. I am not very experienced with Windows administration and I am not sure where to look. To give a bit more context, I want to use the Task Scheduler to automatically start Perfmon on a few machines at my company, and I'd like to set up the Task Scheduler remotely. Does anyone know what I need to do?
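    One quick way to check whether remote Task Scheduler access works at all, independently of the MMC snap-in, is the command-line client; machine and account names below are placeholders, and the command will prompt for the password.

        schtasks /query /s TARGETPC /u MYDOMAIN\admin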

    Read the article

  • Windows 7 can't make a backup

    - by J. Pablo Fernández
    I've been trying to get Windows 7 to make a backup for a week or so. I'm backing up to a local NAS and I'm getting this error:

        Windows Backup: Troubleshooting Options
        Check your backup
        Windows Backup could not create a zip file. This could be because the drive that Windows is installed on does not have enough space or it could be a temporary error. Make sure you have at least 400 MB of free space and try again.
        Backup time: 2009-09-07 14:48
        Backup location: \\VANGELIS\Shared\Backup\lennon\
        Error code: 0x81000015

    My local hard disk has 290 GB free and my NAS has 200 GB free. Any ideas what might be wrong?

    Read the article

  • MSI Installer start auto-repair when service starts

    - by Josh Clark
    I have a WiX-based MSI that installs a service and some shortcuts (along with lots of other files that don't get shortcuts). The shortcut is created as described in the WiX docs, with a registry key under HKCU as the key file. This is an all-users install, but to get past ICE38 this registry key has to be under the current user. When the service starts (it runs under the SYSTEM account) it notices that the registry key isn't valid (at least not for that user) and runs the install again to "repair". In the Event Log I get MsiInstaller events 1001 and 1004 showing that "The resource 'HKEY_CURRENT_USER\SOFTWARE\MyInstaller\Foo' does not exist." This isn't surprising, since the SYSTEM user wouldn't have this key. I turned on system-wide MSI logging, and the auto-repair created its log file in the C:\Windows\Temp folder rather than a specific user's TEMP folder, which seems to imply the current user was SYSTEM (plus the log file shows the "calling process" to be my service). Is there something I can do to disable the auto-repair functionality? Am I doing something wrong or breaking some MSI rule? Any hints on where to look next?

    Read the article

  • Which AMI should I use as a base for a Django application?

    - by Edan Maor
    I'm starting development of a Django application on Amazon Web Services, and I'm looking to build an instance that will serve the Django application. I don't have much experience with such things, having only used a shared host before (WebFaction). So I'm wondering: which AMI should I use as a base? I'm assuming I want an Ubuntu AMI, possibly with certain things like Apache pre-installed? One minor point: I'm planning to serve several different Django projects from the same instance. I use virtualenv on my dev machine right now to separate the different projects, and I'm assuming I'll do the same on EC2. Thanks!

    Read the article

  • Issue binding Image Source dependency property

    - by Archana R
    Hello, I have created a custom ImageButton control, styled as follows:

        <Style TargetType="{x:Type Button}">
            <Setter Property="Template">
                <Setter.Value>
                    <ControlTemplate TargetType="{x:Type Local:ImageButton}">
                        <StackPanel Height="Auto" Orientation="Horizontal">
                            <Image Margin="0,0,3,0" Source="{Binding ImageSource}" />
                            <TextBlock Text="{TemplateBinding Content}" />
                        </StackPanel>
                    </ControlTemplate>
                </Setter.Value>
            </Setter>
        </Style>

    The ImageButton class looks like this:

        public class ImageButton : Button
        {
            public ImageButton() : base() { }

            public ImageSource ImageSource
            {
                get { return base.GetValue(ImageSourceProperty) as ImageSource; }
                set { base.SetValue(ImageSourceProperty, value); }
            }

            public static readonly DependencyProperty ImageSourceProperty =
                DependencyProperty.Register("Source", typeof(ImageSource), typeof(ImageButton));
        }

    However, I'm not able to bind the ImageSource to the image as shown below (this code is in the UI folder and the image is in the Resources folder):

        <Local:ImageButton x:Name="buttonBrowse1" Width="100" Margin="10,0,10,0" Content="Browse ..." ImageSource="../Resources/BrowseFolder.bmp"/>

    But if I use a plain Image element, it gets displayed when the same source is specified. Can anyone tell me what should be done?

    Read the article

  • Fabric and cygwin don't work with windows UNC paths

    - by tcoopman
    I have some strange problems with Fabric deployment to Windows Server 2008 R2. The thing I'm trying to accomplish is to copy some files to a shared folder with a Fabric script (the script does a lot of other things too, but only this step gives me problems). The problem: when I try to access a UNC (Universal Naming Convention) path I always get access-denied style errors if I run the script through Fabric, but when I run the same command in an SSH prompt (same user) it works fine. Examples:

        cmd:    robocopy f:/.... //share
        result: works fine in ssh; in Fabric I get "Logon failure: the user has not been granted the requested logon type at this computer."

        cmd:    cd //share
        result: works fine in ssh; in Fabric I get "//share: Not a directory"

    Further information: uname -a and whoami return exactly the same thing in Fabric and ssh. I also tried things like mount and net use, but these commands all have roughly the same problem.
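    For what it's worth, a common way to hand credentials to a UNC path inside a non-interactive session like this is to map the share explicitly before using it. This is only a sketch with placeholder share, account and path names:

        net use \\fileserver\share SecretPassword /user:MYDOMAIN\deploy
        robocopy F:\build \\fileserver\share\build /E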

    Read the article

  • Starting with administering a linux box

    - by Josh K
    I'm rather new to this, and I'd like to play with administering a Linux box. Things I need to know how to do:
    - Set up subdomains
    - Set up FTP accounts
    - Set up full domains / add domains
    - MySQL setup / install / management
    - LAMP setup / install / management
    This is probably going on a CentOS distro. I'd like links or a breakdown of how I can learn to do this. I am comfortable with a command line, but I'm trying to move from shared hosting to a VPS and would like to have some idea of how deep the water is before I do.
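    To give a flavour of the LAMP part on a CentOS box, a minimal first pass looks roughly like the lines below (package names as of CentOS 5/6). The rest of the list, domains, subdomains and FTP accounts, is mostly Apache virtual-host and vsftpd/user configuration layered on top of this.

        yum install httpd php php-mysql mysql-server
        service httpd start && chkconfig httpd on
        service mysqld start && chkconfig mysqld on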

    Read the article

  • Authorization error when testing FTP to UNC

    - by user64204
    We have a Windows Server 2008 R2 machine with Active Directory (hereafter called DC) running as a domain controller, on which we have IIS and an FTP site installed. We have a second Server 2008 machine (hereafter called SHARE) which is joined to that domain and has a disk shared as a network share (\\share\Office). That network share is used as the FTP site's physical path on DC. We've tested the FTP site from the IIS FTP configuration panel, by clicking on Basic Settings... then Test Settings.... When setting Administrator as the username with the Connect as... option, everything is fine. When no user is provided we get an authorization error (the screenshots are not reproduced here). Q1: Could someone explain in more understandable terms what is written in the Details text area?

    Read the article

  • Setting up NIS/NFS on Mac OS 10.6

    - by evan
    We have an Ubuntu NIS/NFS server at work and we recently got a few new iMacs. Is there a way to set them up so they can use the Linux user accounts and mount the shared NFS files? Are there any guides on how to do this? I've been googling with no success. I tried getting NFS to work by connecting to the server via Disk Utility, but after I run 'sudo automount' from the command line and ls the directory I tried to mount to (/Volumes/nfs), it gives a permissions error. If there isn't a way to do this, does anyone know of any not-too-complicated ways to share user accounts and files between Mac and Linux computers (and even, hypothetically, a Windows computer one day)? I know it's kind of a huge question, but I'll greatly appreciate any advice on the topic. Thanks!
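    One detail worth knowing when testing by hand: Linux NFS servers usually export with the "secure" option, which requires the client to connect from a privileged port, and OS X only does that when asked. A manual mount attempt of the kind described would then look something like this (server name, export path and mount point are placeholders):

        sudo mkdir -p /Volumes/nfs
        sudo mount -t nfs -o resvport ubuntu-server:/export/home /Volumes/nfs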

    Read the article

  • Permission forbidden on localhost with apache2

    - by N Alex
    Here is what I am trying to do. I tried to add another folder to Apache and I get the following error when trying to access testing/index.html. The idea is that I would like to have, for every customer, a folder like /home/neagoe/Work/InterWebs/Projects/[PROJECT NAME]/CustomerProjects/website/dist.

        Forbidden
        You don't have permission to access /index.html on this server.
        Apache/2.2.22 (Ubuntu) Server at testing Port 80

    Here are the steps that I followed:

    Step 1:
        sudo chmod a+x /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist
    Step 2:
        sudo chown -R www-data:www-data /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist
        sudo chmod -R 775 /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist
    Step 3:
        sudo adduser $USER www-data
    Step 4:
        sudo a2enmod userdir
    Step 5:
        sudo cp /etc/apache/sites-available/default /etc/apache/sites-available/testing

    I edited the file /etc/apache/sites-available/testing so it looks like this:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            ServerName testing
            DocumentRoot /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist
            <Directory />
                Options FollowSymLinks
                AllowOverride None
            </Directory>
            <Directory /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist/ >
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog ${APACHE_LOG_DIR}/error.log
            # Possible values include: debug, info, notice, warn, error, crit, alert, emerg.
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/access.log combined
        </VirtualHost>

    Step 6: I edited hosts ("/etc/hosts") so it looks like this:

        127.0.0.1       localhost
        127.0.0.1       testing
        # The following lines are desirable for IPv6 capable hosts
        ::1     ip6-localhost ip6-loopback
        fe00::0 ip6-localnet
        ff00::0 ip6-mcastprefix
        ff02::1 ip6-allnodes
        ff02::2 ip6-allrouters

    Step 7:
        sudo a2ensite testing
        sudo service apache2 restart

    I searched for about 2 hours on the Internet but I can't figure out what went wrong. All the pages that I found describe the same steps as above. I know there are similar questions out there, but the usual answer is to change permissions on the directory, which I did in Step 2. I am sorry if this is really a duplicate but I couldn't find the right answer. Thank you! PS. I asked this on AskUbuntu as well but didn't get any answers, so I'm trying my luck here.

    Edit: There isn't much in the error log or the access log.

    On the access.log:

        ::1 - - [10/Aug/2013:11:23:28 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)"
        ::1 - - [10/Aug/2013:11:23:29 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)"
        (several more identical internal dummy connection lines)
        127.0.0.1 - - [10/Aug/2013:11:23:23 +0300] "POST /wordpress-testing/wp-cron.php?doing_wp_cron=1376123003.7026669979095458984375 HTTP/1.0" 200 705 "-" "WordPress/3.6; http://localhost/wordpress-testing"
        127.0.0.1 - - [10/Aug/2013:11:31:32 +0300] "GET /index.html HTTP/1.1" 200 485 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:23.0) Gecko/20100101 Firefox/23.0"

    And the last line repeats for about 200 rows.

    On the error.log:

    1. These lines repeat from time to time:

        PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20100525/msql.so' - /usr/lib/php5/20100525/msql.so: cannot open shared object file: No such file or directory in Unknown on line 0
        [Sat Aug 10 13:06:42 2013] [notice] Apache/2.2.22 (Ubuntu) PHP/5.4.9-4ubuntu2.2 configured -- resuming normal operations
        [Sat Aug 10 13:07:36 2013] [notice] caught SIGTERM, shutting down

    2. And this is the predominant error (hundreds of lines):

        [Sat Aug 10 13:07:40 2013] [error] [client 127.0.0.1] (13)Permission denied: access to /index.html denied
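    One thing a "(13)Permission denied" entry like the last one often points at in setups like this: Apache needs execute (traverse) permission on every directory in the path down to the DocumentRoot, not just on the dist folder itself. A quick way to check, using the path from the question, is namei, which prints the mode of each component so the blocking directory is easy to spot; chmod o+x on that directory (shown here for the home directory purely as an example) is the usual fix.

        namei -m /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist/index.html
        sudo chmod o+x /home/neagoe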

    Read the article

  • Windows Server 2008: Terminal Services / VDI

    - by JohnyD
    I have a Dell R710 with 72GB of memory running Hyper-V. Within Hyper-V I have a Windows 2008 (32-bit) VM running Terminal Services. How do I allocate memory so that any user who connects to this terminal server (from their thin client) is allocated 2GB (or whatever amount I choose) of memory? Currently I have provisioned the TS VM with 2GB of memory, but it seems that this is shared among everyone who connects. Please let me know if there is further information I can provide. Thank you. Update 1: What I'm looking to accomplish with this server is setting up a VDI to allow users to connect from thin clients within our network. They will also have to connect from outside our network via VPN, which is already in place. Am I able to set this up using Windows Server 2008 (not R2), given that I have a 16-bit application which needs to be supported? Unfortunately it's not a candidate as a RemoteApp.

    Read the article

  • Permissions on Mac OSX

    - by Linda
    I think this is a permissions issue, but I am not sure, and I am not sure how to repair the problem. I have a new MacBook and 2 external drives that were previously used on another MacBook. I have a lot of folders and Xcode projects on the external drives. When I try to work on the projects, there is a message similar to this: "This file is not writable. You may not be able to save your changes, but you will be able to Save a Copy somewhere else. Do you want to edit this file anyway?" If I make changes and try to close the project I get this error: "The project and user files project.pbxproj and macbook.pbxuser for project “thirdtry.xcodeproj” are not writeable and cannot be saved. Your changes will be lost if you close the project. You may need to SCM edit these files to gain writability." I have tried simply renaming the folder, but that isn't allowed either unless I individually change permissions on every file in an Xcode project. As you can imagine, this could be time-consuming for tons of files and projects. I can copy a project onto the internal drive and run it there after renaming the folder that contains all of the files, but this defeats the purpose of having all of the projects on an external drive. Also, in Xcode there is no "Build and Run" any more, only "Build and Debug". I don't know if this is related or not. Suggestions for how to repair permissions on all files and folders on my external drives? What about "Build and Debug" with no "Build and Run"? Thanks, Linda
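    Two things that are commonly tried for drives carried over from another Mac (the volume and folder names below are placeholders): ticking "Ignore ownership on this volume" in the drive's Get Info window in Finder, or re-taking ownership from Terminal:

        sudo chown -R "$USER":staff /Volumes/MyExternalDrive/Projects
        chmod -R u+rwX /Volumes/MyExternalDrive/Projects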

    Read the article

  • Why not install Msvcr71.dll into system32?

    - by hillu
    While looking for an authoritative source for the missing Msvcr71.dll that is needed by a few old applications, I stumbled across the MSDN article Redistribution of the shared C runtime component in Visual C++. The advice given to developers is to drop the DLL into the application's directory instead of system32, since DLLs in the application's directory are found before those on the system path. What can/will go wrong if I (as an administrator, not a developer) decide to take the lazy path and install Msvcr71.dll (and Msvcp71.dll while I'm at it) into the system32 directory (of 32-bit Windows XP or Windows 7 systems) instead of putting a copy in each application's directory? Is there another good solution to provide the applications with the needed DLLs that doesn't involve copying files into the application directories?

    Read the article

  • Can make the proxy settings invisible when I share my internet connection via wifi?

    - by Neil
    This is probably a long shot... I have an HTC Desire and, frustratingly, found out after I got it that it doesn't support network proxy settings. We have a wireless network at my office that uses a proxy. My desktop at work runs Ubuntu. I was wondering if the following setup would work:
    1. Plug a USB wireless adapter into the desktop, which has a working internet connection using the proxy.
    2. Set up the wireless adapter as an ad-hoc network.
    3. Share the internet connection over the ad-hoc network.
    4. Make it so that the use of the proxy is invisible to users of the shared network connection.
    5. Connect the Android phone to the ad-hoc wireless network and use the internet connection.
    My question is this: is this possible, or should I give up now and not even try? I think I can handle steps 1, 2, 3 and 5. I just have no idea if step 4 even makes sense, let alone whether it is possible (one common way it is attempted is sketched below). Thanks
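    For step 4, the approach usually taken is transparent redirection on the machine doing the sharing: a local helper such as redsocks talks to the office proxy, and an iptables rule steers the phone's web traffic into it. This is only a rough, untested sketch; the interface name, port and proxy details are all placeholders, and nothing in the question confirms this setup.

        # redsocks is configured separately to listen on 127.0.0.1:12345 and forward to the office proxy
        iptables -t nat -A PREROUTING -i wlan1 -p tcp --dport 80 -j REDIRECT --to-ports 12345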

    Read the article
