Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • git init --bare permission denied on 16gb USB stick

    - by Sour Lemon
    I am using Git on a Windows 7 machine (64-bit) and have been learning how to use it to version-control my files. Now I want to create a --bare repository on an external device (in this case a 16 GB USB stick), but when I try to do so I get the error "f:/: Permission denied". I am using the Git Bash program that is installed with Git on Windows, and these are the commands I am typing (I am also opening the program as administrator by holding Ctrl + Shift when I open it):

        cd /f
        git init --bare
        f:/: Permission denied

    However, if I create a normal repository it works just fine:

        cd /f
        git init
        Initialized empty repository in f:/.git/

    Can anybody shed some light on why I can't create a --bare repository? Any help would be much appreciated.
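
    One thing worth trying (a hedged sketch, not a confirmed fix) is to create the bare repository in a subdirectory of the stick rather than in the drive root, since initialising directly in the root of F: is what appears to be refused; the directory name below is only an example:

        cd /f
        git init --bare myproject.git
        # equivalently, with an explicit path:
        git init --bare /f/myproject.git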

  • Custom Icon for NFS Volume Mount - Possible for OSX?

    - by James
    We are naming our various network volumes after planets! I renamed the Mercury.icns icon to .VolumeIcon.icns and copied it over to the mount-point folder on the NFS server. So far, remounting the NFS share does not seem to pick up this icon. Looking on the NFS server, there appear to be two VolumeIcon files. Can someone tell me what I am doing wrong? Permissions? Do I need a .DS_Store file there as well? It shouldn't be this hard! EDIT: I should have mentioned that the NFS server is Ubuntu 12.04.1, NOT an OS X server.

  • Where are ethernet errors logged?

    - by Matt
    Munin is showing me a graph of eth0 with a large spike in errors. During that spike I was unable to access my server through the eth0 port (I could still access it through my IPMI port). I'm trying to figure out what happened, but I can't seem to locate any log files for eth0. I don't see anything out of the ordinary in /var/log/(kern|syslog|messages), and I don't see a log file specifically for eth0. Are there logs for eth0, and if so, where can I find them? I am running Ubuntu 10.04 LTS.
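
    Interface errors are not written to a per-interface log file; they are kept as kernel/driver counters, and related driver messages land in the general kernel log. A hedged sketch of the usual places to look (ethtool may need to be installed separately):

        # per-interface RX/TX error counters kept by the kernel
        ip -s link show eth0

        # NIC/driver statistics, if the driver exposes them
        ethtool -S eth0

        # driver messages, link flaps and resets
        dmesg | grep -i eth0
        grep -i eth0 /var/log/kern.log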

  • Batch convert HTML file(s) saved using IE to MHT

    - by ultrasawblade
    I have numerous web sites that I've saved over the years. I used Internet Explorer's "Save As..." option to do this; it saves the original page as an .html document, with the page requirements in a linked folder that has the same name as the document. I want to convert a bunch of these (over 1000) to the single-file .mht format. This can be done through Internet Explorer or Firefox (using the UnMHT extension) by loading the original .html document and then re-saving it as an .mht document, but that is obviously tedious for the number of files I'm talking about. I'm wondering if anyone knows of a utility, command line or otherwise, that can accomplish this.

  • File Sharing: User-created folders are read-only to others on Mac 10.6 Server

    - by Anriëtte Combrink
    Hi there. We recently got a new Mac Mini Server with Mac OS X Server 10.6 on it. It has two 500 GB volumes, one of which (Macintosh HD2, the one other than the boot disk) we are using to share our work files. I have added an account for each user in the Users pane of Server Preferences, and all our staff (the users added to the system) belong to a new group called toolboxstaff. Now, when a user creates a new folder on this volume, the folder is created with read-only access for everyone besides the owner. How do I set things up so that when a user creates a folder, it is created with read/write access for the toolboxstaff group? Thanks in advance.

  • How to Remove a VM From Hyper-V Without Deleting the Configuration File?

    - by Steven Murawski
    I'm in the process of moving a number of virtual machines that are homed on shared storage (a file share, though shared cluster disk would work as well) to a new VM host with access to the same shared storage. The new host is a different build version (moving from Windows Server 2012 Beta to Windows Server 2012 RC - though this same process could be used with migrations of Windows Server 2008/2008 R2 to Windows Server 2012 as well), so I cannot migrate the machine with inbox tooling. I need to remove the VM from management of the source Hyper-V host in order to import the VM to the new Hyper-V host. I want to retain the configuration file, so I can import the VM as it stands and not need to reconfigure it. The VHD files are rather large and they are staying on the same file share, so I'd rather not duplicate them during the move process.

  • Strange rsync behaviour

    - by Stewie
    So, I start by comparing two directories:

        [root@135759 ]# rsync -av test1/ test2/
        building file list ... done
        sent 128 bytes  received 20 bytes  296.00 bytes/sec
        total size is 6  speedup is 0.04

    They are both in sync. Now let me create a file in test1 and copy it to test2:

        [root@135759 ]# touch test1/hello4.php
        [root@135759 ]# scp test1/hello4.php test2/hello4.php

    I verify that the two copies are the same:

        [root@135759 test2]# md5sum test1/hello4.php
        d41d8cd98f00b204e9800998ecf8427e  hello4.php
        [root@135759 test1]# md5sum test2/hello4.php
        d41d8cd98f00b204e9800998ecf8427e  hello4.php

    I therefore expect a dry-run rsync to show 0 files to transfer. Why does the output still list hello4.php?

        [root@135759 ]# rsync -avn test1/ test2/
        building file list ... done
        hello4.php
        sent 116 bytes  received 24 bytes  280.00 bytes/sec
        total size is 6  speedup is 0.04
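
    For context, rsync's default "quick check" compares only file size and modification time, not contents; scp does not preserve the source mtime (unless run with -p), so the copied file has a newer timestamp and rsync still lists it even though the checksums match. A hedged sketch of two ways to make the dry run agree with md5sum:

        # compare by checksum instead of size and mtime
        rsync -avcn test1/ test2/

        # or do the copy with rsync (or scp -p) so the mtime is preserved in the first place
        rsync -av test1/hello4.php test2/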

  • Memcached server: Is it a good practice to point two server urls to the same server?

    - by Niro
    I have a system where connections to a memcached server are made from several different files and servers. I would like to stay with one server for now, but keep the option of increasing the number of memcached servers later (for periods of high traffic). My idea is to tell the clients there are two servers, while the two URLs point (via DNS) to a single server. In the future, if I want, I can add a server and change DNS without changing the code in many places. Is this a good practice? Is there a performance cost to having two server connections that both point to the same server? Any other ideas for achieving instant expandability of memcached capacity without needing to change the code and redeploy?

  • Command line tool for listing ID3 tags under Linux

    - by petersohn
    I want to write a script that manipulates the ID3 tags of MP3 files. I need a tool that reads the tags and outputs them in a machine-readable format. For example, if I ask it for only the title, it should output the title and nothing else. I tried different tools like id3 and eyeD3, but they can only be used to write tags or to output them in a human-readable format. Of course I could just filter that output through sed, but that seems unnecessarily complicated to me.
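
    If filtering turns out to be acceptable after all, a minimal sketch of the sed approach is below; it assumes the id3v2 tool is installed and that its listing prints the title on a line beginning with the TIT2 frame name, which is an assumption about the output format rather than a guarantee:

        # print only the title of one file
        id3v2 -l song.mp3 | sed -n 's/^TIT2[^:]*: //p'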

  • Sharing an Apache configuration between testing vs. production

    - by Kevin Reid
    I have a personal web site with a slightly nontrivial Apache configuration. I test changes on my personal machine before uploading them to the server. The path to the files on disk and the root URL of the site are of course different between the test and production environments, and they occur in many places in the configuration (especially <Directory> blocks for special locations that have scripts, or no directory listing, and so on). What is the best way to share the common elements of the configuration, to make sure that my production environment matches my test environment as closely as possible? What I've thought of is to use SetEnv to store the paths for the current machine in environment variables, then Include a common configuration file with ${} everywhere something is machine-specific. Are there any hazards with this method?
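
    One caveat worth noting: SetEnv sets per-request environment variables and is not expanded when the configuration is parsed, whereas the Define directive (available in Apache 2.4) is the configuration-time equivalent. A hedged sketch of that Include pattern, with hypothetical file names and paths:

        # httpd-test.conf (loaded on the test machine)
        Define SITE_ROOT /Users/kevin/Sites/mysite
        Include conf/common.conf

        # httpd-prod.conf (loaded on the production server)
        Define SITE_ROOT /var/www/mysite
        Include conf/common.conf

        # common.conf, shared between both machines
        DocumentRoot "${SITE_ROOT}/htdocs"
        <Directory "${SITE_ROOT}/htdocs/cgi-bin">
            Options +ExecCGI
        </Directory>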

  • How to create an alias for a Linux server name?

    - by Radek
    The openSUSE server's name is 'darkhelmet'. I want to create an alias 'dh' for it, so that 'ssh dh' works and 'http://dh' works too. What file or files do I have to edit, and where, to make this happen? Extract from /etc/hosts on darkhelmet:

        127.0.0.1       localhost
        # special IPv6 addresses
        ::1             localhost ipv6-localhost ipv6-loopback
        fe00::0         ipv6-localnet
        ff00::0         ipv6-mcastprefix
        ff02::1         ipv6-allnodes
        ff02::2         ipv6-allrouters
        ff02::3         ipv6-allhosts
        127.0.0.2       darkhelmet.edumate darkhelmet
        10.0.0.22       db2workgroup db2workgroup

    And the nslookup output:

        [root][skroob] nslookup darkhelmet
        Server:         10.0.0.10
        Address:        10.0.0.10#53
        Name:   darkhelmet.edumate
        Address: 10.0.0.22
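
    For a purely local alias, adding the short name to /etc/hosts on each client machine is usually enough for both 'ssh dh' and 'http://dh'; for a network-wide alias, a CNAME record for 'dh' in the edumate DNS zone would be the equivalent. A minimal sketch, assuming 10.0.0.22 (the address nslookup returns) is how clients reach darkhelmet:

        # /etc/hosts on each machine that should resolve "dh"
        10.0.0.22    dh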

  • Playing an Online mp3

    - by Mohsen
    I have a problem playing online MP3s. I'm using the latest versions of JavaZoom's JLayer and BasicPlayer. Here is the exception:

        Caused by: javazoom.jlgui.basicplayer.BasicPlayerException: java.io.EOFException
            at javazoom.jlgui.basicplayer.BasicPlayer.initAudioInputStream(Unknown Source)
            at javazoom.jlgui.basicplayer.BasicPlayer.open(Unknown Source)
            ... 12 more
        Caused by: java.io.EOFException
            at java.io.DataInputStream.readInt(DataInputStream.java:375)
            at com.sun.media.sound.WaveFileReader.getFMT(WaveFileReader.java:244)
            at com.sun.media.sound.WaveFileReader.getAudioFileFormat(WaveFileReader.java:85)
            at javax.sound.sampled.AudioSystem.getAudioFileFormat(AudioSystem.java:985)
            at javazoom.jlgui.basicplayer.BasicPlayer.initAudioInputStream(Unknown Source)
            ... 15 more

    My Java version is 1.6.0_16. Certain files cannot be played over the Internet: I have a set of MP3s playing one after the other, and at random one of them fails with the above exception. Some MP3s can be played by calling BasicPlayer's play() method again, but others can never be played online. I was able to find this post, but I doubt it really relates to my DirectX version or anything like that. Mohsen

  • Unable to connect SQL Server instance from Visual Studio 2008 SP1 on Vista x64

    - by Shimmy
    Hi folks! I installed Visual Studio 2008 SP1 (with the integrated SQL Server from the installation package) on a Vista x64 machine, and when I try to add an MDF file to a project, or to App_Data when working with a web project, I get the following message: "Connections to SQL Server files (*.mdf) require SQL Server Express 2005 to function properly. Please verify the installation of the component or download from the URL: http://go.microsoft.com/fwlink/?linkID=49251". Just to make sure: SQL Server 2005 Express is installed and I can connect to it via SSMS. Update: I am 90% sure that this is a Microsoft bug with x64 machines.

  • How do I access a shared folder using credentials other than the ones I logged in with?

    - by George Sealy
    I have a lab full of Windows 7 machines and a shared login (user360) that all my students use. I also have a shared folder that they all have read/write access to (for moving files around easily). My problem is that I also want to create a shared folder for each student, for submitting assignments. I can set up a shared folder whose permissions allow only a single student's account and not the 'user360' account. The problem is that when I'm logged in as user360 and I try to open the 'StudentA' folder, Windows never asks me for alternate credentials; it just refuses access because the user360 account is not allowed in. Can anyone suggest a fix for this?
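
    One workaround that is often suggested (a hedged sketch, not a confirmed fix for this setup) is to map the share explicitly from a command prompt and supply the alternate credentials there, since that path always allows a username to be specified; the server and account names below are hypothetical:

        net use \\labserver\StudentA * /user:labserver\studentA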

  • Capistrano deploying to different servers with different authentication methods

    - by marimaf
    I need to deploy to 2 different servers, and these 2 servers have different authentication methods (one is my university's server and the other is an Amazon Web Services (AWS) instance). I already have Capistrano running for my university's server, but I don't know how to add the deployment to AWS, since for that one I need to add SSH options, for example to use the .pem file, like this:

        ssh_options[:keys] = [File.join(ENV["HOME"], ".ssh", "test.pem")]
        ssh_options[:forward_agent] = true

    I have browsed Stack Overflow and no post mentions how to deal with different authentication methods (this and this). I found a post that talks about 2 different keys, but that one refers to a server and a Git host, both using different pem files, which is not my case. I also got to this tutorial, but couldn't find what I need. I don't know if this is relevant to what I am asking: I am working on a Rails app with Ruby 1.9.2p290 and Rails 3.0.10, and I am using an SVN repository. Any help is welcome. Thanks a lot.

  • The eval(base64_decode()) virus has infected a server. Would removing executable permissions help solve the issue?

    - by Bravo.I
    The eval(base64_decode()) virus has infected a server. This is PHP malware that uses PHP's eval function and, as far as I can tell, has replicated itself into all the PHP files on the system. Would removing executable permissions help solve the problem? Please answer quickly, and if you've got any better ideas on how to stop this virus, I'm all ears. The virus has replicated itself to several folders in the directory, and most of the other folders are actually several other websites...
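
    Removing the executable bit is unlikely to help by itself, because PHP files are read and interpreted by the web server rather than executed directly. A hedged first step is simply to locate every injected file so the scale of the infection is known (the web-root path below is an example):

        # list every PHP file under the web root containing the injected call
        grep -rl --include='*.php' 'eval(base64_decode' /var/www

        # count them, to gauge how far it has spread
        grep -rl --include='*.php' 'eval(base64_decode' /var/www | wc -l

    Restoring from a known-clean backup and closing the original entry point is the usual longer-term fix.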

  • XenServer: Editing clone configuration before boot

    - by Jeff Ferland
    Upon cloning a base image, I need to reconfigure basic settings. Regenerating the ssh host key, changing static IP assignments, setting the host name, etc. Because of the network setup, DHCP is not an option. That more or less rules out SSHing in with a predefined key or running a startup script since I can't provide the IP externally. I'd most like to mount the filesystem of the new machine on Dom0, but the lvm volumes are exported and it appears to be Bad Form to import them so the Dom0 machine can see them. What's your best suggestion for altering files in a cloned VM before boot? Must be non-interactive, and I'm going to guess out the gate that scripting access via xe console is not going to work well.

  • UEC - Can the Cluster Controller and Storage Controller be separate systems?

    - by Jeremy Hajek
    My department is implementing an Ubuntu Enterprise Cloud. I have done the testing and am quite comfortable with the four pieces: CC/SC, CLC, WS and NC. Looking at the various documents below, it appears that the Storage Controller and Cluster Controller (eucalyptus-sc and eucalyptus-cc) are always installed on the same system. My question is this: can I install the storage controller and the cluster controller on separate systems?

        http://open.eucalyptus.com/wiki/EucalyptusAdvanced_v2.0 - the picture indicates that the CC and SC are two different machines
        http://www.canonical.com/sites/default/files/active/Whitepaper-UbuntuEnterpriseCloudArchitecture-v1.pdf - p. 10, first paragraph, uses the word "machine(s)"
        http://software.intel.com/file/31966 - p. 8 indicates the same separate architecture

    BUT...

        https://help.ubuntu.com/community/UEC/PackageInstallSeparate - indicates that the SC and CC are to be on the same system.
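
    Since the two roles ship as separate packages, they can at least be installed on different machines; whether the rest of the UEC tooling then registers them cleanly is exactly the open question here, so this is only a sketch of the package split, not a confirmation that the topology is supported:

        # on the machine intended as the cluster controller
        sudo apt-get install eucalyptus-cc

        # on the machine intended as the storage controller
        sudo apt-get install eucalyptus-sc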

  • Installing the Perforce visual client on Linux

    - by Manish
    I come from a Mac background and am trying my hand at installing the Perforce visual client (P4V) on my Linux box. I downloaded the correct version here and untarred the files, then cd'd into the directory ~/Desktop/p4v-2012-blah-blah/bin and ran chmod +x p4*. After this I try running p4v (by double-clicking), but I don't see anything. The file type is shown as a "text executable", but I don't know why it is not running. On the Mac I had done the same thing: I just clicked on p4v and the client would show up (where I filled in the server address and everything). I'm not sure what is going wrong here. Can someone give me directions? FWIW, I did check out this link.
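
    A hedged first diagnostic step: double-clicking hides any error output, so launching p4v from a terminal usually shows why it fails to start (missing libraries, display problems, and so on); whether that reveals the actual cause here is not something that can be promised:

        cd ~/Desktop/p4v-2012-blah-blah/bin
        file ./p4v      # confirms whether it is a wrapper script or a binary
        ./p4v           # any error messages appear in the terminal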

  • Better filesystem for ZODB and Plone.app.blob

    - by h2o
    I am interested to know which Linux filesystem is best for hosting Zope's object database (Data.fs) and plone.app.blob's blobstorage. Various reviews on the Internet suggest that XFS is good for hosting both: XFS is fast and works well with large files. Is anyone using XFS for Zope or Plone? What optimization flags, if any, do you use for XFS? Note that this is a repost; I originally asked this on Stack Overflow and was advised to post it on Server Fault instead.
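
    On the "optimization flags" point, one commonly cited tweak is independent of the filesystem choice: mounting the data volume without access-time updates, since Data.fs is rewritten constantly. A minimal sketch with a hypothetical device and mount point:

        mkfs.xfs /dev/sdb1
        mount -o noatime,nodiratime /dev/sdb1 /srv/zope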

  • Is there any way to make Mac OS X Spotlight only index the file names and not the contents?

    - by aalaap
    I do understand that the point of Spotlight is to look inside files, but it also returns file-name matches, and that's what I need most of the time. Besides, Spotlight is running so absurdly slowly on my system (Snow Leopard on the '08 iMac) that it's just unusable. I downloaded Canary and Spotlight wasn't able to find the app file for 15 minutes; it was already in the Downloads stack, but as far as Spotlight is concerned, the file doesn't exist. Hence, I would like to know of a way to make Spotlight index only the file names, which would perhaps make it a bit faster. I'm looking to mimic the behaviour of Windows applications such as AvaFind or Search Everything. Edit: Let me highlight the fact that I am looking for an AvaFind or Search Everything replacement for Mac OS X. Go try one of these on a Windows machine and you'll understand my disappointment with Spotlight and the other popular search tools on OS X.
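
    This does not change what Spotlight indexes, but the existing index can at least be queried by file name only from the command line, which avoids the content matches; a small sketch using mdfind, which ships with OS X:

        # filename-only search against the Spotlight index
        mdfind -name "Canary"

        # restrict the search to one folder
        mdfind -onlyin ~/Downloads -name "Canary"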

  • Resize videos with different widths to a fixed height preserving aspect ratio with ffmpeg

    - by Axarydax
    I'd like to convert a lot of video files to Flash video for our company's website. I have a requirement that all of the videos must be in 360p format, so their size would be Nx360. FFmpeg's -s argument specifies the target resolution as WxH, but I don't know the width, as it depends on the source file's aspect ratio: if the source is 640x480 the target will be 480x360, and if the source is 848x480 the target will be 636x360. Is there a way to do this with some switch of ffmpeg, so that it preserves the aspect ratio and I only have to specify the height of the target video? I could easily solve it by writing a program that launches ffprobe to get the source video size, calculates the aspect ratio and then calculates the new width.
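
    The scale video filter can compute the width itself; a short sketch (input and output names and formats are only examples):

        # -1 lets ffmpeg derive the width from the aspect ratio
        ffmpeg -i input.mp4 -vf scale=-1:360 output.flv

        # use -2 instead if the target codec requires an even width
        ffmpeg -i input.mp4 -vf scale=-2:360 output.flv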

  • FTP upload stalls at same point every time on FileZilla

    - by John
    On two different FTP accounts, I am having problems uploading files. I can log in, see the contents of the directory and start an upload. Using FileZilla, the transfer always seems to stall at either 0.9% or 1.2% (always one of those two numbers) and may simply hang, or keep restarting and then stop at the same point again. Windows XP's built-in FTP client is not great, but I get similar problems there: it starts uploading and after a short while I get a timeout error. FTP used to work fine, and I don't know if it's these accounts in particular (both have the same service provider, although they were purchased on opposite sides of the world) or if "FTP is broken on my PC"... can that even happen?

  • Windows 7 won't read from NAS on LAN

    - by Alfy
    I've got a LinkStation NAS drive on a local network. Having just got a new laptop with Windows 7 Home Professional, I can no longer read anything off the drive. I've tried accessing the drive using \\192.168.1.55\share, using FTP programs such as WinSCP and FileZilla, and even using Firefox to hit ftp://192.168.1.55. The really annoying thing is that through all these methods I can see the files on the drive, which rules out any kind of connection issue. I can navigate through the NAS file system, but as soon as I try to copy a file off the NAS, things just stop working. Accessing the drive through a Windows XP machine works fine. So far I've tried: disabling firewalls; adding the LmCompatibilityLevel key to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa; and using the 40-56 bit encryption instead of 128 bit. Has anyone got any suggestions of what I can check or try? This is driving me crazy and I'm totally out of ideas.

  • How can I tell which config file Apache is using?

    - by Claudiu
    I'm trying to set up virtual hosts on Mac OS X. I've been modifying httpd.conf and restarting the server, but haven't had any luck getting it to work. Furthermore, I notice that it's not serving files from the DocumentRoot mentioned in httpd.conf (Libraries/WebServer/Documents) but from a different directory (/usr/local/apache2/htdocs), and I don't see that folder mentioned anywhere in httpd.conf. Also, PHP works even though the "LoadModule php5_module" line is commented out. This makes me think it's using another .conf file. How can I figure out which config is actually being loaded? Update: I just deleted that httpd.conf, and Apache behaves the same after a restart, so it definitely wasn't using it!
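
    The compiled-in defaults, including the config file path the binary actually uses, can be read from Apache itself; a short sketch (the binary may be invoked as httpd or via apachectl, depending on which install is actually running):

        # HTTPD_ROOT and SERVER_CONFIG_FILE appear among the compile-time settings
        httpd -V
        apachectl -V

        # shows the parsed virtual hosts and the config files they came from
        apachectl -S

        # reveals which httpd binary the running processes were started from
        ps aux | grep httpd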
