Search Results

Search found 77950 results on 3118 pages for 'large file upload'.

  • nginx server over https using up all available file handles

    - by mmr
    Hi all, So I have an nginx server that's working over https with Sinatra. When I try to download a jnlp file in a configuration that works fine over Mongrel and http (no s), the nginx server fails to serve the file with a 504 error. Subsequent checking of the logs shows that this error is due to overflowing the available number of file handles, i.e., "24: too many open files". Running sudo lsof -p <nginx worker pid> gets me a huge list of files, all looking like:

        nginx 1771 nobody 11u IPv4 10867997 0t0 TCP localhost:44704->localhost:https (ESTABLISHED)
        nginx 1771 nobody 12u IPv4 10868113 0t0 TCP localhost:https->localhost:44704 (ESTABLISHED)
        nginx 1771 nobody 13u IPv4 10868114 0t0 TCP localhost:44705->localhost:https (ESTABLISHED)
        nginx 1771 nobody 14u IPv4 10868191 0t0 TCP localhost:https->localhost:44705 (ESTABLISHED)
        nginx 1771 nobody 15u IPv4 10868192 0t0 TCP localhost:44706->localhost:https (ESTABLISHED)
        nginx 1771 nobody 16u IPv4 10868255 0t0 TCP localhost:https->localhost:44706 (ESTABLISHED)
        nginx 1771 nobody 17u IPv4 10868256 0t0 TCP localhost:44707->localhost:https (ESTABLISHED)
        nginx 1771 nobody 18u IPv4 10868330 0t0 TCP localhost:https->localhost:44707 (ESTABLISHED)
        nginx 1771 nobody 19u IPv4 10868331 0t0 TCP localhost:44708->localhost:https (ESTABLISHED)
        nginx 1771 nobody 20u IPv4 10868434 0t0 TCP localhost:https->localhost:44708 (ESTABLISHED)

    Increasing the number of files that can be opened is no help, because then nginx just blows right past that limit. And no wonder: it looks like it's in some kind of loop pulling in all available file handles. Any idea what's going on, and how to fix it?
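
    The paired connections above (each local port appears both as source and destination of localhost:https) suggest nginx may be proxying requests back to its own https listener, so every request spawns two more sockets until the descriptor limit is hit. A minimal sketch of what to double-check in nginx.conf, assuming Sinatra listens on its default port 4567 (the port and upstream name here are placeholders, not taken from the question):

        # The upstream must point at the app server, not back at nginx's
        # own 443 listener, or each request re-enters nginx in a loop.
        upstream sinatra_app {
            server 127.0.0.1:4567;   # Sinatra/Mongrel port (assumed)
        }
        server {
            listen 443;
            ssl on;                  # certificate directives omitted
            location / {
                proxy_pass http://sinatra_app;
            }
        }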

    Read the article

  • Joomla 1.5 Media Manager sets incorrect file permissions when uploading

    - by Scott Mayfield
    Howdy all, I have a Joomla 1.5 installation running on Windows Server 2008, installed via the Web Platform Installer. When uploading images with the media manager (native uploader, not the flash bulk uploader), the files arrive on the server correctly, but are given incorrect permissions. Specifically, the IIS_IUSRS group is not given access to the file. I might be incorrect about what group/user is SUPPOSED to get access to the files, but so far, I've found that unless I give IIS_IUSRS access to the uploaded files, they won't appear on the site or in the media manager (they appear as broken images). Once I give IIS_IUSRS permission to the files, they work fine. So far, all the research I've done has led me to Linux-specific fixes that involve either changing the umask on the server or directly modifying the Joomla codebase to add an appropriate chmod command to the upload process, but I really don't want to modify Joomla directly. I have to believe there's a setting here somewhere that will do the job, either on the Joomla or Windows side of the equation. Any thoughts? Scott
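
    A hedged angle on this: on NTFS a file that is moved (rather than copied) keeps its original ACL instead of inheriting from the destination, and PHP moves uploads out of its temp directory, so granting IIS_IUSRS rights with inheritance on the PHP upload_tmp_dir, or on the Joomla images tree, may avoid touching Joomla's code. An illustrative icacls command (the path is a placeholder):

        icacls "C:\inetpub\wwwroot\joomla\images" /grant "BUILTIN\IIS_IUSRS:(OI)(CI)M"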

    Read the article

  • Creating a tar file with checksums included

    - by wazoox
    Here's my problem: I need to archive a lot of big files (up to 60 TB total, usually 30 to 40 GB each) to tar files. I would like to make checksums (md5, sha1, whatever) of these files before archiving; however, not reading every file twice (once for checksumming, once for tar'ing) is more or less a necessity to achieve a very high archiving performance (LTO-4 wants 120 MB/s sustained, and the backup window is limited). So I'd need some way to read a file, feeding a checksumming tool on one side, and building a tar to tape on the other side, something along the lines of:

        tar cf - files | tee tarfile.tar | md5sum -

    Except that I don't want the checksum of the whole archive (this sample shell code does just that) but a checksum for each individual file in the archive. I've studied the GNU tar, Pax, and Star options. I've looked at the source from Archive::Tar. I see no obvious way to achieve this. It looks like I'll have to hand-build something in C or similar to achieve what I need. Perl/Python/etc. simply won't cut it performance-wise, and the various tar programs miss the necessary "plugin architecture". Does anyone know of any existing solution to this before I start code-churning?
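
    One hedged approach that reads each file from disk only once: tee the archive stream, and have a second tar re-parse the copy, piping every member through md5sum. This relies on GNU tar's --to-command option, which runs a command per extracted member with the data on stdin and the member name in $TAR_FILENAME, plus bash process substitution; treat it as a sketch to benchmark, not a guaranteed 120 MB/s solution:

        #!/bin/sh
        # per-file-md5.sh: called by tar --to-command once per member;
        # member contents arrive on stdin, name in $TAR_FILENAME.
        sum=$(md5sum | cut -d' ' -f1)
        echo "$sum  $TAR_FILENAME" >> /tmp/archive.md5

        # Stream to tape while checksumming each member from the tee'd
        # copy (bash required for the >( ) process substitution):
        tar cf - files/ | tee >(tar xf - --to-command=./per-file-md5.sh) > /dev/st0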

    Read the article

  • cloning a kvm guest os to a vmdk file

    - by Bond
    I have a production environment where I am running 4 guest OSes on an Ubuntu server that uses KVM. These OSes are in an LVM-based setup. I want these virtual machines to be in vmdk format as well, so that people can do experiments with them in a VMware environment (or it can be Xen too), separate from the KVM server. I would not have any control over that other environment, so I want to give people vmdk images of these virtual machines. The production virtual machines will keep running on the KVM server, but the VMs on which experiments would be done would be vmdk (vmdk is a constraint). Here is the output of lvscan:

        ACTIVE '/dev/abcd/lvm1' [100.00 GiB] inherit
        ACTIVE '/dev/abcd/lvm2' [150.00 GiB] inherit
        ACTIVE '/dev/abcd/lvm3' [50.00 GiB] inherit
        ACTIVE '/dev/abcd/lvm4' [100.00 GiB] inherit

    I was reading the man page of qemu-img, and what I understand is that I need to first create a qcow image file, populate it, and then convert that to a vmdk file. Is that understanding correct? Now suppose /dev/abcd/lvm4 is the virtual machine I am going to start this experiment with. I can shut down the production VMs for some time to do this. So is the following the correct way to go on server 1 (where KVM is running):

        qemu-img convert -c -f raw -O vmdk /dev/abcd/lvm4 /backup/lvm4.img

    or will it affect lvm4 on KVM server 1? I do not want the VM running on the original server to lose any of its content, but I also want a vmdk file for each of the guest OSes on KVM. Before I proceed with any of the above on the production machine, I just want to make sure that I am doing the correct thing, so I am asking here.
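
    For what it's worth, qemu-img convert only reads its source; the logical volume is not modified, so the main risk is converting a volume while the guest is still writing to it. A minimal sketch, assuming the guest is shut down first (the guest name, .vmdk extension, and paths below are illustrative, not from the question), and no intermediate qcow step is needed:

        # Shut the guest down so the LV is quiescent, then convert
        # directly from the raw LV to vmdk:
        virsh shutdown lvm4-guest      # guest name is a placeholder
        qemu-img convert -f raw -O vmdk /dev/abcd/lvm4 /backup/lvm4.vmdk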

    Read the article

  • What does it mean for a file name to be shown with a red background

    - by user56614
    I'm trying to install the Cisco VPN client on Ubuntu Linux 10.04. The installer creates the directory, places all the necessary files in it, and then fails to launch the binary. I tried to launch it myself, and the system rebukes me too. Closer inspection yields the following:

        eugene@eugene-desktop:/opt/cisco/vpn/bin$ sudo chmod u+x vpnagentd
        eugene@eugene-desktop:/opt/cisco/vpn/bin$ ls -la
        total 5124
        drwxr-xr-x 2 root root    4096 2010-10-23 11:51 .
        drwxr-xr-x 6 root root    4096 2010-10-23 11:51 ..
        -rwxr-xr-x 1 root root 1607236 2010-10-23 11:51 vpn
        -rwsr-xr-x 1 root root 1204692 2010-10-23 11:51 vpnagentd
        -r--r--r-- 1 root root  697380 2010-10-23 11:51 vpndownloader.sh
        -rwxr-xr-x 1 root root 1712708 2010-10-23 11:51 vpnui
        -rwxr-xr-x 1 root root    3654 2010-10-23 11:51 vpn_uninstall.sh
        eugene@eugene-desktop:/opt/cisco/vpn/bin$ ./vpnagentd
        bash: ./vpnagentd: No such file or directory
        eugene@eugene-desktop:/opt/cisco/vpn/bin$ sudo ./vpnagentd
        sudo: unable to execute ./vpnagentd: No such file or directory

    The file name "vpnagentd" is shown in white letters on a red background. The other three executables are in green letters on a black background, as expected. Any ideas?
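
    Two hedged observations, based on default Ubuntu behavior rather than anything Cisco-specific: white-on-red is the default dircolors style for setuid files (note the "s" in -rwsr-xr-x), so the color itself is expected and harmless; and "No such file or directory" for a binary that clearly exists usually means its ELF interpreter is missing, e.g. a 32-bit executable on a 64-bit install without 32-bit libraries (ia32-libs on 10.04):

        # SETUID entries render as 37;41 (white on red) by default:
        dircolors -p | grep SETUID
        # Check whether the binary is 32-bit; if so, the missing piece is
        # likely the 32-bit loader/libraries, not the binary itself:
        file vpnagentd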

    Read the article

  • Distributed File Systems.

    - by GruffTech
    So, I've been reading several articles around Server Fault as well as Google (for example, this link). My requirements are very similar to the link above; however, I'd also like to have dynamic, or at least resizeable, file volumes, so that if necessary I can add 4-5 servers to the pool and then expand the volume. Are there any distributed file systems that support that, to save me some time? Thanks! LustreFS will be my next test cluster to build. GlusterFS: I've built a 3-machine test GlusterFS cluster; however, I quickly became aware of several limitations that it doesn't seem to make clearly public. For one, I can't seem to resize a volume: once a volume is created, it's done. That seems absurd: why have a fully scalable file system if I can't scale a volume? So maybe I'm doing something wrong; I'm not sure. Amazon S3, while giving the cheapest startup, adds too much cost when broken down to per client per month, so it's out; building my own system, prorated over several years with no bandwidth costs, is significantly cheaper. MogileFS isn't an option, as we'd like this server to be a SAN replacement for storing tons of media from a multitude of systems, which for us means it needs to be POSIX-compliant so it can be remotely mounted via NFS or CIFS.
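
    On the GlusterFS point, a hedged note: newer GlusterFS releases do let you grow a volume in place by adding bricks and rebalancing, so the "once created, it's done" limitation may be version-specific rather than fundamental. A minimal sketch of the commands (volume and brick names are placeholders, and replicated volumes need bricks added in multiples of the replica count):

        # Add a brick from a new server, then spread existing data
        # across the enlarged volume:
        gluster volume add-brick myvol server5:/export/brick1
        gluster volume rebalance myvol start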

    Read the article

  • Xen domU passwd file overwritten with console log output

    - by malfy
    I was setting up a Debian Xen domU and, after booting it fine, I added basic configuration to /etc/network/interfaces and ran /etc/init.d/networking restart. This failed, so I decided to reboot. After the reboot I also ran xm shutdown box. When dropped to a shell prompt it wouldn't let me log in. Upon further inspection, I now have garbage in some critical files in /etc:

        root@box:/# tail +1 mnt/etc/{passwd-,shadow}
        tail: cannot open `+1' for reading: No such file or directory
        ==> mnt/etc/passwd- <==
        0000000000100000 (reserved)
        Nov 23 02:02:39 box kernel: [ 0.000000] Xen: 0000000000100000 - 0000000004000000 (usable)
        Nov 23 02:02:39 box kernel: [ 0.000000] DMI not present or invalid.
        Nov 23 02:02:39 box kernel: [ 0.000000] last_pfn = 0x4000 max_arch_pfn = 0x1000000
        Nov 23 02:02:39 box kernel: [ 0.000000] initial memory mapped : 0 - 033ff000
        Nov 23 02:02:39 box kernel: [ 0.000000] init_memory_mapping: 0000000000000000-0000000004000000
        Nov 23 02:02:39 box kernel: [ 0.000000] NX (Execute Disable) protection: active
        Nov 23 02:02:39 box kernel: [ 0.000000] 0000000000 - 0004000000 page 4k
        Nov 23 02:02:39 box kernel: [ 0.000000] kernel direct mapping tables up to 4000000 @ 7000-2c000
        Nov 23 02:02:3
        ==> mnt/etc/shadow <==
        32 nr_cpumask_bits:32 nr_cpu_ids:1 nr_node_ids:1
        Nov 23 02:02:39 box kernel: [ 0.000000] PERCPU: Embedded 15 pages/cpu @c15b0000 s37688 r0 d23752 u65536
        Nov 23 02:02:39 box kernel: [ 0.000000] pcpu-alloc: s37688 r0 d23752 u65536 alloc=16*4096
        Nov 23 02:02:39 box kernel: [ 0.000000] pcpu-alloc: [0] 0
        Nov 23 02:02:39 box kernel: [ 0.000000] Xen: using vcpu_info placement
        Nov 23 02:02:39 box kernel: [ 0.000000] Built 1 zonelists in Zone order, mobility grouping on. Total pages: 16160
        Nov 23 02:02:39 box kernel: [ 0.000000] Kernel command line: root=/dev/mapper/xen-guest_root ro quiet root=/dev/xvda1 ro
        Nov 23 02:02:39 box kernel: [ 0.000000] PID hash table entries:

    The garbage is also present in the passwd file and the group file (although I didn't paste that above, since I have since run debootstrap on the filesystem again). Does anyone have any insight into what happened and why?

    Read the article

  • AVCHD MTS h264 1080p file with choppy playback in Linux

    - by marc
    When I'm trying to play video files from my camera:

        Seems stream 0 codec frame rate differs from container frame rate: 50.00 (50/1) -> 50.00 (50/1)
        Input #0, mpegts, from '00027.MTS':
          Duration: 00:00:38.88, start: 2.884289, bitrate: 16945 kb/s
          Program 1
            Stream #0.0[0x1011]: Video: h264 (High), yuv420p, 1920x1080 [PAR 1:1 DAR 16:9], 50 fps, 50 tbr, 90k tbn, 50 tbc
            Stream #0.1[0x1100]: Audio: ac3, 48000 Hz, stereo, s16, 256 kb/s

    ... on my Linux computer (Ubuntu 12.04), I get choppy playback. It's completely unusable. I tried Totem, VLC, and mplayer; the result is always the same issue. I sent the same video file to a friend with Ubuntu 10.04 to test, and he has the same issue; he also has Windows 7 and confirms that on Windows the video plays well. I have an Intel® Core™2 CPU 6300 @ 1.86GHz × 2 with a GF 9600 GT, using the closed NVIDIA drivers. This is not an issue of big files playing slowly from an HDD: I have an SSD drive! I spent the last days and nights trying hundreds of commands for ffmpeg, handbrake, and mencoder; none of them let me create a file with enough quality. I downloaded a few movies from YouTube in 1080p, and playback worked well without any big pixels or choppiness. I would like the highest possible quality; I will put these files onto a Blu-ray disc, so I don't need to compress them to a smaller size. I just want smooth playback on my Linux box. On Windows, the same file works well.
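
    A hedged suggestion rather than a confirmed fix: 1080p50 H.264 decoding in software is heavy for a Core 2 at 1.86 GHz, but the GeForce 9600 GT can decode H.264 in hardware through VDPAU with the closed NVIDIA driver, which often turns exactly this kind of choppy AVCHD playback smooth without any re-encoding. A minimal sketch with mplayer:

        # Offload H.264 decode to the GPU via VDPAU (NVIDIA proprietary
        # driver required); the .MTS file is played as-is:
        mplayer -vo vdpau -vc ffh264vdpau 00027.MTS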

    Read the article

  • Best Amazon S3 File Manager Utility?

    - by mmacaulay
    Ever since I started using Amazon's S3 service I've been struggling to find a good solution for simple file management, without having to write my own app to browse my buckets, upload and delete files, etc. The best I've found so far to do this is S3Fox. But it's far from perfect: it has problems deleting files and folders. Comments on the Firefox plugin page indicate I'm not the only person with this problem, and the developer does not respond to emails. I've looked around briefly, but couldn't find anything that looked any better than S3Fox. Please tell me there's a better way! Edit (07/26/2009): The Firefox S3Fox extension seems to be getting love from its developer again; the problems I was having before have gone away, and I'm using it on a regular basis now with no problems!
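
    Even with a working GUI client, a hedged command-line fallback: the s3cmd utility covers the same basic browse/upload/delete operations without depending on a browser plugin (bucket and key names below are placeholders):

        s3cmd ls s3://my-bucket/                      # browse a bucket
        s3cmd put report.pdf s3://my-bucket/docs/     # upload a file
        s3cmd del s3://my-bucket/docs/report.pdf      # delete a file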

    Read the article

  • CentOS / Redhat: Give file permission for apache and vsftp

    - by paskster
    I use CentOS 5.5 and the Apache web server on my dedicated server. My folder "/var/www/myWebApp" is owned by apache, so that Apache can read it, write logs, etc. But now I would like to use very secure FTP (vsftpd) to upload my new files. I used to give every user rwx access to "/var/www/myWebApp", but I guess this is way too insecure. On CentOS I created another user, "ftpuser", for uploading files, which has "/var/www/myWebApp" as its home directory. How can I give him permission to write into "/var/www/myWebApp" without giving every user the same rights?
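
    A minimal sketch of the usual group-based answer, assuming only apache and ftpuser need write access (the group name "www" is a placeholder): put both accounts in a shared group, give that group write access, and drop the world-writable bits.

        groupadd www
        usermod -aG www apache
        usermod -aG www ftpuser
        chown -R apache:www /var/www/myWebApp
        chmod -R 775 /var/www/myWebApp
        # setgid on directories keeps newly uploaded files in the www group:
        find /var/www/myWebApp -type d -exec chmod g+s {} \;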

    Read the article

  • Accessing a website's directory in IIS from File Zilla

    - by Cdeez
    I have my ASP.NET website deployed in a virtual directory in IIS. Usually FTP software like FileZilla is used to upload files to a website's directory from a remote system; FileZilla asks for a host name, username, and password to connect to the remote server. Now all I want is for my users on the LAN to be able to access this directory from their systems using FTP software like FileZilla. So how can I provide the host name, username, and password for my website's directory? I tried to find it on Google, but no help. Detailed steps please. It's IIS version 5.1.

    Read the article

  • VSFTPD - FTP over TLS - Upload stops after exactly 82k?

    - by Redsandro
    I installed a vsftpd daemon on a CentOS server, using an RSA certificate for logging in over explicit TLS. Now, I cannot upload more than 82k. With files under that limit, there is no problem; the FTP works like a charm. But as soon as a file reaches 82k with FileZilla (81,952 bytes to be exact), the transfer will stop, and the FTP client hangs until the timeout is reached.

    FTP client console:

        15:10:21 Command: STOR jquery-1.7.2.min.js
        15:10:21 Response: 150 Ok to send data.
        15:11:21 Error: Connection timed out
        15:11:21 Error: File transfer failed after transferring 82 KB in 60 seconds

    /var/log/vsftpd.log:

        FTP command: Client "x.x.x.x", "STOR jquery-1.7.2.min.js"
        FTP response: Client "x.x.x.x", "150 Ok to send data."
        OK UPLOAD: Client "x.x.x.x", "jquery-1.7.2.min.js", 81952 bytes, 1.32Kbyte/sec
        FTP response: Client "x.x.x.x", "226 File receive OK."  // NOT okay, file is bigger
        // No mention of error here

    I cannot find relevant info about this problem, apart from a possible problem with trans_chunk_size (not mentioned in the default config), but I tried different sizes and it had no impact on the problem:

        trans_chunk_size=4096
        trans_chunk_size=8192
        trans_chunk_size=9999

    Of course, after every configuration change I restarted the server with /etc/init.d/vsftpd restart. What else can cause this? It's not the latest version, but it's the latest update within the repositories that has been deemed fit for enterprise usage. Package info:

        $ yum info vsftpd
        Loaded plugins: fastestmirror
        Installed Packages
        Name       : vsftpd
        Arch       : x86_64
        Version    : 2.0.5
        Release    : 24.el5_8.1
        Size       : 286 k
        Repo       : installed
        Summary    : vsftpd - Very Secure Ftp Daemon
        URL        : http://vsftpd.beasts.org/
        License    : GPL
        Description: vsftpd is a Very Secure FTP daemon. It was written completely from scratch.
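
    One hedged lead: with explicit TLS, the kernel's FTP connection-tracking helper can no longer read the (now encrypted) PASV replies, so stateful firewalls often let the data connection start and then stall once the initial TCP buffers drain; a cutoff in the low-80-KB range is a classic symptom. A sketch of the usual workaround, assuming iptables is in play (the port range is arbitrary):

        # vsftpd.conf: pin the passive data ports to a known range
        pasv_enable=YES
        pasv_min_port=50000
        pasv_max_port=50100

        # then open that range explicitly, since ip_conntrack_ftp
        # cannot follow TLS-encrypted control traffic:
        iptables -A INPUT -p tcp --dport 50000:50100 -j ACCEPT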

    Read the article

  • Windows Server 2003 Exchange OWA email file attachment relogin

    - by ton
    Hello guys, I have an Exchange server running, but there is a problem. When I use Outlook Web Access and click New Email, then click on the paperclip (to add an attachment), I select a file and then click Add to upload it. But then my Outlook Web Access hangs: it stays loading for 10 or 20 seconds, and then it asks for my login again, my username and password. When I fill in my username and password again, the same thing happens: it stays loading for 10 or 20 seconds and then asks again for my username and password. Can somebody help me with this, please?

    Read the article

  • Not able to delete file from server with permissions of 644 via PHP script

    - by letseatfood
    I am trying to delete JPEG files that were uploaded to the server via FTP. The files are uploaded and written with permissions of 644. The owner and group of the upload directory are mike and mike. I have tried changing the owner and group to www-data, but that does not seem to work. I am trying to delete the files with a PHP script using unlink(). This works on the production server (which is a hosting service), but not my development server, which is a LAMP setup. This leads me to believe it has something to do with permissions on my development server.
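
    A hedged pointer rather than a confirmed diagnosis: on Linux, unlink() needs write permission on the containing directory, not on the file itself, so 644 files owned by mike can't be removed by the PHP/Apache user (www-data on a stock Debian/Ubuntu LAMP setup) unless that user can write to the directory. A minimal sketch (the path is a placeholder):

        # Let the web server user delete entries in the upload directory:
        sudo chgrp www-data /var/www/uploads
        sudo chmod g+w /var/www/uploads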

    Read the article

  • File replication among EC2 instances

    - by Peuge
    I am pretty new to AWS, so please excuse my ignorance. We want a setup with a SQL DB instance plus web server instances. However, we would like the web servers to sit behind an ELB, thus allowing us to use autoscaling. My question, however, is how do we replicate the web app across instances? Say, for example, we have two web servers running and we need to make a critical update to the web app; ultimately we would only want to upload to one place, not to each instance. Is it even best practice to store your web app on the instance, or are there better ways to store and share the app between instances?
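
    One hedged pattern for this: treat instances as disposable and keep the app in a single artifact store (S3, or a baked AMI), pulling it at boot so autoscaled instances come up identical. A minimal user-data sketch, assuming the AWS CLI is present on the image (the bucket name and paths are placeholders):

        #!/bin/bash
        # Runs at instance launch: fetch the current app build from S3
        # so every autoscaled instance serves the same code.
        aws s3 sync s3://my-app-artifacts/current/ /var/www/app/
        service httpd restart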

    Read the article

  • Workflow: suggest a versioning and file control for Designer and Developer

    - by Pennf0lio
    Our company is having a hard time managing project files and versions of PSD, HTML, PHP, and CSS files. Can anyone recommend good software or a workflow to handle files and versions? Here's my common scenario: I work on a project on my computer; it could be a website mockup or a coding project. I save all the files locally on my workstation, then upload all the project files to the server on our network to have a backup. In my files, I usually append "r1" for revisions, like "WebsiteMockup_r1" or "WebsiteMockup_r2". I need some way to synchronize all my local files to the server and have some versioning options.
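
    A hedged suggestion: a version-control system replaces the "_r1"/"_r2" suffixes and the manual upload in one step; Git is a common choice and copes with binary PSDs (repository size permitting), with a bare repository on the office server standing in for the backup copy. A minimal sketch (the server path is a placeholder):

        git init
        git add WebsiteMockup.psd index.html style.css
        git commit -m "initial mockup"
        # a central copy on the office server doubles as the backup:
        git remote add origin ssh://server/srv/git/project.git
        git push -u origin master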

    Read the article

  • web.config file changings guide

    - by Student
    Hi experts, how are you all? I am a student learning ASP.NET/C# with Visual Studio 2010, using SQL Server 2005. I have developed a website with a database, teaching myself with help from the internet. The website is complete and works perfectly on my computer. I already have a hosting server and a registered domain name. The problem is that when I upload my website, it doesn't work there; the following error displays:

        Server Error in '/' Application.
        Configuration Error
        Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately.
        Parser Error Message: Unrecognized attribute 'targetFramework'. Note that attribute names are case-sensitive.

        Source Error:
        Line 11:   <system.web>
        Line 12:     <customErrors mode="Off" />
        Line 13:     <compilation debug="false" targetFramework="4.0"/>
        Line 14:   </system.web>
        Line 15: </configuration>

        Source File: C:\Inetpub\vhosts\urdureport.com\httpdocs\web.config    Line: 13
        Version Information: Microsoft .NET Framework Version:2.0.50727.5472; ASP.NET Version:2.0.50727.5474

    I don't know what I should do to get it working on the hosting server. Please help me in this regard: what should I do? Thank you in advance.
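
    A hedged reading of the error: the Version Information line shows the host is running the site under .NET 2.0, and targetFramework is a .NET 4.0-only web.config attribute, hence "Unrecognized attribute". Either ask the host to switch the application pool to .NET 4.0, or, if the site can run on 2.0/3.5, drop the attribute; a sketch of the latter:

        <system.web>
          <customErrors mode="Off" />
          <!-- targetFramework removed: only valid on .NET 4.0 app pools -->
          <compilation debug="false" />
        </system.web>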

    Read the article

  • How to import a text file into powershell and email it, formatted as HTML

    - by Don
    I'm trying to get a list of all Exchange accounts, format them in descending order from largest mailbox, and put that data into an email in HTML format to email to myself. So far I can get the data, push it to a text file, and create and send an email to myself; I just can't seem to put it all together. I've been trying to use ConvertTo-Html, but it just seems to return data via email like "pageFooterEntry" and "Microsoft.PowerShell.Commands.Internal.Format.AutosizeInfo" versus the actual data. I can get it to send me the right data if I don't tell it to ConvertTo-Html, just have it pipe the data to a text file and pull from it, but it's all run together with no formatting. I don't need to save the file; I'd just like to run the command, get the data, put it in HTML, and mail it to myself. Here's what I have currently:

        #Connects to Database and returns information on all users, organized by Total Item Size, User
        $body = Get-MailboxStatistics -database "Mailbox Database 0846468905" | where {$_.ObjectClass -eq "Mailbox"} | Sort-Object TotalItemSize -Descending | ft @{label="User";expression={$_.DisplayName}},@{label="Total Size (MB)";expression={$_.TotalItemSize.Value.ToMB()}} -auto | ConvertTo-Html

        #Pause for 5 seconds for Exchange
        write-host -foregroundcolor Green "Pausing for 5 seconds for Exchange"
        Start-Sleep -s 5

        $toemail = "[email protected]" # Emails report to this address.
        $fromemail = "[email protected]" #Emails from this address.
        $server = "Exchange.company.com" #Exchange server - SMTP.

        #Email the report.
        $email = New-Object System.Net.Mail.MailMessage
        $email.IsBodyHtml = $True
        $email.To.Add($toemail)
        $email.From = $fromemail
        $email.Subject = "Exchange Mailbox Sizes"
        $email.Body = $body
        $client = New-Object System.Net.Mail.SmtpClient $server
        $client.UseDefaultCredentials = $true
        $client.Send($email)

    Any thoughts would be helpful, thanks!
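
    A hedged sketch of the likely culprit: Format-Table (ft) emits formatting records, not data objects, which is where "Microsoft.PowerShell.Commands.Internal.Format.AutosizeInfo" comes from; ConvertTo-Html wants property-bearing objects, and it returns an array of strings that should be flattened before assigning to $email.Body. Replacing the first line along these lines may be all that's needed:

        # Select-Object instead of ft, and Out-String to get one HTML string:
        $body = Get-MailboxStatistics -Database "Mailbox Database 0846468905" |
            Where-Object { $_.ObjectClass -eq "Mailbox" } |
            Sort-Object TotalItemSize -Descending |
            Select-Object @{n="User";e={$_.DisplayName}},
                          @{n="Total Size (MB)";e={$_.TotalItemSize.Value.ToMB()}} |
            ConvertTo-Html | Out-String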

    Read the article

  • How to extract jpegs from a video file using ffmpeg

    - by Andrew Simpson
    I am using C# and ffmpeg. In this scenario I have 279 individual JPEGs, and I have used ffmpeg to create an AVI file from these images on my client. CMD line:

        -f image2 -r 10 -i "C:\000EC902F17F\img%05d.jpg" -s 352x288 -y "C:\1\test.avi"

    I then upload to my server and extract JPEGs from the AVI file. CMD line:

        -i c:\1\1.avi c:\1\img-%05d.jpg

    I get 265 JPEGs back, so ffmpeg is most probably dropping frames, presumably when the AVI is first created. Is there a way to force it to encode using ALL the images I have? Thanks. PS: I did not specify any command-line options other than the size of the video output. As far as I am aware, if none are specified, then ffmpeg automatically chooses the best ones?
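
    A hedged guess at the mechanics: the AVI was written at -r 10, but on extraction ffmpeg resamples to the output frame rate unless told otherwise, which can silently drop or duplicate frames. Forcing passthrough frame handling (or pinning the rate to match the encode side) is worth trying:

        # -vsync 0 = passthrough: emit exactly one JPEG per decoded frame
        ffmpeg -i c:\1\1.avi -vsync 0 c:\1\img-%05d.jpg
        # alternatively, match the extraction rate to the encode rate:
        ffmpeg -i c:\1\1.avi -r 10 c:\1\img-%05d.jpg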

    Read the article

  • User-unique .vimrc file for servers as root user

    - by Scott
    I'm getting thrown into an IDE war at the office, where multiple users have root access on our servers and like to have everything their own way with vim. Unfortunately, our servers are locked down enough that if you want to do anything, you need root access. Although this is obviously frowned upon, we get tired of typing sudo before each command, which would require constantly typing in the wonderfully complex passwords mandated on us, so naturally we all just execute sudo su - upon login to avoid all of this. Of course, when it comes to vim and custom .vimrc files, we often end up stepping on someone else's custom .vimrc, and some of these files have whacked-out functionality that may override behavior other users know nothing about, much less have the patience to learn. When working as root on a Linux box, is there any way for all of us to maintain our own .vimrc files without overwriting the file over and over again every time someone wants to use vim? Ideally, we have many virtual machines, all with vim installed, so a universal solution across all servers would be best, and we do have our Microsoft Windows user-specific home directories mounted on the servers under /home/username. Any recommendations for accommodating this?
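
    One hedged approach, assuming logname still reports the invoking user after sudo su - (it reads the login session, not the effective uid): have root's shell profile point vim at the invoking user's vimrc on the mounted home directories, e.g. in root's .bashrc:

        # Use the invoking user's vimrc instead of a shared root one;
        # vim falls back to defaults if the file doesn't exist.
        export VIMINIT="source /home/$(logname)/.vimrc"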

    Read the article

  • Command or tool to display list of connections to a Windows file share

    - by BizTalkMama
    Is there a Windows command or tool that can tell me what users or computers are connected to a Windows fileshare? Here's why I'm looking for this: I've run into issues in the past where our deployment team has deployed BizTalk applications to one of our environments using the wrong bindings, leaving us with two receive locations pointing to the same file share (i.e. both dev and test servers point to dev receive location uri). When this occurs, the two environments in question tend to take turns processing the files received (meaning if I am attempting to debug something in one environment and the other environment has picked the file up, it looks as if my test file has disappeared into thin air). We have several different environments, plus individual developer machines, and I'd rather not have to check each individually to find the culprit. I'm looking for a quick way to detect what locations are connected to the share once I notice my test files vanishing. If I can determine the connections that are invalid, I can go directly to the person responsible for that environment and avoid the time it takes to randomly ask around. Or if the connections appear to be correct, I can go directly to troubleshooting where in the process the message gets lost. Any suggestions?
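
    Two hedged built-in candidates, run on the server that hosts the share: net session lists the computers and users currently connected to the server's shares, and net file lists which shared files they have open, which should be enough to identify the environment grabbing the test files.

        net session
        net file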

    Read the article

  • Subversion COPY/MOVE - File not found: transaction 'XXX-XX'

    - by theplatz
    I'm attempting to create a branch in one of my Subversion repositories and keep running into an error. No matter what is done, I keep getting the following:

        File not found: transaction '3062-2e6', path '/Software/XXXXXX/branches/testbranch'

    I've noticed that the first part of the transaction ID in these messages is the last successfully committed revision in the repository. My Apache logs don't give much more information:

        [Wed Nov 24 14:10:38 2010] [error] [client x.x.x.x] Could not MOVE/COPY /svn/p070361/!svn/bc/3049/Software/SXXXXXX/trunk. [404, #0]
        [Wed Nov 24 14:10:38 2010] [error] [client x.x.x.x] Unable to make a filesystem copy. [404, #160013]
        [Wed Nov 24 14:10:38 2010] [error] [client x.x.x.x] File not found: transaction '3059-2e2', path '/Software/XXXXXX/branches/testbranch' [404, #160013]

    This is all happening on a server with an nginx frontend that proxies to Apache for the Subversion bits. Other repositories are able to branch fine, and I was able to create the branch using file:/// from the command line on the server where this is occurring. The permissions on this repository match every other repository, and disk space is not an issue.
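
    A hedged pointer based on the nginx-in-front-of-Apache detail: WebDAV COPY/MOVE requests carry the copy target in a Destination: header, and when nginx terminates SSL that header still names https:// while Apache serves plain http, so mod_dav_svn can't resolve the source and reports the transaction as not found. A sketch of the usual header rewrite in the nginx proxy block (the port and location are placeholders):

        location /svn/ {
            set $fixed_destination $http_destination;
            # downgrade the scheme to what the Apache backend expects:
            if ($http_destination ~ "^https://(.*)$") {
                set $fixed_destination http://$1;
            }
            proxy_set_header Destination $fixed_destination;
            proxy_pass http://127.0.0.1:8080;
        }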

    Read the article
