Search Results

Search found 77950 results on 3118 pages for 'large file upload'.

  • Restoring permissions on Windows 2008

    - by Andrey
    I have played with folder permissions because SVN could not write to a folder, and now I'm in a state where, when I go to any folder on the C: drive in Windows Explorer and right-click, it takes 30 seconds to show the context menu and then the window just hangs. It definitely has something to do with permissions, as everything was working fine until I started tweaking permissions about an hour ago. My login belongs to two groups, Users and Administrators. I changed ownership of the C: drive to the Administrators group, and I think that screwed everything up, but I can't change it back because I don't even remember what it was :) Oh, and only the Administrators group has access to drive C: now. Is there any way to reset permissions to some previous state, or at least some workable state?
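
    A hedged sketch of one way back to a workable state: icacls, which ships with Windows Server 2008, can reset a tree to the ACLs it would inherit from its parent. The folder name below is hypothetical, and this restores inherited defaults, not whatever custom ACLs existed before:

        icacls "C:\SomeTweakedFolder" /reset /T /C /L

    /reset replaces explicit ACLs with inherited ones, /T recurses, /C continues past errors, and /L acts on symbolic links themselves rather than their targets.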

  • chmod -R 777 /. - RHEL 5.5

    - by user1263746
    A shell script test went bad: it issued chmod -R 777 /. against the system instead of chmod -R 777 ./ and, as expected, wiped out the critical permission metadata. We have turned the system off, and it will not function properly the next time it is turned on. I am told that

        rpm --setperms -a
        rpm --setugids -a

    should at least fix the permissions of the packages maintained by rpm. Is it worth doing? And is there any script available that will copy the permissions from an identical system, to at least get the box working? The box is running RHEL 5.5. Thanks!
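
    For the second part of the question, a minimal sketch of copying permissions from an identical box, assuming getfacl/setfacl (the acl package) are available on both and the filesystem layouts really do match; this mainly helps with files rpm does not track:

        # on the healthy reference system: record owner, group and mode for everything
        getfacl -R --absolute-names / > /tmp/perms.acl

        # on the damaged system, after copying perms.acl over
        setfacl --restore=/tmp/perms.acl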

  • Is it possible to have nested libraries in Windows 7?

    - by dr_draik
    My goal is this: I have a library, say it's called Series. I store my series in two different places, one for watched episodes and one for unwatched episodes. Obviously I can simply add the root folder of each location to a Series library. What I would prefer to do is have a sub-library within Series for each series, for example:

        Series \ Lost
            Lost (Unwatched series)
                Episode 3
                Episode 4
            Lost (Watched series)
                Episode 1
                Episode 2

    Is there a way to achieve this, or something approximating this (without having a full library for each series)? P.S. I've read the other topic, but I was wondering if there was a possible workaround for this specific need. More out of hope than anything else. ;)

  • chrooted sftp user with write permissions to /var/www

    - by matthew
    I am getting confused about this setup that I am trying to deploy; I hope some of you folks can lend me a hand. Much appreciated.

    Background info: the server is Debian 6.0, ext3, with Apache2/SSL and Nginx in front as a reverse proxy. I need to provide sftp access to the Apache root directory (/var/www), making sure that the sftp user is chrooted to that path with rwx permissions, all without modifying any default permissions in /var/www:

        drwxr-xr-x 9 root root 4096 Nov  4 22:46 www

    Inside /var/www:

        -rw-r----- 1 www-data www-data     177 Mar 11  2012 file1
        drwxr-x--- 6 www-data www-data    4096 Sep 10  2012 dir1
        drwxr-xr-x 7 www-data www-data    4096 Sep 28  2012 dir2
        -rw------- 1 root     root          19 Apr  6  2012 file2
        -rw------- 1 root     root     3548528 Sep 28  2012 file3
        drwxr-x--- 6 www-data www-data    4096 Aug 22 00:11 dir3
        drwxr-x--- 5 www-data www-data    4096 Jul 15  2012 dir4
        drwxr-x--- 2 www-data www-data  536576 Nov 24  2012 dir5
        drwxr-x--- 2 www-data www-data    4096 Nov  5 00:00 dir6
        drwxr-x--- 2 www-data www-data    4096 Nov  4 13:24 dir7

    What I have tried: I created a new group secureftp, and a new sftp user joined to the secureftp and www-data groups, with a nologin shell and home directory /. Then I edited sshd_config with:

        Subsystem sftp internal-sftp
        AllowTcpForwarding no
        Match Group secureftp
            ChrootDirectory /var/www
            ForceCommand internal-sftp

    I can log in with the sftp user and list files, but no write action is allowed. The sftp user is in the www-data group, but the permissions in /var/www are read/read+execute for the group bit, so it doesn't work. I've also tried ACLs, but as I apply ACL rwx permissions for the sftp user to /var/www (dirs and files, recursively), the unix permissions change as well, which is what I don't want.

    What can I do here? I was thinking I could enable the user www-data itself to log in over sftp, so that it could modify the files/dirs that www-data owns in /var/www, but for some reason I think this would be a stupid move security-wise.
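
    For reference, a sketch of the ACL variant described above, assuming the group is named secureftp. One thing worth knowing: once ACLs exist, ls displays the ACL mask in the group triplet, which makes the unix permissions look changed even though the original group bits are preserved inside the ACL:

        setfacl -R -m g:secureftp:rwX /var/www       # access ACL for existing files/dirs
        setfacl -R -d -m g:secureftp:rwX /var/www    # default ACL so new files inherit it
        getfacl /var/www/dir1                        # shows the real group entry vs. the mask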

  • If I set the Expires HTTP header of a CSS file to 1 year and then modify that file, will it be ignored?

    - by user39511
    I'm using Rails with nginx/Passenger. If I set the Expires HTTP header of a CSS file to 1 year and then modify that file, will the change be ignored by the browser (i.e., it will not request the new version)? Given that Rails adds a different timestamp to each asset, such as foo.css?1270165626, every time I restart the server? This is the config I use right now (nginx/Passenger):

        location ~* \.(ico|css|js|gif|jpe?g|png)(\?[0-9]+)?$ {
            expires max;
            break;
        }
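
    A quick way to see what the browser is actually told (a sketch; host and path are hypothetical):

        curl -I "http://example.com/stylesheets/foo.css?1270165626"
        # inspect the Expires / Cache-Control headers in the response; a changed
        # query string is a different URL to the browser, so a new timestamp
        # bypasses the year-long cache. Note also that nginx location blocks
        # match only the path, never the query string.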

  • Spaces in SETX PATH command

    - by Jeremy Stein
    Suppose my PATH is C:\WINDOWS\system32\;C:\Program Files\Important\ and I run:

        SET NEW_PATH=C:\My\Dir\
        SETX PATH "%PATH%;%NEW_PATH%"

    This results in a path of:

        C:\WINDOWS\system32\;C:\Program Files\Important\;C:\My\Dir"

    Notice the quotation mark at the end of the path: it's as though the backslash at the end of %NEW_PATH% escaped the final quote mark. I need the quotation marks because I have spaces in my path, but I don't want backslashes to be interpreted as escape characters. What's the right way to include my PATH in the call to SETX?
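
    A sketch of one workaround: strip the trailing backslash with cmd's substring syntax before quoting, so it can no longer act as an escape for the closing quote:

        SET NEW_PATH=C:\My\Dir\
        SET NEW_PATH=%NEW_PATH:~0,-1%
        SETX PATH "%PATH%;%NEW_PATH%"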

  • How to copy with cp to include hidden files and hidden directories and their contents?

    - by eleven81
    How can I make cp -r copy absolutely all of the files and directories in a directory? Requirements:

        - Include hidden files and hidden directories.
        - Be one single command, with a flag to include the above.
        - Not rely on pattern matching at all.

    My ugly, but working, hack is:

        cp -r /etc/skel/* /home/user
        cp -r /etc/skel/.[^.]* /home/user

    How can I do this all in one command without the pattern matching? What flag do I need to use?
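
    For what it's worth, a sketch of a single command with no pattern matching: name the directory's own "." entry rather than globbing its contents, and the dotfiles come along for free:

        cp -a /etc/skel/. /home/user/

    -a implies -r and also preserves modes and ownership; plain cp -r /etc/skel/. /home/user works the same way for the hidden-file problem.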

  • Seagate GoFlex NAS + Horrible Speed = Bad Experience

    - by Jon H
    I am having issues with transfer speeds from my desktop PC to my NAS. Both the NAS and my desktop are hooked up to a gigabit gateway with Cat 5e. I see up to 4.0 MB/s transfer rates; the norm is about 2.5 MB/s. There are three partitions on my NAS: Public, Private, and Backup. When I transfer from Private to Public I see the speeds above; if the transfer stays within the same partition, it's almost instant. I was wondering whether the speeds I am seeing are down to my computer or to the NAS. I was looking into building my own media server because of these horrible speeds. Is there anything I can do in the meantime to speed this up?

        Motherboard = M3970AM-HP (Angelica)
        Processor = AMD FX 6100
        RAM = 10 GB PC3-10600
        Hard drive (1) = 1.5 TB SATA 3.0 Gb/s 5400 RPM
        Hard drive (2) = 120 GB SATA SSD
        NAS = Seagate 3 TB GoFlex Home
        Connection (1) = 1000BASE-T
        Connection (2) = Wireless N
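
    One way to separate the network from the NAS as the bottleneck is a raw throughput test between two machines on the same switch; a sketch using iperf, with a hypothetical address:

        iperf -s                  # on one host
        iperf -c 192.168.1.50     # on the other; compare the reported Mbit/s with the
                                  # roughly 20-32 Mbit/s implied by 2.5-4 MB/s transfers

    If the wire tests near gigabit speed, the NAS itself (its CPU or its 5400 RPM disk) is the likelier limit.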

  • Name of log file where boot process is logged

    - by ant2009
    Hello, CentOS 5.3 here. After booting up, I am wondering: what is the name of the log file that records whether all services were successfully loaded or not? For example, when the computer boots you get a list of starting services, and each can be OK or FAILED. Is there a log file where this information is kept? I had a look in /var/log/ but I'm not sure which file contains the information I need. Many thanks for any advice.
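
    A sketch of the usual places to look on CentOS 5 (assuming boot logging is enabled; /var/log/boot.log is often sparse unless configured):

        less /var/log/boot.log          # per-service OK/FAILED lines, when populated
        grep -i fail /var/log/messages  # syslog often catches service start failures
        dmesg | less                    # kernel ring buffer from the current boot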

  • Mystery 0xc0000142 error when starting Java from a service as a different user

    - by cpf
    This is a very convoluted setup, but effectively this is what goes down: a manager service (which I don't have control over), running as admin user X, starts my executable, which then starts Java as user Y using the standard C# StartInfo.Username/Password controls. Now, from a basic (not elevated or anything, just admin) command prompt I can run that executable, and Java pops up and works fine, running perfectly under the user it should be. When the service runs the same executable, however, Java silently fails. The only hint I see is this series of events in the event viewer:

        1. Service starts.
        2. "Application popup: java.exe - Application Error : The application was unable to start correctly (0xc0000142). Click OK to close the application." (Googling this reveals a lot of scam sites telling me to use their "free antivirus to fix 0xc0000142 errors easy!"... sigh.)
        3. Service stops (the Java shutdown propagated, which is supposed to happen).

    Process Explorer likewise shows everything as a success for the failing run. Now, I think this might have something to do with permissions (the user java.exe is running under has traverse permission for the entire drive and full permissions to Directory A, which is where the .jar is), but I just can't fathom how something that works fine from the command line (and this is an upgrade; the previous system, without the user-switching aspect, works fine from the service) can fail with such a cryptic message and so little showing up in the logs.
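
    For context, a minimal sketch of the launch path described (names, paths and password handling are hypothetical). When StartInfo.UserName is set, UseShellExecute must be false; LoadUserProfile and a working directory the target user can traverse are the usual suspects when the same launch works from a console but not from a service:

        using System;
        using System.Diagnostics;
        using System.Security;

        class Launcher
        {
            static void Main()
            {
                var pwd = new SecureString();
                foreach (var c in "hypothetical-password") pwd.AppendChar(c);

                var psi = new ProcessStartInfo
                {
                    FileName = @"C:\Program Files\Java\bin\java.exe",
                    Arguments = "-jar app.jar",
                    UserName = "Y",
                    Password = pwd,
                    Domain = Environment.MachineName,
                    UseShellExecute = false,   // required when UserName is set
                    LoadUserProfile = true,    // launches from services often fail without this
                    WorkingDirectory = @"C:\DirectoryA"
                };
                Process.Start(psi);
            }
        }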

  • Why is the PDF format used?

    - by dan_vitch
    I will admit that I am new to the tech/dev field. It seems to be a trend that every time I have to work with PDFs, a part of me dies. Why is this format as ubiquitous as it seems to be? Is it just non-tech people who prefer PDFs?

  • How do large blobs affect SQL delete performance, and how can I mitigate the impact?

    - by Max Pollack
    I'm currently experiencing a strange issue that doesn't quite mesh with my understanding of SQL Server. We use SQL Server as the file storage for our internal storage service, and our database has about half a million rows in it. Most of the files (86%) are 1 MB or under, but even on fresh copies of our database, where we simply populate the table with data for the purposes of a test, rows with large amounts of data stored in a BLOB frequently cause timeouts when our SQL Server is under load.

    My understanding of how SQL Server deletes rows is that it's a garbage-collection process: the row is marked as a ghost, and the row is later deleted by the ghost cleanup process after the changes are copied to the transaction log. This suggests to me that, regardless of the size of the data in the BLOB, row deletion should be close to instantaneous. Yet when deleting these rows we are definitely experiencing large numbers of timeouts and astoundingly low performance. In our test data set, it's files over 30 MB that cause the issue. This is an edge case; we don't encounter these frequently, and even though we're looking into SQL Server FILESTREAM as a solution to some of our problems, we're trying to narrow down where these issues originate.

    We ARE performing our deletes inside a transaction. We're also performing updates to metadata such as file-size stats, but these live in a separate table away from the file data itself. Hierarchy data is stored in the table that contains the file information. Really, in the end, it's not so much what we're doing around the deletes that matters; we just can't find any references to low delete performance on rows that contain a large amount of data in a BLOB. We are trying to determine whether this is even an avenue worth exploring, or whether it has to be one of our processes around the delete that's causing the issue.

    Are there any situations in which this could occur? Is it common for a database server to reach the point of complete timeouts when many of these deletes occur simultaneously? Is there a way to combat this issue if it exists? (Cross-posted from Stack Overflow.)
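
    One way to test the ghost-cleanup theory directly is to watch the ghost record counts on the table's allocation units while the deletes run; a sketch, with a hypothetical table name:

        -- counts that stay elevated indicate rows waiting on the ghost cleanup task
        SELECT index_id,
               alloc_unit_type_desc,     -- LOB_DATA is where the BLOB pages live
               ghost_record_count,
               version_ghost_record_count
        FROM sys.dm_db_index_physical_stats(
                 DB_ID(), OBJECT_ID('dbo.FileStore'), NULL, NULL, 'DETAILED');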

  • Copying the Windows 8 Users folder with very long paths

    - by bilal.haider
    I was trying to move my "Users" folder in Windows 8 as described here and here. But when I try to copy the folder using xcopy in the Windows installation disk's repair mode, after some files are copied I get "insufficient memory". The files on which the error is raised look like

        C:\Users\Bilal\Application Data\Application Data\Application Data.........Application Data\Application Data.....

    What is the point of such directories within directories? I also tried copying them using Mini Windows XP, but the problem was there too. I also tried copying with a Parted Magic live CD, but still no luck. So now, how can I move them? Another question: is moving such system files using Linux a good idea? Does it do anything to permissions?
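
    Those endlessly nested "Application Data" entries are almost certainly NTFS junctions looping back on themselves, which xcopy follows until the path length blows up. A sketch using robocopy, which can skip junction points; flags as I understand them, so verify before trusting a profile move to this:

        robocopy C:\Users D:\Users /E /COPYALL /XJ
        :: /E        copy subfolders, including empty ones
        :: /COPYALL  copy data, attributes, timestamps, security, owner, auditing info
        :: /XJ       do not follow or copy junction points, avoiding the recursion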

  • Windows 7, files reappear after deletion.

    - by HeavyWave
    I'm trying to delete some files from a folder. I've taken ownership of the files and the folder. When I delete these files, Windows doesn't report any errors and deletes them. BUT, after I press F5, the files reappear. There are no messages whatsoever; they are just undeletable. I know logging off will help, but how do I fix this without going through the pain of closing everything down? P.S. The files do disappear from the folder after approximately 5 minutes. Update: it turns out my version of Windows did not properly upgrade from a test version, so it had some weird disk drive issues.

  • Avoiding console window display when a scheduled task runs a batch file

    - by cherouvim
    Hello, I have a small batch file which xcopies some files from one folder to another, and which I've scheduled (via Windows Scheduled Tasks) to run every hour:

        @echo off
        xcopy c:\foo c:\bar /E /C /F /Y

    Since this is my workstation, I'm usually working when the task executes, and then the black DOS console window is displayed (it lasts 2-3 seconds) and steals window focus. I don't wish to see the files being copied, and of course the batch file does not ask for any user input. Is there a way to avoid displaying the console completely? Thanks
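
    One common workaround (a sketch; the path is hypothetical) is to schedule a small VBScript wrapper instead of the batch file, asking WScript.Shell to run it with window style 0, i.e. hidden:

        ' run-hidden.vbs: point the scheduled task at this via wscript.exe
        Set sh = CreateObject("WScript.Shell")
        sh.Run """c:\scripts\copyjob.bat""", 0, False   ' 0 = hidden window, False = don't wait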

  • Need an Excel macro to produce a formatted text file

    - by user139238
    I am just learning how to write macros, and I found one that nearly does what I need, which is to output a text file from Excel. I need it to write the file in .mhd format, which it does, and then place each of the values written by the Print #fnum statement on its own line; essentially, every piece of data should get its own line in the text file. I am certain there is an elegant way to go about this, but I can't seem to get it.

        Sub CreateFile()
            Do While Not IsEmpty(ActiveCell.Offset(0, 1))
                MyFile = ActiveCell.Value & ".mhd"
                'set and open file for output
                fnum = FreeFile()
                Open MyFile For Output As fnum
                'use Print when you want the string without quotation marks
                Print #fnum, ActiveCell.Offset(0, 5); " " & ActiveCell.Offset(0, 6); _
                    " " & ActiveCell.Offset(0, 7); " " & ActiveCell.Offset(0, 8); _
                    " " & ActiveCell.Offset(0, 9); " " & ActiveCell.Offset(0, 10); _
                    " " & ActiveCell.Offset(0, 11); " " & ActiveCell.Offset(0, 12); _
                    " " & ActiveCell.Offset(0, 13); " " & ActiveCell.Offset(0, 14); _
                    " " & ActiveCell.Offset(0, 15); " " & ActiveCell.Offset(0, 16); _
                    " " & ActiveCell.Offset(0, 17); " " & ActiveCell.Offset(0, 18); _
                    " " & ActiveCell.Offset(0, 19); " " & ActiveCell.Offset(0, 20); _
                    " " & ActiveCell.Offset(0, 21); " " & ActiveCell.Offset(0, 22); _
                    " " & ActiveCell.Offset(0, 23); " " & ActiveCell.Offset(0, 24); _
                    " " & ActiveCell.Offset(0, 25); " " & ActiveCell.Offset(0, 26)
                Close #fnum
                ActiveCell.Offset(1, 0).Select
            Loop
        End Sub
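
    If the goal is one value per line rather than one space-separated row, a minimal sketch, assuming columns 5 through 26 are the ones being exported (each Print # call appends its own newline):

        Sub CreateFileOnePerLine()
            Dim fnum As Integer, c As Integer
            Do While Not IsEmpty(ActiveCell.Offset(0, 1))
                fnum = FreeFile()
                Open ActiveCell.Value & ".mhd" For Output As #fnum
                For c = 5 To 26
                    Print #fnum, ActiveCell.Offset(0, c).Value  ' one cell per line
                Next c
                Close #fnum
                ActiveCell.Offset(1, 0).Select
            Loop
        End Sub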

  • Permanently deleting files on Mac OS

    - by Jonik
    A while back, as a relatively new Mac OS X user, I was surprised to learn that you cannot easily delete files; directly, that is, without moving them to the trash first. On Windows and Linux this can obviously be done with ease, but not so on the Mac. I noticed this when trying to clear files off a USB memory stick: removing the files ("move to trash") does not free up space; that happens only after emptying the whole system-wide Trash. Not particularly convenient! (It seems stupid to have to empty the whole trashcan just to make some space on the USB stick. There might be gigabytes of stuff in there, and this sort of defeats its purpose: what if you actually needed to restore something from the trash some day?) So, what's your way of getting around this? Have you bought a 3rd-party application like RAW Trash for $16.95 just to delete files, or do you diligently empty the trashcan whenever needed? Or did I miss something? Also, can you convince me that this is actually the way it should be, that users shouldn't be able to fiddle with the filesystem easily? :)
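
    For the record, a sketch of the Terminal route, which bypasses the Trash entirely (volume and file names hypothetical; rm is immediate and unrecoverable):

        rm "/Volumes/USB STICK/some-clip.mov"     # frees the space at once
        rm -ri "/Volumes/USB STICK/old-folder"    # -i asks before each delete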

  • Replace, not merge, folders when copying

    - by Adam Kane
    In Windows Vista, when I try to move a folder to replace an existing folder, Vista asks me if I want to merge the folders or cancel. How do I get the old XP behavior, where the old folder is simply replaced? This is especially painful when moving many folders at once: some of them move, but others hit this merge/cancel prompt. Thanks!
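
    A sketch of forcing replace semantics from a command prompt (paths hypothetical; rmdir /S deletes the target tree outright, so be sure before running it):

        rmdir /S /Q "D:\dest\MyFolder"
        move "C:\src\MyFolder" "D:\dest\"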

  • Sharing a folder on a virtual private Windows Server 2008 R2?

    - by Triztian
    See Edit 2. Hello all. It seems my involvement with computers has grown, and I've found myself needing to access a shared folder on a server. I've read some documentation and managed to set the folder up as a share. For this I created a local group and, for now, just one local user that has access to the share. The folder is in the public user folder, and its permissions should be (and I believe they are) read/write. The problem is that I can't connect from a remote machine; I mean, I don't know how it should be accessed. The server has a public IP, and we also use it as the host for our website; I don't know if that affects anything. The folder will be used as the keeper for the QuickBooks company files, and the server has the database server manager installed. I've tried setting up a VPN connection to the server, but with no success. The server has a domain name, "http://www.example.com", that redirects to our website; I am unsure if the share could be accessed that way. The share also has a location displayed when I right-click it and choose Properties.

    Here's what I've tried:

        - Setting up a VPN connection (Windows Vista and 7). I got to the point where I was asked for credentials and entered the user I created (which is not an admin), but I got a "connection failed, error 800". I suppose this is because I entered the server's workgroup in the domain field.
        - Right-click, "Add a network location" (Windows 7). I went through the wizard until I reached the point of entering the location, and tried many things: the name in the share's properties (\\SOMETHING\Share), http://www.example.com, the IP address.

    I'm quite unfamiliar with this, so I have my guesses:

        - Since the group and user are local, they do not have access to the folder.
        - The firewall on the server is blocking my connection.

    Anyway, any help and guidance is truly appreciated.

    EDIT 1: As @tony roth pointed out, this may be a security fail; I raised it with management and was told it is not an issue, so please bear with me.

    EDIT 2: I've found that the real question could be streamlined to "sharing a folder on a virtual private server?", as that's what we have: a virtual private Windows Server 2008 R2. I would like to know how to make the share show up like a normal folder on the client computer. Thanks again for all of your support.
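
    Once the server is reachable (VPN up, or SMB allowed through its firewall), the share would normally be mapped like this; a sketch with hypothetical names:

        net use Z: \\203.0.113.10\Share /user:SERVERNAME\shareuser *
        :: the * prompts for the password; TCP 445 (SMB) must be open on the
        :: server's firewall for this to work, which over a public IP is exactly
        :: the security concern raised above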

  • How to batch rename files copied from OSX to Windows with ':' in filenames?

    - by tputkonen
    This is really puzzling. I have lots of videos that were stored under Mac OS, and now I have to edit them on Windows XP. I copied the files using HFSExplorer. The editing software refuses to open the files under their current names, and so far I have not found a way to batch rename all of them. The names look like this:

        clip-2009-10-01 21;26;00.mov

    I suspect that in OS X the time was 21:26:00. I would like to replace the space with an underscore and the semicolons with dashes. I've tried several bulk-rename applications, with ; and :, but in vain. I've also tried rename.pl, also in vain.
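
    A sketch of doing it with plain cmd, saved as a .bat in the folder of videos (delayed expansion is needed for the string replacements):

        @echo off
        setlocal enabledelayedexpansion
        for /f "delims=" %%f in ('dir /b *.mov') do (
            set "n=%%f"
            set "n=!n: =_!"
            set "n=!n:;=-!"
            if not "!n!"=="%%f" ren "%%f" "!n!"
        )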

  • Windows 7 shows a drive as full in summary but files shown on drive are very small

    - by Rob
    I have a drive partitioned so that Windows sees it as two drives: C:\ and D:\. Windows 7 shows D:\ as full in the graphical summary in 'My Computer': the bar graph indicates that nearly all of the drive's 108 GB capacity is used. So I go into the D:\ drive to look at the files and see several folders. I select them all and use right-click, Properties to count their size, expecting the value to be about the same as what Windows reports in the summary, i.e. nearly 108 GB. But the Properties window shows the files are very small, KBs and MBs, nowhere near 108 GB. One of the folders is a backup, but its size is very small. I've set the folder options to show all system and hidden files, and counted these in the Properties total too. Something invisible is holding the space. What is happening here? I'm afraid to delete anything in case it removes valuable backups. Have I got huge backups here? Why can't I see them? How do I see them?
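
    Two usual suspects on Windows 7 are Volume Shadow Copies (System Restore / Previous Versions) and hidden system folders that an Explorer selection misses. A sketch of checking both from an elevated prompt:

        vssadmin list shadowstorage
        :: shows how much of D: is reserved and used by shadow copies

        dir D:\ /A /S
        :: /A includes hidden and system files; compare the final byte total
        :: with what Explorer reports for the drive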

  • SQL Server tempdb size seems large, is this normal?

    - by Abe Miessler
    From what I understand, the tempdb system database is used to hold temporary tables, intermediate results, and other transient information. On one of my database instances I have a tempdb that seems very large (30 GB). This database has not been modified (going by the "last modified" date on the .mdf file) in over a week. Is it normal for tempdb to remain that large for that long a period? It seems to me that it should be updated fairly often and should return the space it is using fairly quickly... Am I way off here, or is SQL Server doing something weird? FYI: this is a SharePoint 2010 database server, not sure if that makes a difference.
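
    Worth noting: the .mdf size is a high-water mark of allocation, not current use; tempdb files grow as needed but are not shrunk automatically while the instance runs. A sketch for checking how much of that 30 GB is actually in use right now:

        -- free vs. reserved pages in tempdb, converted to MB (8 KB pages)
        SELECT SUM(unallocated_extent_page_count) * 8 / 1024       AS free_mb,
               SUM(user_object_reserved_page_count) * 8 / 1024     AS user_objects_mb,
               SUM(internal_object_reserved_page_count) * 8 / 1024 AS internal_objects_mb,
               SUM(version_store_reserved_page_count) * 8 / 1024   AS version_store_mb
        FROM tempdb.sys.dm_db_file_space_usage;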
