Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.


  • List and Read Text Files

    This sample takes the files from a folder (in this case, text files) and lists them in a listbox. The user can then click a particular file to display its text in the textbox to the right of the listbox.
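
    As a rough, hypothetical equivalent of the sample (which is a GUI with a listbox and a textbox), a minimal console sketch in Python that lists the text files in a folder, lets the user pick one, and prints its contents:

        import sys
        from pathlib import Path

        def list_and_read(folder):
            # List the .txt files in the folder (non-recursive), like the listbox does.
            files = sorted(Path(folder).glob("*.txt"))
            for index, f in enumerate(files):
                print(f"{index}: {f.name}")
            choice = int(input("Pick a file by number: "))
            # Show the selected file's text, like the textbox next to the listbox.
            print(files[choice].read_text(encoding="utf-8", errors="replace"))

        if __name__ == "__main__":
            list_and_read(sys.argv[1] if len(sys.argv) > 1 else ".")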

    Read the article

  • dun goofed the files in /usr/lib/x86_64-linux-gnu/

    - by tipu
    There was some weird package issue with (in my limited understanding) 32/64-bit libraries, so I went around making symlinks from the files my LAMP installation expected to the ones that actually existed. I did this for a number of files in /usr/lib/x86_64-linux-gnu/. However, PHP still ended up not working (separate issue), and now I believe I have a screwed-up lib directory. Is there a way to revert those library files?
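
    A common way to undo this kind of damage is to reinstall the packages that own the affected libraries. A hedged sketch in Python (assuming a Debian/Ubuntu system where dpkg and apt-get are available, and assuming the stray symlinks were made by hand rather than shipped by a package): it finds the symlinks in the directory, asks dpkg which package, if any, owns each path, and only prints the reinstall commands instead of running anything. Reinstalling restores the package-owned files; any hand-made symlinks left over still have to be removed manually.

        import subprocess
        from pathlib import Path

        LIBDIR = Path("/usr/lib/x86_64-linux-gnu")

        owners = set()
        for entry in LIBDIR.iterdir():
            if not entry.is_symlink():
                continue
            # Ask dpkg which package (if any) claims this exact path.
            result = subprocess.run(["dpkg", "-S", str(entry)],
                                    capture_output=True, text=True)
            if result.returncode == 0:
                # Output looks like "libfoo1:amd64: /usr/lib/x86_64-linux-gnu/libfoo.so.1"
                owners.add(result.stdout.split(":")[0])
            else:
                print(f"# {entry} is not owned by any package (probably hand-made)")

        for pkg in sorted(owners):
            print(f"sudo apt-get install --reinstall {pkg}")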

    Read the article

  • apt-get -f install removed software center and several other files

    - by user287858
    I ran sudo apt-get -f install and several files and programs were removed, including Software Center. Is there a way to re-download everything as if Ubuntu were newly installed, without a CD? This computer does not have a CD-ROM drive, and I'd be fine with losing all the data on it. Also, when I run sudo apt-get install (almost anything) I get errors about dependencies and files not being available. Thanks to anyone who can help.

    Read the article

  • Check SQL Server Virtual Log Files Using PowerShell

    In a previous tip, Monitor Your SQL Server Virtual Log Files with Policy Based Management, we saw how we can use Policy Based Management to monitor the number of virtual log files (VLFs) in our SQL Server databases. Even so, most of the solutions I see online involve creating temporary tables and/or using cursors to get the total number of VLFs in a transaction log file. Is there a much easier solution?

    Read the article

  • Moving files fails due to privileges but they seem to be OK

    - by joaoc
    I am trying to copy old files from an OS X 10.5.8 machine to a new external HD. When trying to copy a folder I get the message: "The operation cannot be completed because you do not have sufficient privileges for some of the items." I've checked the privileges and they seem OK (read & write for me). What is curious is that the folder is created empty on the target drive, and I can then copy the contents from inside the original folder to inside the new folder without changing anything else. This happens with several folders, but not all, and it is making backups a pain: I have to figure out which folder broke the copy, copy its contents to the external disk, then select what hasn't been copied yet and copy that (and eventually stop at some point and repeat the whole exercise).
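
    One way to narrow down which items trigger the error is to scan the source folder for entries the current user cannot read before starting the copy. A small diagnostic sketch in Python (the path at the bottom is a hypothetical placeholder):

        import os

        def find_unreadable(root):
            # Walk the tree and report anything the current user cannot read,
            # a likely cause of "insufficient privileges" part-way through a copy.
            for dirpath, dirnames, filenames in os.walk(root):
                for name in dirnames + filenames:
                    path = os.path.join(dirpath, name)
                    if not os.access(path, os.R_OK):
                        print("not readable:", path)

        find_unreadable("/path/to/folder-that-fails")  # hypothetical path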

    Read the article

  • .AVI Files randomly cease to open, other strange errors too

    - by Ben Franchuk
    I recently (a couple of weeks ago) downloaded the complete series of Seinfeld, in varying file types. I watched them in sequence, by season and by airing date, and all was well: all of the files played fine with my media player of choice (BS Player). Once I had finished, I went on to watch some other TV I had previously downloaded (the U.S. series of The Office), and then some films and some music over the following weeks (keep in mind that all of these files are on the same hard drive).

    More recently I went back to watching Seinfeld. The episodes played as well as they did before, with the exception of a few in season 7. I have not tested every episode in the season, but on inspection the majority of them have this problem, which is simply that they don't open. BS Player says the files are either damaged or that the codecs to play them are not on my computer; however, I am certain the codecs are present, and I am fairly sure the files are not damaged either. I have played the files with other players (VLC, Media Player Classic, and Windows Media Player) with the same result: they don't open. Seemingly the only way I can tell a damaged file from a non-damaged one is how the icon shows in Windows Explorer; the first image below shows how Explorer displays a non-damaged file, and the second shows how a damaged file appears.

    The most disturbing and confusing part, though, is the last episode in the season: it opens, but not as a video. Instead it opens as a 1 hour, 16 minute, 35 second audio file. The file plays a song for the first 4 or so minutes and is then pretty much silent (except for some extremely quiet noise) until the last minute or so, when a random array of chopped-up sounds and beeping noises plays. I do not recognise the song at the beginning of the file, but by the sound of it, it is by the artist "Mr. Oizo", whose complete works I downloaded a couple of weeks before now; a bit before then I had finished downloading season 9 (not affected by these problems) of Seinfeld. I'd also like to note that the file I mentioned earlier (which plays audio instead of video) reads as the same size as the other files in the season (around 175 MB) and also opens as a video clip.

    I have never experienced any of these problems in the past, and they seem to affect only the one season of my downloaded TV. The problems have not arisen with any of the other files on my hard drive, or with any of the files downloaded around or after the time I downloaded season 7 of Seinfeld, at least as far as I've noticed. I use the hard drive these files are located on almost every day, so could that be the cause of these problems? Is this a sign that my HDD is soon going to die? If it helps, the HDD is a Western Digital MyBook 1.5 TB, 7500 RPM, connected to the computer via USB 2.0.

    EDIT: I noticed that this problem is now occurring with season 9 of Seinfeld, and presumably with other files on the drive I have yet to check. Please, if you have any idea at all what may be causing this or how to fix it, do tell me!

    Read the article

  • Use DOS batch to move all files up 1 directory

    - by Harminoff
    I have created a batch file to be executed through the right-click menu in Win7. When I right-click on a folder, I would like the batch file to move all files (excluding folders) up one directory. I have this so far:

        PUSHD %1
        MOVE "%1\*.*" ..\

    This seems to work as long as the folder I'm moving files from doesn't have any spaces in its name. When the folder does have spaces, I get an error message: "The syntax of the command is incorrect." So my batch works on a folder titled PULLTEST but not on a folder titled PULL TEST. Again, I don't need it to move folders, just files, and I would like it to work in any directory on any drive; there are no specific directories that I will be working in, it will be random. Below is the registry file I made, if needed for reference.

        Windows Registry Editor Version 5.00

        [HKEY_CLASSES_ROOT\Directory\shell\PullFiles]
        @="PullFilesUP"

        [HKEY_CLASSES_ROOT\Directory\shell\PullFiles\command]
        @="\"C:\\Program Files\\MyBatchs\\PullFiles.bat\" \"%1\""
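
    For comparison, the same operation, moving only the plain files of a folder into its parent, is easy to write in a way that is indifferent to spaces as a small Python script; a hedged sketch (the registry hook would then call something like python pullfiles.py "%1" instead of the .bat, which is an assumption, not part of the original setup):

        import shutil
        import sys
        from pathlib import Path

        def pull_files_up(folder):
            folder = Path(folder)
            for item in folder.iterdir():
                if item.is_file():  # skip subfolders, as the batch file intends
                    shutil.move(str(item), str(folder.parent / item.name))

        if __name__ == "__main__":
            pull_files_up(sys.argv[1])  # the folder path, spaces and all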

    Read the article

  • How are Linux files and applications organized?

    - by doup
    Hi there, I'm a newbie Linux (Ubuntu) user and I'd like to know if someone can give some advice on where to install stuff, which folders not to touch, what each folder means, and so on. My first concern is: should everything go into my home folder? I've installed Komodo Edit (an IDE) "manually" and it has gone into my home folder, and I really don't like the idea of having an application there (in Windows I used to have my workfiles/pictures/downloads... partition and then the OS partition with all the apps). So, is there any place where I could install this software? Any advice for keeping my home folder ordered? Maybe I should create an apps folder in my home dir? Thanks in advance. :) PS: most of the time I use apt to install stuff, but I don't always find the software I want there...

    Read the article

  • Is there a way to know what the Windows Disk Cleanup utility will delete?

    - by Cam Jackson
    When I run the Disk Cleanup utility that's built into Windows 8, it tells me that it can free up 53GB by deleting 'Temporary Files'. However, a CCleaner analysis on default settings only finds about 300MB worth of space to free up, so I'm wondering what Disk Cleanup has found that CCleaner does not. Note that this question appears to be similar to what I'm asking, but the accepted answer says that 'Temporary Files' refers to %TEMP%. I've already cleared out most of C:\Users\Cam\AppData\Local\Temp, and it now has only 230MB of stuff in it, even with system files showing. So where is this 53GB located? Is there a way to find out what it is? Edit: I should note that this is on a 110GB SSD, so it's almost half the drive. In fact I'm only using 86GB, so if it's really going to clear out 53GB, that would be more than 60% of the stuff on my C drive. I'm starting to think that Disk Cleanup caches its analysis and hasn't updated since I started cleaning up the drive earlier today, although when I run it, it says that it's 'Calculating' how much space can be saved, and that takes about 5-10 seconds. Hmmm... Edit 2: Here is what my hard drive looks like according to SpaceMonger. You can see why I was starting to think that the 53GB figure is actually wrong: even if 'Temporary Files' includes my hiberfil and everything in WinSxS (about 13GB total), that would be 26GB, which is only halfway there. It's hard to see where there's 53GB of stuff to delete.

    Read the article

  • arrays in puppet

    - by paweloque
    I'm wondering how to solve the following Puppet problem: I want to create several files based on an array of strings. The complication is that I want to create multiple directories, each containing files:

        dir1/
            fileA
            fileB
        dir2/
            fileA
            fileB
            fileC

    The problem is that file resource titles must be unique. So if I keep the file names in an array, I need to iterate over the array in a custom way to be able to postfix the file names with the directory name:

        $file_names = ['fileA', 'fileB']
        $file_names_2 = [$file_names, 'fileC']

        file { 'dir1': ensure => directory }
        file { 'dir2': ensure => directory }

        file { $file_names:
          path   => 'dir1',
          ensure => present,
        }
        file { $file_names_2:
          path   => 'dir2',
          ensure => present,
        }

    This won't work because the file resource titles clash. So I need to append e.g. the dir name to the file title; however, this causes the array of files to be concatenated and not treated as multiple files... arghh:

        file { "${file_names}-dir1":
          path   => 'dir1',
          ensure => present,
        }
        file { "${file_names_2}-dir2":
          path   => 'dir1',
          ensure => present,
        }

    How can I solve this problem without having to repeat the file resource itself? Thanks

    Read the article

  • Orca: extracting files from merge module

    - by Mystagogue
    All I want is a command-line tool that can extract files from a merge module (.msm) onto disk. I looked up Orca (version 3.1), whose documentation states:

        Many merge module options can be specified from the command line...

        Extracting Files from a Merge Module
        Orca supports three different methods for extracting files contained in a merge module. Orca can extract the individual CAB file, extract the files into a module tree and extract the files into a source image once it has been merged into a target database...

        Extracting Files
        To extract the individual files from a merge module, use the ... -x ... option on the command line, where <path> is the desired path to the new directory tree. The specified path is used as the root path for the extracted files. All files are extracted from the CAB file embedded in the module and placed in the specified path. The directory layout for the extracted files is based on the directory tree of the merge module.

    It mostly sounds like exactly what I need. But when I try it, Orca simply opens up an editor (with info on the .msm I specified) and then does nothing. I've tried a variety of command lines:

        orca -x theDirectory theModule.msm
        orca theModule.msm -x theDirectory

    ...and others. I get nowhere. The closest I've gotten was this:

        orca -q -x theDirectory -m theModule.msm

    ...but then it complains that I didn't specify a database to merge into. But I'm not trying to merge anything, much less into a database. I just want the files extracted. Can someone explain what I'm doing wrong with the command-line options?

    Read the article

  • PHP count total files in directory AND subdirectory function

    - by Neoweiter
    I need to get a total count of JPG files within a specified directory, including all its subdirectories, but no sub-sub-directories. The structure looks like this:

        dir1/
            2 files
            subdir 1/
                8 files
        total dir1 = 10 files

        dir2/
            5 files
            subdir 1/
                2 files
            subdir 2/
                8 files
        total dir2 = 15 files

    I have this function, which doesn't work correctly: it only counts files in the last subdirectory, and the total is 2x more than the actual number of files (it will output 80 if I have 40 files in the last subdir).

        public function count_files($path) {
            global $file_count;
            $file_count = 0;
            $dir = opendir($path);
            if (!$dir) return -1;
            while ($file = readdir($dir)) :
                if ($file == '.' || $file == '..') continue;
                if (is_dir($path . $file)) :
                    $file_count += $this->count_files($path . "/" . $file);
                else :
                    $file_count++;
                endif;
            endwhile;
            closedir($dir);
            return $file_count;
        }
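
    For comparison, the same requirement, JPGs in a directory plus its immediate subdirectories and nothing deeper, expressed in Python (the case-insensitive extension match is an assumption):

        from pathlib import Path

        def count_jpgs(path):
            root = Path(path)
            # Files directly in the directory plus files one level down; no deeper.
            candidates = list(root.glob("*")) + list(root.glob("*/*"))
            return sum(1 for f in candidates
                       if f.is_file() and f.suffix.lower() in (".jpg", ".jpeg"))

        print(count_jpgs("dir1"))  # 10 for the dir1 layout above, if all its files are JPGs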

    Read the article

  • bash: listing files in date order, with spaces in filenames

    - by Jason Judge
    I am starting with a file containing a list of hundreds of files (full paths) in random order. I would like to list the details of the ten latest files in that list. This is my naive attempt:

        ls -las -t `cat list-of-files.txt` | head -10

    That works as long as none of the files have spaces in their names, but fails if they do, because those filenames are split at the spaces and treated as separate files. I have tried quoting the files in the original list-of-files file, but the command substitution still splits the filenames at the spaces. The only way I can think of doing this is to ls each file individually (using xargs perhaps), create an intermediate file with the file listings and the date in a sortable order as the first field of each line, and then sort that intermediate file. However, that feels cumbersome and inefficient (hundreds of ls commands rather than one or two), though it may be the only way. Is there any way to pass ls a list of files to process where those filenames can contain spaces? It seems like it should be simple, but I'm stumped.
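
    One workaround is to skip ls entirely, since the listing only needs each path stat'ed once. A sketch in Python that reads the list (one path per line, spaces intact), sorts by modification time, and prints the ten newest:

        import os
        import sys

        def ten_latest(list_file):
            with open(list_file) as fh:
                paths = [line.rstrip("\n") for line in fh if line.strip()]
            # Sort by modification time, newest first; paths that no longer exist are skipped.
            existing = [p for p in paths if os.path.exists(p)]
            existing.sort(key=os.path.getmtime, reverse=True)
            for p in existing[:10]:
                print(os.path.getmtime(p), p)

        ten_latest(sys.argv[1] if len(sys.argv) > 1 else "list-of-files.txt")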

    Read the article

  • Reading from a database located in the Program Files folder using ODBC

    - by Dabblernl
    We have an application that stores its database files in a subfolder of the Program Files directory. These files are redirected to the VirtualStore in Vista and Windows 7. We present data from the database using Microsoft DataReports (VB6). So far so good. But we now want to use Crystal Reports XI to present data from the database. Our idea is NOT to pass this data to CR from our program, but to have CR retrieve it from the database using a system DSN through ODBC. In this way we hope to give our users more flexibility in designing their own reports. What we do want to ensure, though, is that these system DSNs are configured correctly when the user installs our program or when the program calls the Crystal Report. Is there a smart way to do this, using system variables for instance, instead of having to write a routine that checks the OS version, whether UAC is enabled, and whether the write restrictions on the Program Files folder have been lifted, and then adapts the system DSN to point to either the C:\Program Files\OurApp\Data folder or the C:\Users\User\AppData\VirtualStore\Program Files\OurApp\Data folder? Suggestions for an entirely different approach are welcome too!

    Read the article

  • AnkhSVN: Cannot checkout Subsolution due to existing "versioned" folder

    - by lostiniceland
    Hello everyone. I have been using Subversion for quite some time for Java development, and I have set up a repository on my local NAS. Since I have an MSDN subscription via my company, I recently installed Visual Studio 2010 to do a small project with .NET. Following some "best practices", my project folder looks like the following:

        MySolution
            main.sln
            Services
                services.sln
                Service A files
                Service A Test files
            View
                projectfiles
            Persistence
                persistence.sln
                PersistenceXml files
                PersistenceXml Test files
                PersistenceDB files
                PersistenceDB Test files

    The idea is that main.sln only contains the projects for the application, meaning no test projects. The subsolutions contain the project(s) and their corresponding test projects. I was able to put all of these projects under version control with AnkhSVN, so I have the same structure in the trunk of the repository. Committing changes was also no problem. Now I would like to check this out on another machine. I was able to check out main.sln, which downloaded everything in that solution; it skipped services.sln, persistence.sln, and all the test projects. Up to here everything is fine. Now, here comes the problem: when I try to check out a subsolution (e.g. services.sln) I get an error, I think it was UnsupportedOperation. I guess this happens because AnkhSVN is trying to download the folder Service A again and create its hidden .svn folder, which is already present. The only workaround I can think of for now is installing TortoiseSVN and checking out the whole thing at once. It would be nicer, though, to have everything from within VS. Does anyone know how I can solve this? Is another client the only solution?

    Read the article

  • Is there a best practice for concatenating MP3 Files, adjusting sample rates to match, while preserving original files?

    - by Scott
    Hello overflow community! Does anyone know if there is a "best practice" for concatenating MP3 files to create new files while preserving the original files? I am working on a CentOS Linux machine, on the command line, and I will eventually call the command line from a PHP script. I have been doing research and I have come up with a process that I think could work; it combines general advice from different forums, blogs, and sources like this one. So here I go:

        1. Create a temporary folder.
        2. Loop through the files to create a new, converted copy of each file in a "raw" format (which one, I don't know; I didn't know "raw" files existed until recently, so I could use some suggestions on this).
        3. Store the paths to the temporary files in the temporary folder, then loop through them to concatenate them and put the new merged file in the final "processed" directory.
        4. Delete the contents of the temporary folder with the temporary raw files inside.
        5. Convert the final file from "raw" to MP3 and enjoy the finished result.

    I'm thinking that this course of action might be best because I can't necessarily control the quality of the original "source" MP3s. The only other option I could think of would be a script that performs a similar process when files are added to the system, leaving only the files with the "proper" format and removing the original "erroneous" file. Hopefully you can see that I have put some thought into this and that I'm trying to leverage the collective knowledge of this community to choose the best direction. Perhaps there is a better path that I could take? By concatenate, I mean to join the files together in sequence to create a new audio file from the concatenated files.
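
    If ffmpeg is available on the CentOS box (an assumption; it is not named in the question), the temporary-folder pipeline above collapses into one call, because ffmpeg's concat filter decodes each input, resamples everything to one rate, and re-encodes a single MP3, leaving the originals untouched. A hedged Python sketch that a PHP script could invoke in the same way:

        import subprocess

        def concat_mp3s(inputs, output, sample_rate=44100):
            # Build: ffmpeg -i in1 -i in2 ... -filter_complex "[0:a][1:a]concat=n=N:v=0:a=1[out]"
            cmd = ["ffmpeg", "-y"]
            for path in inputs:
                cmd += ["-i", path]
            streams = "".join(f"[{i}:a]" for i in range(len(inputs)))
            audio_filter = f"{streams}concat=n={len(inputs)}:v=0:a=1[out]"
            cmd += ["-filter_complex", audio_filter, "-map", "[out]",
                    "-ar", str(sample_rate), output]
            subprocess.run(cmd, check=True)  # the source MP3s are never modified

        concat_mp3s(["track1.mp3", "track2.mp3"], "merged.mp3")  # hypothetical file names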

    Read the article

  • RoboCopy Log File Analysis

    - by BobJim
    Is it possible to analyse the log text file output by RoboCopy and extract the lines which are flagged as "New Dir" and "Extra Dir"? I would like each matching line from the log, containing all the details RoboCopy returned for that "New Dir" or "Extra Dir". The reason for this task is to understand how two folder structures have changed over time: one copy has been kept internally at the parent company, the other has been used by a consultancy. For your information, I am using Windows 7.
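
    Since RoboCopy writes those tags as plain text in the log, pulling out the full lines is a simple filter; a sketch in Python (the log file name is hypothetical):

        def extract_dir_lines(log_path):
            wanted = ("New Dir", "Extra Dir")
            with open(log_path, encoding="utf-8", errors="replace") as log:
                # Keep whole lines so the counts and paths RoboCopy logged stay attached.
                return [line.rstrip("\n") for line in log
                        if any(tag in line for tag in wanted)]

        for line in extract_dir_lines("robocopy.log"):
            print(line)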

    Read the article

  • Permissions problem when copying files to /usr/share/tomcat6

    - by Nazar Hussain S
    Hi, I am running the SpringSource framework on Ubuntu 10.01. In my home folder I have installed the SpringSource IDE, and my Tomcat 6 app server is in /usr/share/tomcat6. While working through a sample project, springapp, I created the springapp dir in the /usr/share/tomcat6/webapps/ folder using sudo, as I am unable to do it directly. On running the ant deploy or ant deploywar command, I am unable to copy the sample file index.jsp from my workspace in the SpringSource IDE to the springapp dir in /usr/share/tomcat6/webapps, as I get a "permission denied" error while copying the .jsp file. Can anybody please suggest how to overcome this issue? Regards

    Read the article

  • How to FTP transfer files to /var/www?

    - by jc.yin
    I'm new to Linux and I've set up a web server with Ubuntu Desktop edition so I can practice with the GUI a bit before transitioning to Ubuntu Server. I've already set up a LAMP stack as well as FTP. Now I just need to know how to transfer my web files to the /var/www folder in Ubuntu. Previously I've worked on Mac OS, where there's a central server for all the web files that I can FTP to. Can anyone help me understand how to FTP to the /var/www folder in Ubuntu? Thanks

    Read the article

  • How to Use the File History Feature in Windows 8 to Restore Files

    - by Taylor Gibb
    Jealous of your Mac OS X friends and their great Time Machine feature? Windows 8 has a new feature called File History that works much the same way, giving you an easy way to restore previous versions of your files. We are going to use a networked folder for our article, but you could always skip creating the network folder and just use a USB drive. To use a USB drive, just go to the File History settings and turn it on; it should automatically find your USB drive and immediately start working.

    Read the article

  • How to move files over samba share with gnomevfs cli

    - by Allan
    OK, I am in the process of backing up my film collection to a NAS and I wanted to automate this as much as possible, as I have to work at the same time. I am trying to set up a daily dump of ISOs ready to be converted overnight, and I would like to do this as a cron job using gnomevfs. I have been able to connect and do an ls successfully with:

        gnomevfs-ls smb://user:WORKGROUP:password@media-centre/videos/

    but I am having trouble setting up a mv command from a local folder to the same shared folder; I keep getting the "Usage: gnomevfs-mv <from> <to>" message, which isn't particularly informative ;) Any ideas?

    Read the article

  • Ubuntu One Windows application only accessing gpg files

    - by Boomer Kuwanger
    I'm on a Windows 7 (64-bit) machine right now, and I have the Ubuntu One Windows application. I'm synced to my online account, and the folder I am accessing is deja-dup\My-desktop. When I click the 'Sync Locally' checkbox and explore my folder, I am only able to see three files of the form:

        duplicity-full.#######.manifest.gpg
        duplicity-full.#######.vol1.difftar.gpg
        duplicity-full-signatures.#######.sigtar.gpg

    How do I access the content of these files? I put them on a Linux server and decrypted/extracted them, but something is wrong. Note: I cannot use apt-get on the Linux server I'm using. Is there a way to access these files using the Ubuntu One software for Windows? Many thanks, Boomerkuwanger

    Read the article

  • Upgraded to 11.10 lost personal folders, Ubuntu one shows no files

    - by Kevin
    I upgraded from 10.10 to 11.04 and the system would only come up in terminal mode, but it told me that an additional upgrade was available and asked if I wanted to do that. Foolishly thinking that might fix the problem, I said yes. This time it did not make it all the way through the upgrade; when I came back to the computer over an hour later, the screen was filled with a "could not open display" error message and I had to reboot. On reboot I went to recovery mode to install the nvidia module, and when I rebooted again the system came up fine, but without carrying over my personal folders: I have the home folder, but no personally named folder inside it. I came to Ubuntu One, but it gives the error message: File Sync error. (org.freedesktop.DBus.Error.NoReply: Did not receive a reply. Possible causes include: the remote application did not send a reply, the message bus security policy blocked...) Is there a way around this in order to restore my files? I know my files existed on Ubuntu One as of a few months ago.

    Read the article
