Search Results

Search found 37883 results on 1516 pages for 'sparse files'.

  • Need help manipulating WAV (RIFF) Files at a byte level

    - by Eric
    I'm writing an application in C# that will record audio files (*.wav) and automatically tag and name them. Wave files are RIFF files (like AVI) which can contain metadata chunks in addition to the waveform data chunks. So now I'm trying to figure out how to read and write the RIFF metadata to and from recorded wave files.

    I'm using NAudio for recording the files, and asked on their forums as well as on SO for ways to read and write RIFF tags. While I received a number of good answers, none of the solutions allowed for reading and writing RIFF chunks as easily as I would like. But more importantly, I have very little experience dealing with files at a byte level, and think this could be a good opportunity to learn. So now I want to try writing my own class(es) that can read in a RIFF file and allow metadata to be read from, and written to, the file.

    I've used streams in C#, but always with the entire stream at once. So now I'm a little lost having to consider a file byte by byte. Specifically, how would I go about removing or inserting bytes in the middle of a file? I've tried reading a file through a FileStream into a byte array (byte[]) as shown in the code below.

        System.IO.FileStream waveFileStream = System.IO.File.OpenRead(@"C:\sound.wav");
        byte[] waveBytes = new byte[waveFileStream.Length];
        waveFileStream.Read(waveBytes, 0, waveBytes.Length);

    And I could see through the Visual Studio debugger that the first four bytes are the RIFF header of the file. But arrays are a pain to deal with when performing actions that change their size, like inserting or removing values. So I was thinking I could then convert the byte[] into a List like this:

        List<byte> list = waveBytes.ToList<byte>();

    which would make any manipulation of the file byte by byte a whole lot easier, but I'm worried I might be missing something like a class in the System.IO namespace that would make all this even easier. Am I on the right track, or is there a better way to do this? I should also mention that I'm not hugely concerned with performance, and would prefer not to deal with pointers or unsafe code blocks like this guy. If it helps at all, here is a good article on the RIFF/WAV file format.
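
    For the reading side, a minimal sketch of walking RIFF chunk headers with BinaryReader (the path is a placeholder; chunk IDs and sizes are little-endian per the RIFF spec, which matches BinaryReader's defaults):

        using System;
        using System.IO;
        using System.Text;

        class RiffChunkLister
        {
            static void Main()
            {
                // Placeholder path; any RIFF container (WAV, AVI) will do.
                using (FileStream fs = File.OpenRead(@"C:\sound.wav"))
                using (BinaryReader reader = new BinaryReader(fs))
                {
                    // File header: "RIFF", 4-byte total size, form type ("WAVE").
                    Console.WriteLine("{0}, {1} bytes, form {2}",
                        Encoding.ASCII.GetString(reader.ReadBytes(4)),
                        reader.ReadUInt32(),
                        Encoding.ASCII.GetString(reader.ReadBytes(4)));

                    // Every chunk: 4-byte ID, 4-byte little-endian size, payload.
                    while (fs.Position <= fs.Length - 8)
                    {
                        string chunkId = Encoding.ASCII.GetString(reader.ReadBytes(4));
                        uint chunkSize = reader.ReadUInt32();
                        Console.WriteLine("chunk '{0}', {1} bytes", chunkId, chunkSize);

                        // Skip the payload; chunks are word-aligned, so pad odd sizes.
                        fs.Seek(chunkSize + (chunkSize & 1), SeekOrigin.Current);
                    }
                }
            }
        }

    For edits, rather than splicing bytes into the middle of the file, it is usually simpler to stream the chunks into a new file, inserting or replacing the LIST/INFO metadata chunk on the way through, and then swap the files.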

  • Tracking unique versions of files with hashes

    - by rwmnau
    I'm going to be tracking different versions of potentially millions of different files, and my intent is to hash them to determine whether I've already seen that particular version of the file. Currently I'm only using MD5 (the product is still in development, so it has never dealt with millions of files yet), which is clearly not long enough to avoid collisions.

    Here's my question: am I more likely to avoid collisions if I hash each file using two different methods and store both hashes (say, SHA1 and MD5), or if I pick a single, longer hash (like SHA256) and rely on that alone? I know option 1 has 288 hash bits and option 2 has only 256, but assume my two choices are the same total hash length.

    Since I'm dealing with potentially millions of files (and multiple versions of those files over time), I'd like to do what I can to avoid collisions. However, CPU time isn't (completely) free, so I'm interested in how the community feels about the tradeoff: is adding more bits to my hash proportionally more expensive to compute, and are there any advantages to multiple different hashes as opposed to a single, longer hash, given an equal number of bits in both solutions?
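
    For reference, a sketch of the single-longer-hash option in C# (assuming .NET, which the MD5 usage suggests; the path is a placeholder). ComputeHash streams the file, so memory stays flat even on very large files:

        using System;
        using System.IO;
        using System.Security.Cryptography;

        class FileHasher
        {
            // Returns the SHA-256 digest of a file as a lowercase hex string.
            static string HashFile(string path)
            {
                using (FileStream fs = File.OpenRead(path))
                using (SHA256 sha = SHA256.Create())
                {
                    byte[] digest = sha.ComputeHash(fs);
                    return BitConverter.ToString(digest).Replace("-", "").ToLowerInvariant();
                }
            }

            static void Main()
            {
                Console.WriteLine(HashFile(@"C:\data\sample.bin"));
            }
        }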

  • Powershell: Read value from xml files

    - by DanielR
    I need some help with PowerShell, please. It should be pretty easy: I have a list of subdirectories, with an XML file in each one. I want to open each XML file and print the value of one node. The node is always the same, as the XML files are actually project files (*.csproj) from Visual Studio. I already got the list of files:

        get-item **\*.csproj

    How do I proceed?
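
    A sketch of one way to finish the job (AssemblyName stands in here for whichever node is actually needed; PowerShell's [xml] dot-navigation ignores the MSBuild namespace, which keeps the lookup short):

        Get-ChildItem -Recurse -Filter *.csproj | ForEach-Object {
            [xml]$proj = Get-Content $_.FullName
            # MSBuild keeps most settings under Project/PropertyGroup; pick the
            # node you need (AssemblyName is just an example).
            $value = $proj.Project.PropertyGroup | ForEach-Object { $_.AssemblyName }
            "{0}: {1}" -f $_.FullName, ($value -join ' ')
        }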

  • Editing tags in bunch of MP3 files?

    - by overtherainbow
    I have a bunch of files whose filename looks like "<Track> - <Title> - <Artist>.mp3". I'd like to rewrite their MP3 Title tag to be "<Track> - <Title>.mp3" so that each song is displayed correctly on my elcheapo MP3 player. Since I rarely edit MP3 files anyway, MP3 Tag Tools 1.2 Build 008 from 2003 was good enough, but I can't figure out how to do this with this application. I just tried MP3Tag 2.46, but couldn't figure out if it can do this (created a new Action, to no avail). I also tried TagScanner 5.1.558, without success. Does someone know of a good, free Windows application that can do this? Thank you.
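
    If a small script is an acceptable substitute for a GUI tool, a sketch in Python with the mutagen library (pip install mutagen) would do the rewrite. The folder path and the " - " separator are assumptions taken from the filename pattern, and mutagen's EasyID3 expects files to already carry an ID3 tag:

        import os
        from mutagen.easyid3 import EasyID3

        folder = r"C:\Music"  # placeholder path
        for name in os.listdir(folder):
            if not name.lower().endswith(".mp3"):
                continue
            parts = os.path.splitext(name)[0].split(" - ")
            if len(parts) != 3:
                continue  # skip files that don't match "Track - Title - Artist"
            track, title, artist = parts
            tags = EasyID3(os.path.join(folder, name))
            tags["title"] = track + " - " + title
            tags.save()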

  • How to batch rename files using bash

    - by Alex Popov
    I know there are lots of such questions, but I couldn't find one (or a combination of several) which describes the things I want to do. I think I need to use regular expressions, but I am not very good with that. I use zsh. I have a folder with files which I want to rename:

    - challenge1.rb, challenge2.rb, challenge3.rb, etc. should become c1.rb, c2.rb, etc.
    - task1.rb and similar should become t1.rb, etc.
    - sample_spec_c1.rb, sample_spec_c2.rb, etc. should become c1_spec.rb, c2_spec.rb, etc.

    So I guess I need some combination of regular expressions and iteration, but I don't know how to write the bash script.
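
    A sketch using plain parameter expansion, which sidesteps regular expressions entirely (run it inside the folder; it works in both bash and zsh, and zsh users could also look at the bundled zmv function):

        for f in challenge*.rb; do
            mv -- "$f" "c${f#challenge}"       # challenge1.rb -> c1.rb
        done
        for f in task*.rb; do
            mv -- "$f" "t${f#task}"            # task1.rb -> t1.rb
        done
        for f in sample_spec_*.rb; do
            base="${f#sample_spec_}"           # sample_spec_c1.rb -> c1.rb
            mv -- "$f" "${base%.rb}_spec.rb"   # c1.rb -> c1_spec.rb
        done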

  • Windows 8 - can't drag files from Explorer and drop on applications

    - by FerretallicA
    In Windows 8 I find I can't drag files to applications like I've been able to do for as long as I can remember. For example:

    - Drag MP3s to Winamp
    - Drag a folder full of music to Winamp
    - Drag videos to VLC
    - Drag txt, reg, etc. files to Notepad

    I have tried various combinations of:

    - Running Explorer as administrator
    - Running the drop target as administrator
    - Taking ownership of the drop target application's folder
    - Taking ownership of Explorer
    - Changing my user account to administrator
    - Creating a new user account
    - Lowering the UAC level
    - Disabling UAC in the GUI
    - Disabling UAC in the registry
    - Running Explorer folders in a separate thread

    This is the last straw if there's no known proper (i.e. not a hacky compromise) fix for this. "Little" things like this add up to a productivity nightmare, and if I have to relearn and reconfigure so much to get basic things done with an OS, I might as well move to Linux once and for all.

  • Advantages of multiple SQL Server files with a single RAID array

    - by Dr Giles M
    Originally posted on Stack Overflow, but re-worded. Imagine the scenario: for a database I have RAID arrays R: (MDF), T: (transaction log), and of course shared transparent usage of X: (tempdb). I've been reading around and get the impression that if you are using RAID, then adding multiple SQL Server NDF files sitting on R: within a filegroup won't yield any more improvements. Of course, adding another RAID array S: and putting an NDF file on that would.

    However, being a reasonably savvy software person, it's not unthinkable to hypothesise that, even for smaller MDFs sitting on one RAID array, SQL Server will perform growth and locking operations (for writes) on the MDF, so adding NDFs to the filegroup, even if they sat on R:, would distribute the locking and growth operations, allowing more throughput? Or does the time taken to reconstruct the data from distributed filegroups outweigh the benefits of reduced locking?

    I'm also aware that the behaviour and benefits may be different for tables/indexes/logs. Is there a good site that distinguishes the benefits of multiple files when RAID is already in place?
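
    For experimenting with this, a hedged T-SQL sketch (all names and sizes are placeholders): add extra data files to the same filegroup on R:, pre-sized so that autogrowth, and the locking that goes with it, stays rare:

        ALTER DATABASE MyDatabase
        ADD FILE
            (NAME = MyDatabase_Data2, FILENAME = N'R:\Data\MyDatabase_Data2.ndf',
             SIZE = 10GB, FILEGROWTH = 1GB),
            (NAME = MyDatabase_Data3, FILENAME = N'R:\Data\MyDatabase_Data3.ndf',
             SIZE = 10GB, FILEGROWTH = 1GB)
        TO FILEGROUP [PRIMARY];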

  • "Show hidden files" option is not working

    - by crazygamer
    OK, I know this is a very basic Windows thing, but I am asking it here in search of an answer. I plugged in my pendrive and it autoran. This turned the "show hidden files" option off; I mean, I am no longer able to see my hidden files, because Explorer doesn't apply the change when I re-enable the option. Which registry value has been modified? I have scanned my computer using 4 antivirus programs. BitDefender found and deleted something in the temporary folder; the rest didn't show anything. I have encountered this problem a few times before, but this time I don't want to format the drive ;-)
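
    The value this kind of autorun malware usually flips is CheckedValue under the SHOWALL key. A sketch of checking and restoring it from an elevated command prompt (inspect the current data first, since some variants also change the value's type from REG_DWORD to REG_SZ):

        reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\Advanced\Folder\Hidden\SHOWALL" /v CheckedValue

        reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\Advanced\Folder\Hidden\SHOWALL" /v CheckedValue /t REG_DWORD /d 1 /f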

  • Win 7 accessing large files uses 100% RAM

    - by user181276
    Running Win 7 64-bit SP1 with 8 GB RAM. I first noticed this problem when using the GUI to copy some large (5+ GB) files from one disk to another. What happens is the physical memory in use rises quite quickly to 100% and the system comes to a crawl. If I just start to access the file in a media player (it is a movie) the memory usage climbs up slowly but eventually reaches 100%. When copying the same files via XCOPY I do not have this problem. Using RAMMAP I see most of the memory usage is under "Mapped File" and is allocated under the "Active" column. If I select "Empty System Working Set" the RAM usage drops back down but then starts to climb back up. Any ideas on what I can check/test to eliminate this issue?

  • Basic video editor for AVCHD Lite files?

    - by davr
    I have a camera (Panasonic Lumix GF-1) that outputs "AVCHD Lite" files: 720p H.264 in an MTS container. I saw this question that said Movie Maker in Windows 7 supports AVCHD... but I just tried, and unfortunately it does not support AVCHD Lite. Are there any free or inexpensive non-linear video editors (NLEs) that can natively handle AVCHD Lite files, without requiring some 3rd-party driver? If not, are there any 3rd-party drivers that are especially stable? (In my experience they usually have problems. I got AVCHD Lite loading into VirtualDub using a 3rd-party plugin, but it's very slow, sometimes crashes, and seeking takes ages.)

  • Restore more than 250 files using DPM 2010 and PowerShell

    - by toryan
    I've got what should be a fairly simple task: restore the following files from DPM:

        D:\inetpub\wwwroot\*\index.*

    I followed the instructions in this TechNet wiki and pretty much thought I had it. Unfortunately, the New-SearchOption cmdlet can only return 250 results, and this search generates far more than that, so only the first 250 files were actually restored, which is no use to anybody. Does anyone know of a way to get around the 250-result limit? I guess it would be possible to get the subdirectories of D:\inetpub\wwwroot and loop through them in turn (see the sketch below), but I kind of want to keep this fairly simple, as it is only for this task.
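
    A sketch of that loop-per-subdirectory workaround. The cmdlet names come from the question and the TechNet recipe, but the exact parameters are assumptions to verify in your own DPM Management Shell ($ds, $rpStart and $rpEnd stand for the datasource and recovery-point objects the recipe already has you fetch):

        # One search per subdirectory keeps each result set under the 250 cap.
        $subdirs = Get-ChildItem "D:\inetpub\wwwroot" | Where-Object { $_.PSIsContainer }
        foreach ($dir in $subdirs) {
            $so = New-SearchOption -FromRecoveryPoint $rpStart -ToRecoveryPoint $rpEnd `
                -SearchDetail FilesFolders -SearchType contains `
                -SearchString "index." -Location $dir.FullName -Recursive
            $items = Get-RecoverableItem -Datasource $ds -SearchOption $so
            # Hand each batch to Recover-RecoverableItem exactly as in the
            # original recipe.
        }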

  • rsync error: some files/attrs were not transferred

    - by Daniel Ball
    Using rsync (Ubuntu) and a DeltaCopy server on W2K3 to back up some of the data on the file server before I migrate from W2K3 to Ubuntu Server. After it completed, I ran a dry run just in case something had been missed or changed, and got the following:

        sudo rsync -az -n 198.3.9.25::Music /mnt/raid/music
        [sudo] password for daniel:
        file has vanished: "?????\#267????" (in Music)
        file has vanished: "????????" (in Music)
        ...
        rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1526) [generator=3.0.7]

    I just want to make sure I'm reading it right: that somehow there are files on the receiving end that aren't on the sending end?
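
    One hedged guess worth testing: the runs of ? suggest non-ASCII filenames whose character set differs between the Windows module and the Linux side, and rsync 3.x can translate names in flight with --iconv (both ends must support the option, so this may not work against an older DeltaCopy build):

        sudo rsync -az -n --iconv=utf-8,windows-1252 198.3.9.25::Music /mnt/raid/music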

  • Using ServletOutputStream to write very large files in a Java servlet without memory issues

    - by Martin
    I am using IBM WebSphere Application Server v6 and Java 1.4, and am trying to write large CSV files to the ServletOutputStream for a user to download. Files range from 50 to 750 MB at the moment. The smaller files aren't causing too much of a problem, but with the larger files it appears that the data is being written into the heap, which then causes an OutOfMemory error and brings down the entire server. These files can only be served to authenticated users over HTTPS, which is why I am serving them through a servlet instead of just sticking them in Apache. The code I am using is (some fluff removed around this):

        resp.setHeader("Content-length", "" + fileLength);
        resp.setContentType("application/vnd.ms-excel");
        resp.setHeader("Content-Disposition","attachment; filename=\"export.csv\"");

        FileInputStream inputStream = null;
        try {
            inputStream = new FileInputStream(path);
            byte[] buffer = new byte[1024];
            int bytesRead = 0;
            do {
                bytesRead = inputStream.read(buffer, offset, buffer.length);
                resp.getOutputStream().write(buffer, 0, bytesRead);
            } while (bytesRead == buffer.length);
            resp.getOutputStream().flush();
        } finally {
            if (inputStream != null)
                inputStream.close();
        }

    The FileInputStream doesn't seem to be causing the problem: if I write to another file, or just remove the write completely, the memory usage doesn't appear to be a problem. What I am thinking is that resp.getOutputStream().write is being stored in memory until the data can be sent through to the client. So the entire file might be read and stored in resp.getOutputStream(), causing my memory issues and crashing!

    I have tried buffering these streams and also tried using Channels from java.nio, none of which seems to make any difference to my memory issues. I have also flushed the output stream once per iteration of the loop and after the loop, which didn't help.
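
    For comparison, a sketch of the conventional copy loop (Java 1.4 style, so no generics or try-with-resources). Two things differ from the posted code: the loop runs until read returns -1 rather than stopping at the first short read (which can otherwise truncate the download mid-file), and nothing is retained beyond one small buffer per request:

        import java.io.FileInputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;
        import javax.servlet.http.HttpServletResponse;

        public class CsvStreamer {
            /** Streams a file to the response one buffer at a time. */
            public static void streamFile(HttpServletResponse resp, String path,
                                          long fileLength) throws IOException {
                resp.setHeader("Content-Length", String.valueOf(fileLength));
                resp.setContentType("application/vnd.ms-excel");
                resp.setHeader("Content-Disposition",
                               "attachment; filename=\"export.csv\"");

                InputStream in = new FileInputStream(path);
                try {
                    OutputStream out = resp.getOutputStream();
                    byte[] buffer = new byte[8192];
                    int bytesRead;
                    while ((bytesRead = in.read(buffer)) != -1) {
                        out.write(buffer, 0, bytesRead);
                    }
                    out.flush(); // one flush at the end is enough
                } finally {
                    in.close();
                }
            }
        }

    If memory still climbs with a loop like this, the buffering is happening in the container or in an HTTPS/compression layer rather than in the servlet itself, which at least narrows down where to look.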

  • SQL Server 2005 standard filegroups / files for performance on SAN

    - by Blootac
    OK, so I've just been on a SQL Server course and we discussed the usage scenarios of multiple filegroups and files when in use over local RAID and local disks, but we didn't touch SAN scenarios, so my question is as follows.

    I currently have a 250 GB database running on SQL Server 2005 where some tables have a huge number of writes and others are fairly static. The database and all objects reside in a single filegroup with a single data file. The log file is also on the same volume. My interpretation is that separate data files should be used across different disks to lessen disk contention, and that filegroups should be used for partitioning of data. However, with a SAN you obviously don't really have the same issue of disk contention that you do with a small RAID setup (or at least we don't at the moment), and Standard Edition doesn't support partitioning.

    So in order to improve parallelism, what should I do? My understanding of various Microsoft publications is that if I increase the number of data files, separate threads can act across each file separately. Which leads me to the question: how many files should I have? One per core? Should I be putting tables and indexes with high levels of activity in separate filegroups, each with the same number of data files as we have cores? Thank you.
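
    As a starting point before adding files, a sketch of inventorying the current layout (SQL Server 2005 and later; run inside the database in question). The one-file-per-core rule of thumb is widely debated, so measuring the existing layout and contention first is usually worthwhile:

        -- size is stored in 8 KB pages; convert to MB for readability.
        SELECT name, physical_name, type_desc, size * 8 / 1024 AS size_mb
        FROM sys.database_files;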

  • Xcode custom project template creates only references to files

    - by user315374
    Hi, I tried to create a custom project template for setting up unit testing. The problem is that when I create a new project based on this template, it creates references to the template files: when I edit a file, it changes my template files instead of my actual project files, and when I delete my template, files from my actual project turn red. The project template is in:

        /Developer/Platforms/iPhoneOS.platform/Developer/Library/Xcode/Project Templates/Application/Test-based Application

    Reading some questions on Stack Overflow, I tried to install my template project in

        /Library/Application Support/Developer/Shared/Xcode/Project Templates

    but I cannot see my template when I create a new project. Can anybody help? Thanks, Vincent

  • LAMP/TURNKEY LINUX/VIRTUAL BOX: Manipulating Files on a Virtual Machine

    - by aeid
    I am running Ubuntu 9.10 and I want to install TurnKey Linux's LAMP server on my machine to test out my code. I installed the TurnKey LAMP appliance via VirtualBox and it seems to be working, because I can access http://localhost. My question is: how do I manipulate files on the virtual machine? For example, if I had installed LAMP directly on my machine (not on a virtual machine), I could easily add/edit/delete files in the /var/www folder. Where is the equivalent of the /var/www folder for the VirtualBox guest, and how can I interface with it? Thanks,
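
    The folder lives inside the guest's own filesystem, not on the host, so one common approach is to move files over SSH (a sketch; the IP is a placeholder for whatever the appliance's console shows, and TurnKey appliances generally ship with SSH enabled):

        # Copy a site into the guest's web root over SSH:
        scp -r ./mysite/ root@192.168.1.50:/var/www/

        # Or mount the guest's web root on the Ubuntu host with sshfs,
        # so it can be edited like a local folder:
        sshfs root@192.168.1.50:/var/www ~/vm-www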

  • MySQL Backup: Can I copy individual MyISAM table files to another server with a different MySQL version?

    - by Komputer
    What I mean by copying individual MyISAM table files is: shut down mysqld and copy the .frm, .myd, and .myi files from one database folder to another. Questions: (a) can I use this method to back up a MySQL database folder from one server to another server with a different MySQL version? (b) can these backup files be moved to a different OS (for example, Debian to CentOS)? Thanks in advance.
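
    For reference, a sketch of the cold-copy approach, with placeholder paths and hostnames; running mysql_upgrade afterwards lets the destination server check and adjust the tables for its version (a logical dump with mysqldump remains the safer road across big version gaps):

        # On the source server: stop mysqld so the table files are consistent.
        sudo /etc/init.d/mysql stop
        rsync -av /var/lib/mysql/mydatabase/ newserver:/var/lib/mysql/mydatabase/
        sudo /etc/init.d/mysql start

        # On the destination server, after restarting its mysqld:
        mysql_upgrade -u root -p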

  • Best way to rip DVD movies to ISO files

    - by alex
    I'm trying to back up my DVD collection. I have HandBrake, and will eventually experiment with the best settings to use. For now, I'd like to back up the DVDs to ISO files that I can mount and then use HandBrake on later, or burn back onto DVD should the original get damaged. I have a WD TV box that is capable of playing ISO files as well. What's the best program for doing this? I'm not so much concerned with file size.

  • How to extract corrupted RAR files

    - by Anyname Donotcare
    I am trying to extract a file from a set of .RAR parts, and I always get a "corrupted" error, so I have tried several programs to extract the corrupted RAR files. I have 10 parts, each 400 MB. The last program I used was Recovery Toolbox for RAR, which gave me the following message:

        Severity: High
        Message: Recovery error for file XXXX with message set file pointer error.

    The program extracted 2 GB of data, but it should have extracted 4 GB. Is there any way to extract the content, or am I supposed to download the files again from another server?
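
    One thing worth trying before re-downloading everything: the command-line unrar tool can test each volume to find the damaged one, and its -kb switch keeps whatever partial files extract instead of deleting them on checksum failure (a sketch; the archive name is a placeholder):

        # Identify which volume is actually damaged:
        unrar t archive.part01.rar

        # Extract anyway, keeping broken files instead of discarding them:
        unrar x -kb archive.part01.rar

    If the test shows only one part is bad, re-downloading just that volume is usually enough.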

  • Dreamweaver Files uploaded to Win 2008 server cause login prompt

    - by Lil
    I have a customer who uses a 4-year-old version of Dreamweaver to edit her web pages. My hosting reseller account is with a company that uses Windows Server 2008. Every time my customer edits a page and uploads it, I have to manually set the permissions for that file to be readable, from the site's control panel. The customer is furious with me because her files cause the login prompt. I am able to upload files myself that remain readable on the site, with both FileZilla and FrontPage. I am assuming that her Dreamweaver settings are the cause of the problem, but I don't have that program myself and don't know what to advise her. Any suggestions?
