Search Results

Search found 62606 results on 2505 pages for 'sql files'.


  • Bsplayer - load audio tracks from external files

    - by torran
    I have a movie file with the following MediaInfo report:

        Video
        ID                        : 1
        Format                    : AVC
        Format/Info               : Advanced Video Codec
        Format profile            : [email protected]
        Format settings, CABAC    : Yes
        Format settings, ReFrames : 5 frames
        Muxing mode               : Container [email protected]
        Codec ID                  : V_MPEG4/ISO/AVC
        Duration                  : 54mn 13s
        Bit rate                  : 3 380 Kbps
        Nominal bit rate          : 3 459 Kbps
        Width                     : 1 280 pixels
        Height                    : 720 pixels
        Display aspect ratio      : 16:9
        Frame rate                : 23.976 fps
        Resolution                : 8 bits
        Colorimetry               : 4:2:0
        Scan type                 : Progressive
        Bits/(Pixel*Frame)        : 0.153
        Stream size               : 1.28 GiB (88%)
        Writing library           : x264 core 88 r1471 1144615

        Audio
        ID                        : 2
        Format                    : AC-3
        Format/Info               : Audio Coding 3
        Codec ID                  : A_AC3
        Duration                  : 54mn 16s
        Bit rate mode             : Constant
        Bit rate                  : 384 Kbps
        Channel(s)                : 6 channels
        Channel positions         : Front: L C R, Side: L R, LFE
        Sampling rate             : 48.0 KHz
        Stream size               : 149 MiB (10%)

    There are additional audio files (.mp3 and .ac3) in the same folder. How can I load them with BSPlayer? Right click > Audio > Audio streams is empty. If I open the movie with Media Player Classic, I can switch audio tracks.

  • Rsync fails for files that start with underscore when destination is zfs

    - by Eric
    Hi, everyone. I'm using rsync 3.1.0pre1 on Mac OS X 10.8.5 and am trying to rsync one folder to another. The destination is a ZFS volume mounted via SMB. The problem I'm having is that files whose names start with an underscore (e.g., '_filename.jpg') are not being successfully synced to the destination. I get the following error message:

        rsync: mkstemp "/path/to/destination/._filename.jpg.NUgYJw" failed: Permission denied (13)

    In this case, '_filename.jpg' does not make it to the destination. I understand that rsync creates hidden, temporary files at the destination which are prefixed with '.' and have a random extension appended on the end, but the original filename starts with '_', not '.', and I haven't asked rsync to copy extended attributes / resource forks over (unless it always does). The rsync command I'm using is:

        rsync -avE --exclude='.DS_Store' --exclude '.Trash' --exclude 'Thumbs.db' --exclude '._*' --delete /source/ /destination/

    Has anyone found a way around this problem? Thank you!
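
    One workaround worth trying (a sketch, not a confirmed fix; the paths are the placeholders from the question): rsync's --inplace option writes straight into the destination file instead of building a dot-prefixed temporary file first, which sidesteps the mkstemp call that is being denied:

        rsync -avE --inplace --exclude='.DS_Store' --exclude '.Trash' --exclude 'Thumbs.db' --exclude '._*' --delete /source/ /destination/

    The trade-off is that an interrupted transfer leaves a partially updated file at the destination rather than an untouched original.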

  • Looking for a way to execute a task on all files in a directory (recursively) on Windows

    - by stzzz1
    I have a huge number of .mp4 video files that need a volume boost. I need a way to run an ffmpeg audio filter on all files under a specified base directory (and its subdirectories). My problem is that I'm working on a Windows computer and have no knowledge of its shell syntax. I would like to do the equivalent of this bash script:

        TARGET_FILES=$(find /path/to/dir -type f -name '*.mp4')
        for f in $TARGET_FILES
        do
            ffmpeg -i "$f" -af 'volume=4.0' "output.$f"
        done

    I spent quite some time this afternoon looking for a solution, but the recursive part (so simple with find!) isn't clear to me on Windows. Any help would be greatly appreciated!
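
    A rough Windows translation (a sketch, untested; the path and output naming are placeholders). In PowerShell, Get-ChildItem -Recurse plays the role of find:

        Get-ChildItem -Path 'C:\path\to\dir' -Filter *.mp4 -Recurse |
            ForEach-Object {
                # write the boosted copy next to the original under a new name,
                # so ffmpeg never overwrites the file it is reading
                ffmpeg -i $_.FullName -af "volume=4.0" "$($_.DirectoryName)\boosted_$($_.Name)"
            }

    The same idea in a .bat file is for /R "C:\path\to\dir" %%f in (*.mp4) do ..., but PowerShell handles spaces in file names with less quoting pain.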

  • Recovering database files from a corrupted VHD

    - by Apocalypse9
    We have a SQL Server hosted on a virtual machine. Our hosting company updated and restarted the server, and for some reason the virtual machines became unbootable. We've spoken to Microsoft and used a few higher-level tools to attempt to recover the virtual machines, but were unsuccessful. When browsing the file system, the database folder doesn't even appear. I'm wondering if there are any lower-level tools that might be able to find and copy the database files. As far as I know the physical hard drive is OK, so I'm hoping there may be some way to recover the files themselves even if the rest of the virtual machine's file system is a loss. Obviously we're in a bit of a bind, and any help or suggestions are very much appreciated.

  • Uploading many large files to a remote server

    - by TiernanO
    I am in the process of creating an offsite backup and need to do an initial load of data. Currently that's about 400 GB, give or take 10 GB or so. The backup system produces files of about 4 GB each, plus some other, smaller related files. So I need to transfer all 400-ish gigs to a remote server, but how? What is the best method? I have full remote access to the server, so I can install anything I need to install. There are Windows, Linux and a Solaris VM running on the box itself, so any of those can be used there, and I have Windows and Linux at home. I have 2 internet connections in the house, 10 Mb/s upload on each, so something that could split the transfer across connections would be handy (kind of like GetRight, but in reverse... PutRight?).
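
    One possibility (a sketch, assuming SSH access to the remote box and lftp on the Linux machine at home; the host and paths are placeholders): lftp's reverse mirror uploads a whole tree, runs several transfers in parallel, and resumes interrupted files:

        lftp -e "mirror -R -c --parallel=4 /local/backup /remote/backup; quit" sftp://user@backup.example.com

    Note this parallelizes across files on one link; actually splitting traffic between the two home connections would additionally need routing or bonding on that side.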

  • Is using .h as a header for a C++ file wrong?

    - by Chris Huang-Leaver
    Is using .h as the extension for a C++ header wrong? I see it all over the place, especially in code written in a C style. I noticed that Emacs always selects C highlighting for a .h header, but C++ highlighting for .hpp or .hh. Is it actually "wrong" to label your headers .h, or is it just something which annoys me? EDIT: There is a good(ish) reason why this annoys me: if my project files are labelled .hpp and .cpp I can get away with 'grep something *pp' etc.; otherwise I have to spell out both '*.h' and '*.cpp'.

  • In need of a Smarter Environmental Package Configuration

    - by Jeremy Liberman
    I am trying to set up a package template in SSIS, following the Wrox Programmer to Programmer book SQL Server 2008 Integration Services: Problem - Design - Solution. I'm really liking this book even though it covers 2008 and we're using SQL Server 2005. I've got a working package template that uses an indirect XML package configuration to identify which environment (local developer, dev, QA, production, etc.) the package is being run in. That locates the SQL Server package configuration for the environment. That set-up is great, except for the environment variable at the very front of it all. My team would prefer the package to use the same environment resource locator that all our other applications and tools use, so we don't have two environment markers with essentially the same information in them. Normally we look up a registry key under HKEY_LOCAL_MACHINE, but the Registry package configuration type only reads from HKEY_CURRENT_USER. My first thought was to write a new package configuration type that extends the Registry type; after all, we'd had such luck writing our own custom log provider, and SSIS is super extensible, right? But there doesn't seem to be a way to write your own package configuration types. Is there still some way I can point my SSIS SQL Server package configuration at a connection string stored in an HKLM registry key? If this is not possible, what workarounds are available? My idea is to write a PowerShell script that creates or updates the environment variable the package uses by fetching the connection string from the registry. That way there are still two markers, but at least the second one is maintained automatically. Is this kind of workaround necessary? Thank you for your time.
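
    A minimal sketch of that PowerShell workaround (the registry path, value name, and variable name below are invented for illustration):

        # read the connection string our other tools already use from HKLM
        $key  = 'HKLM:\SOFTWARE\OurCompany\Environment'
        $conn = (Get-ItemProperty -Path $key -Name ConfigConnectionString).ConfigConnectionString

        # publish it as the machine-level environment variable the indirect
        # package configuration points at
        [Environment]::SetEnvironmentVariable('SSIS_CONFIG', $conn, 'Machine')

    It must run elevated, and services such as SQL Agent only see the new value after they restart.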

  • searching for hidden files using winapi

    - by Kristian
    Hi. I want to search for hidden files and directories under a specific given path, but I don't know how to do it for hidden files; I do know how to search for normal files and directories. I wrote the code below, but I'm stuck and can't make it search for only hidden files:

        #include "stdafx.h"
        #include <windows.h>

        int _tmain(int argc, _TCHAR* argv[])
        {
            TCHAR *fn;
            fn = L"d:\\*";
            HANDLE f;
            WIN32_FIND_DATA data;
            {
                FILE_ATTRIBUTE_HIDDEN;   // meant as a filter, but this statement has no effect
            }
            f = FindFirstFile(fn, &data);
            if (f == INVALID_HANDLE_VALUE) {
                printf("not found\n");
                return 0;
            }
            else {
                _tprintf(L"found this file: %s\n", data.cFileName);
                while (FindNextFile(f, &data)) {
                    _tprintf(L"found this file: %s\n", data.cFileName);
                }
            }
            FindClose(f);
            return 0;
        }
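
    A working variant (a sketch: the same loop, but testing the dwFileAttributes word that FindFirstFile already fills in, since the API has no built-in "hidden only" filter):

        #include <windows.h>
        #include <tchar.h>
        #include <stdio.h>

        int _tmain(int argc, _TCHAR* argv[])
        {
            WIN32_FIND_DATA data;
            HANDLE f = FindFirstFile(L"d:\\*", &data);
            if (f == INVALID_HANDLE_VALUE) {
                printf("not found\n");
                return 0;
            }
            do {
                // keep only entries carrying the HIDDEN attribute bit
                if (data.dwFileAttributes & FILE_ATTRIBUTE_HIDDEN)
                    _tprintf(L"found hidden: %s\n", data.cFileName);
            } while (FindNextFile(f, &data));
            FindClose(f);
            return 0;
        }

    Recursing into subdirectories would mean calling the same routine for every entry with FILE_ATTRIBUTE_DIRECTORY set (skipping "." and "..").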

  • Which is faster in memory, ints or chars? And file-mapping or chunk reading?

    - by Nick
    Okay, so I've written a (rather unoptimized) program before to encode images to JPEGs; however, now I am working with MPEG-2 transport streams and the H.264-encoded video within them. Before I dive into programming all of this, I am curious what the fastest way to deal with the actual file is. Currently I am file-mapping the .mts file into memory to work on it, although I am not sure if it would be faster to (for example) read 100 MB of the file into memory in chunks and deal with it that way. These files require a lot of bit-shifting and such to read flags, so I am wondering: when I reference some of the memory, is it faster to read 4 bytes at once as an integer or 1 byte at a time as a character? I thought I read somewhere that x86 processors are optimized for 4-byte granularity, but I'm not sure if this is true... Thanks!

  • Files copying between servers by creation time

    - by driftux
    My bash scripting knowledge is very weak, which is why I'm asking for help here. What is the most efficient bash script, performance-wise, for finding and copying files from one Linux server to another, given the specification below? I need a script that finds only new files, created in the last 0-10 minutes, in directories named "Z" on server A, and transfers them to server B. I think it can be done by building and executing a command like

        scp /X/Y.../Z/file root@hostname:/X/Y.../Z/

    for each new file found. If the script finds no matching remote path on server B, it should skip that file and continue with the next one whose directory exists. Files should be copied with permissions, group, owner and creation time preserved. (/X/Y... stands for various directory paths.) I want to set up a cron job to execute this script every 10 minutes, so performance is very important. Thank you.
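
    A starting-point sketch (assuming password-less SSH keys for root@hostname, and that "new" can be approximated by modification time, since most Linux filesystems don't record a separate creation time):

        #!/bin/bash
        # files changed in the last 10 minutes inside any directory named Z under /X
        find /X -type d -name Z -print0 | while IFS= read -r -d '' dir; do
            find "$dir" -maxdepth 1 -type f -mmin -10 -print0 | while IFS= read -r -d '' f; do
                # -p keeps modification times and permission modes; scp cannot
                # preserve owner/group -- use rsync -a instead if those matter.
                # scp fails when the remote directory is missing, so just log it.
                scp -p "$f" "root@hostname:$(dirname "$f")/" || echo "skipped $f" >&2
            done
        done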

  • How do I effectively write to 146 output files in C++ using the cstdlib library

    - by Elpezmuerto
    I have a very large binary file and I need to create separate files based on the id within the input file. There are 146 output files and I am using cstdlib and fopen and fwrite. FOPEN_MAX is 20, so I can't keep all 146 output files open at the same time. I also want to minimize the number of times I open and close an output file. How can I write to the output files effectively? I also must use the cstdlib library due to legacy code.
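
    One approach (a sketch; the idea is a small pool of open handles that closes the least recently used file when full, and the out_<id>.bin names are invented). Opening with "ab" means a file that gets reopened is appended to rather than truncated:

        #include <cstddef>
        #include <cstdio>
        #include <map>

        class FilePool {
            static const std::size_t kMaxOpen = 16;   // safely below FOPEN_MAX (20)
            std::map<int, FILE*> open_;               // id -> open handle
            std::map<int, unsigned long> lastUse_;    // id -> logical timestamp
            unsigned long clock_;
        public:
            FilePool() : clock_(0) {}
            ~FilePool() {
                for (std::map<int, FILE*>::iterator it = open_.begin(); it != open_.end(); ++it)
                    std::fclose(it->second);
            }
            FILE* get(int id) {
                std::map<int, FILE*>::iterator it = open_.find(id);
                if (it == open_.end()) {
                    if (open_.size() >= kMaxOpen)
                        evictOldest();
                    char name[32];
                    std::sprintf(name, "out_%d.bin", id);
                    FILE* f = std::fopen(name, "ab");
                    if (!f) return NULL;
                    it = open_.insert(std::make_pair(id, f)).first;
                }
                lastUse_[id] = ++clock_;
                return it->second;
            }
        private:
            void evictOldest() {
                int victim = -1;
                unsigned long oldest = (unsigned long)-1;
                for (std::map<int, unsigned long>::iterator it = lastUse_.begin(); it != lastUse_.end(); ++it)
                    if (open_.count(it->first) && it->second < oldest) {
                        oldest = it->second;
                        victim = it->first;
                    }
                if (victim != -1) {
                    std::fclose(open_[victim]);
                    open_.erase(victim);
                }
            }
        };

    Writing a record then becomes fwrite(buf, 1, len, pool.get(id)) (with a null check in real code). If the ids in the input arrive clustered, this opens and closes files rarely; the worst case degrades to one open/close per record.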

  • Processing a database queue across multiple threads - design advice

    - by rwmnau
    I have a SQL Server table full of orders that my program needs to "follow up" on (call a web service to see if something has been done with them). My application is multi-threaded, and could have instances running on multiple servers. Currently, every so often (on a threading timer), the process selects 100 rows at random (ORDER BY NEWID()) from the list of "unconfirmed" orders and checks them, marking off any that come back successfully. The problem is that there's a lot of overlap between the threads, and between the different processes, and there's no guarantee that a new order will get checked any time soon. Also, some orders will never be "confirmed" and are dead, which means they get in the way of orders that do need to be confirmed, slowing the process down if I keep selecting them over and over. What I'd prefer is that all outstanding orders get checked, systematically. I can think of two easy ways to do this:

    1. The application fetches one order to check at a time, passing in the last order it checked as a parameter, and SQL Server hands back the next unconfirmed order. More database calls, but this ensures that every order is checked in a reasonable timeframe. However, different servers may re-check the same order in succession, needlessly.

    2. SQL Server keeps track of the last order it asked a process to check up on, maybe in a table, and gives a unique order to every request, incrementing its counter. This involves storing the last order somewhere in SQL, which I wanted to avoid, but it also ensures that threads won't needlessly check the same orders at the same time.

    Are there any other ideas I'm missing? Does this even make sense? Let me know if I need to clarify.
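
    A third pattern worth weighing (a sketch; the table and column names are invented, and this is not from the question itself): let locking hints do the coordination. READPAST makes each worker skip rows another worker has claimed, and ordering by the last-checked time guarantees every live order cycles through:

        DECLARE @OrderID int;

        BEGIN TRAN;
        -- claim the order that has waited longest; rows locked by other
        -- workers are skipped rather than waited on
        SELECT TOP (1) @OrderID = OrderID
        FROM dbo.Orders WITH (UPDLOCK, READPAST, ROWLOCK)
        WHERE Confirmed = 0
        ORDER BY LastCheckedAt;

        UPDATE dbo.Orders
        SET LastCheckedAt = GETUTCDATE()
        WHERE OrderID = @OrderID;
        COMMIT;

        -- call the web service for @OrderID outside the transaction,
        -- then set Confirmed = 1 on success

    Because LastCheckedAt pushes a checked order to the back of the line, dead orders stop crowding out fresh ones, and nothing new is stored beyond one datetime column.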

  • Pushing Large Files to 500+ Computers [closed]

    - by WMIF
    I work with a team to manage 500-600 rented Windows 7 computers for an annual conference. We have a large amount of data that needs to be synced to these computers, up to 1 TiB. The computers are divided into rooms and connected through unmanaged gigabit switches. We prepare these computers ahead of time with the Windows installation and configuration, plus any files that we have available to us before we send the base image in for replication by the rental company. Every year, presenters approach us on site with gigabytes of data that need to be pushed to the room they will be presenting in. Sometimes it's only a few small files, such as a slide PDF, but it can be as much as 5 GiB. Our current strategy for pushing these files uses batch scripts and RoboCopy. For the large pushes, we actually use a BitTorrent client to generate a torrent file, and then we use the batch/RoboCopy push to drop the torrent into a folder on the remote machines that is monitored by an installed BT client. Often this data needs to be pushed immediately, within a small time window. We have several machines in a control room, identical to the machines on the floor, that we use for these pushes. We occasionally need to execute a program on the remote machines, and we currently use batch and PsExec to handle this task. We would love to be able to respond to these last-minute pushes with "sorry, your own fault", but it won't happen. The BT method has given us a much faster response time, but the whole batch process gets messy when multiple jobs are being pushed. We use Enterprise Ghost for other processes, but it doesn't work well at this scale, and it is really quite expensive for a once-a-year task like this.

    EDIT: There is a hard requirement that the remote machines on the floor run Windows. The control machines have no hard OS requirement. I would really like to stay away from multicast because of complications with upstream routers. Is multicast or BitTorrent the better way to go on this? Is there another protocol that might work better?
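
    For comparison, the batch/RoboCopy stage described above can stay quite small (a sketch; the machine list and folder names are invented):

        @echo off
        rem push one torrent file to every machine in a room; the BT client
        rem watching C:\Drop on each box then starts the real transfer
        for /f %%h in (rooms\room101.txt) do (
            robocopy "C:\Staging" "\\%%h\C$\Drop" lecture.torrent /R:2 /W:5 /NP /LOG+:push.log
        )

    robocopy's /R and /W keep one dead host from stalling the loop, and /LOG+ appends everything to a single log, which helps when several of these jobs run at once.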

  • testing directory S_ISDIR acts inconsistently

    - by coubeatczech
    Hi. I'm running simple tests on all files in a directory, but for some reason they sometimes behave wrongly. What's bad with my code?

        using namespace std;

        int main() {
            string s = "/home/";
            struct dirent * file;
            DIR * dir = opendir(s.c_str());
            while ((file = readdir(dir)) != NULL) {
                struct stat * file_info = new (struct stat);
                stat(file->d_name, file_info);
                if ((file_info->st_mode & S_IFMT) == S_IFDIR)
                    cout << "dir" << endl;
                else
                    cout << "other" << endl;
            }
            closedir(dir);
        }
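
    The likely culprit (a sketch of the fix, not from the original post): stat() is handed the bare entry name, so it resolves against the current working directory rather than /home/, and its return value is never checked, so st_mode is read from an unfilled buffer whenever the call fails. Prefixing the directory and checking for errors makes the test consistent; the new on every iteration was also a leak:

        #include <dirent.h>
        #include <sys/stat.h>
        #include <iostream>
        #include <string>

        int main() {
            std::string s = "/home/";
            DIR* dir = opendir(s.c_str());
            if (dir == NULL)
                return 1;
            struct dirent* file;
            while ((file = readdir(dir)) != NULL) {
                struct stat file_info;
                std::string full = s + file->d_name;   // resolve against /home/
                if (stat(full.c_str(), &file_info) != 0)
                    continue;                          // skip entries stat cannot reach
                if (S_ISDIR(file_info.st_mode))
                    std::cout << "dir" << std::endl;
                else
                    std::cout << "other" << std::endl;
            }
            closedir(dir);
            return 0;
        }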

  • How to programmatically compare permissions of login/user in SQL Server 2005

    - by titanium
    There's a login/user in SQL Server who is having a problem importing accounts on the production server. I have no idea what method he is using. According to the person doing the import, it works fine on the development server, but when he ran the same import in production it gave him errors. Below are the errors he gets for each account:

        2009-06-05 18:01:05.8254 ERROR [engine-1038] Task [1038:00001 - Members]: Step 1.0 [<Insert step description>]: Task.RunStep(): StoreRow has failed
        2009-06-05 18:01:05.9035 ERROR [engine-1038] Task [1038:00001 - Members]: Step 1.0 [<Insert step description>]: Task.RunStep(): StoreRow exception: Exception caught while storing Data. [Microsoft][ODBC SQL Server Driver][SQL Server]'ACCOUNT1' is not a valid login or you do not have permission.

    Please note that 'ACCOUNT1' is not the real account name; I changed it for security reasons. Using SQL Server Management Studio (SSMS), I viewed the permissions of the login performing the import on both the development and production servers for comparison, and found no difference. My question is: is there a way to programmatically query the server-level and database-level permissions of a particular login/user, so I can compare and contrast for any differences?
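
    A sketch of such a query against the permission catalog views (run it on both servers and diff the output; substitute the real login name):

        -- server-level permissions for one login
        SELECT pr.name, pe.class_desc, pe.permission_name, pe.state_desc
        FROM sys.server_principals pr
        JOIN sys.server_permissions pe
          ON pe.grantee_principal_id = pr.principal_id
        WHERE pr.name = 'ACCOUNT1';

        -- database-level permissions for the matching user (run in each database)
        SELECT dp.name, pe.class_desc, pe.permission_name, pe.state_desc
        FROM sys.database_principals dp
        JOIN sys.database_permissions pe
          ON pe.grantee_principal_id = dp.principal_id
        WHERE dp.name = 'ACCOUNT1';

    Role membership is a separate check, e.g. via sys.server_role_members and sys.database_role_members.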

  • Moving Images from Database to File System

    - by msarchet
    So, currently in our system we have been storing image files in the database (SQL Express 2005). Unfortunately no one foresaw that this would hit the maximum database size allowed by the SQL Express license. So I have proposed a plan of storing the images in the file system and only indexing where the file is in the database. The plan is to save the root path in our OptionsTable as something like ImagesRoot, and then save only the actual ImageID in the table, which is basically an FK from the PK of the record with the image. I have determined that it would be best to split this down into sub-directories inside of ImagesRoot, one per 1000 images, so the path is basically (ImageID / 1000)\(ImageID % 1000); e.g. if ImageID is 1999 it would live at %ImagesRoot%\1\999. I'm looking for any potential pitfalls of this system and anything that could be improved, as I am already getting some resistance from the owner of the company, who wants everything to be in databases; along those lines, I would also take reasons why it should all be in databases. I should mention we already have automated backups that run for all of our customers' databases and for any files generated by our program that must be kept over a period of time. (These are optional, but if someone isn't using our system it is expected that they are using their own, or data loss isn't our problem. It is if our system fails and they are using it!) Thanks

  • write a batch file to copy files from one folder to another folder

    - by user73628
    I have a storage folder on the network in which all users store their active data on a server. That server is going to be replaced by a new one because of space problems, so I need to copy the sub-folders and files from the old server's storage folder to the new server's storage folder, e.g. from \\Oldeserver\storage\data & files to \\New server\storage\data & files.
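
    robocopy (built into Windows Server 2008, and a free download for older versions) covers this in one line; a sketch, assuming the UNC paths from the question (the quotes matter because of the spaces and ampersands):

        @echo off
        rem /E copies all sub-folders, including empty ones
        rem /COPY:DATSO keeps data, attributes, timestamps, NTFS security and owner
        rem /R:2 /W:5 limits retries so locked files don't hang the job
        robocopy "\\Oldeserver\storage\data & files" "\\New server\storage\data & files" /E /COPY:DATSO /R:2 /W:5 /LOG:copy.log

    Copying owners (the O in /COPY:DATSO) requires running from an elevated prompt with backup/restore privileges.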

  • VS: Separating headers from source files?

    - by jco
    I know this is completely subjective, but I'm curious: do you use separate filters for headers and source files in your Visual Studio solutions? Visual Studio creates "Header Files" and "Source Files" filters by default. To me, this dichotomy causes more annoyance than anything else. What's your take on this?

  • Windows Server 2008 scheduled tasks cannot create files

    - by Nick Cartwright
    We have a series of tasks which, when run interactively from the command line, run fine, creating temporary files and (importantly) logs and backups. When we schedule the task with Administrator privileges to run at the highest priority, however, no logs or temporary files are created! All the directories have read/write privileges for administrator. Has anyone else experienced this? We are running Windows Server 2008 and the job is configured for 'Windows Vista or Windows Server 2008'. Any help would be much appreciated! OK - so we installed Z-Cron and it works perfectly. It is still a really strange error from the Windows 2008 Task Scheduler, but a solution is perhaps not quite so urgent now that we have Z-Cron working!

  • unable to transfer files from handy cam to PC

    - by user143989
    I am using a Windows 7 PC and a Sony DCR-SR88 Handycam. I need to transfer all my videos from the Handycam to my PC. When I try to connect it to the PC through USB, the PC detects the USB drive in the Handycam and shows the used memory, but when I open the folder it shows "folder is empty". How can I copy the files? I have tried the following: changed the USB cable; changed the USB port. I can play the videos on the Handycam, but the files are not visible on the PC when connected in USB mode. Please help, it's a bit urgent!

  • Boost Include Files in VC++

    - by Dr. K
    For the last few years, I have been exclusively a C# developer. Previously, I developed in C++ and have a C++ application that I built about 3 years ago using VS2005. It made extensive use of the Boost libraries. I recently decided to brush off the old app and rebuild it in VS2008 with the latest version of Boost (the latest version with the "easy" installation program from BoostPro Computing), 1.39. Previously when I had the program running I was at 1.33. Also, the last time the program was running was at least 2 OS installations ago. The Boost installation is located on my machine at "C:\Program Files\boost\boost_1_39". Anyway, I have done the following:

    1. Set the project's "Additional Include Directories" directory to "C:\Program Files\boost\boost_1_39"
    2. Added "C:\Program Files\boost\boost_1_39" to VS2008's Tools - Options - Projects and Solutions - VC++ Directories - Include Files

    I have a number of Boost includes in my stdafx.h file. The compiler fails upon attempting to open the first one:

        #include <boost/algorithm/string/string.hpp>

    I have confirmed that the above file is indeed located at "C:\Program Files\boost\boost_1_39\boost\algorithm\string\string.hpp", yet I continue to get:

        fatal error C1083: Cannot open include file: 'boost/algorithm/string/string.hpp': No such file or directory

    Any tips on what else to check would be greatly appreciated. Again, this is an application that compiled fine a few years ago, but the source has now been moved to a new machine/compiler.

  • Missing Memory on Windows Server 2008

    - by Chris Lively
    I have a Windows 2008 x64 server with 8 GB of RAM installed. Task Manager and Resource Monitor both insist that 7.5 GB of the RAM is in use. However, the memory list under Processes (Memory - Private Bytes) doesn't add up: I do have "Show processes from all users" checked, and adding the numbers by hand I come up with about 3.5 GB of RAM. I also looked at the latest copy of Sysinternals Process Explorer, and neither Private Bytes nor Working Set adds up to more than about 3.5 GB of RAM in use. What's going on?

    ===== Update: I bounced the server to see what would happen with the memory utilization. After boot and regular operations began, it sat at 3 GB of RAM usage. 18 hours later, it's back up to 6.8 GB of usage with no indication as to where the additional 3.5 GB or so of RAM is being used. Here are links to screen shots of the resource monitor and task manager: Resource Monitor, Task Manager.

    Update 2: Well, I believe I located the problem. When I detached one of the larger databases from my SQL Server, the amount of RAM shown as "in use" dropped drastically, while the Memory - Private Bytes count barely moved. So I'm guessing that SQL Server has some way of allocating memory that doesn't really show up in any of the monitors. I went further and created a new database file, then transferred all of the data from the one I detached. Even though it has the same data, and the same transactions going through it, the memory in use has stayed low. Maybe there was some corruption in the DB? I'll leave it to the DB gods and go searching for another "problem" ;)
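
    A plausible explanation for anyone who hits the same thing (a sketch, not confirmed from this server): SQL Server can allocate its buffer pool through AWE/locked pages, and that memory is not charged to sqlservr.exe's private bytes or working set, so Task Manager never attributes it to the process. Capping the instance keeps it from absorbing nearly all of the RAM; the 4096 MB below is only an example value:

        -- cap SQL Server's memory consumption (value in MB; choose your own ceiling)
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'max server memory (MB)', 4096;
        RECONFIGURE;

    The default for 'max server memory' is effectively unlimited, so an otherwise idle SQL Server will happily grow toward all 8 GB.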
