Search Results

Search found 40294 results on 1612 pages for 'dot files'.


  • Convert video files for home IIS server [closed]

    - by Jey
    I am finally learning to set up an IIS server (personal use only) and I thought it would be cool to have some videos on it for me to watch when I am away from home. Since I'm usually on 3G (iPhone) or work wifi, I'd like to convert them to an optimal format that will stream fast. The video files are mostly avi and mp4 (from 30 minutes to 2 hours in length). What would be an easy and fast way to go about doing this? Thanks.
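
    (One straightforward route is to re-encode everything to H.264/AAC MP4, which iPhones play natively; a minimal sketch with Python driving ffmpeg - ffmpeg must be installed, and the filenames and bitrate are placeholders, not taken from the question:)

      import subprocess

      # "-movflags +faststart" moves the MP4 index to the front of the
      # file so playback can begin before the whole file has downloaded.
      subprocess.check_call([
          "ffmpeg", "-i", "episode.avi",
          "-c:v", "libx264", "-b:v", "700k",  # modest bitrate for 3G
          "-c:a", "aac",
          "-movflags", "+faststart",
          "episode.mp4",
      ])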

    Read the article

  • How can I search for files on Ubuntu?

    - by asdffdg
    How can I search for files on Ubuntu? The usual search tool does not find anything. I have installed the Tracker search tool, and it too does not find anything. I tried to follow the instructions for enabling this tool by going to System > Preferences > Searching and Indexing, but where the hell is System > Preferences > Searching and Indexing? I found a program called Searching and Indexing, but it does not contain anything that is described in the instructions.
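
    (While the desktop indexers are broken, a quick fallback is a recursive filename search; a minimal Python sketch, where "report" is a placeholder search term:)

      from pathlib import Path

      # Walk the home directory and match filenames case-insensitively.
      pattern = "report"
      for path in Path.home().rglob("*"):
          if pattern in path.name.lower():
              print(path)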

    Read the article

  • Where are apt-get files stored?

    - by loolooyyyy
    I have an Ubuntu host with three virtual machines running inside it, also running Ubuntu. I update the host using apt-get update, but I can't update the VMs: it takes a long time and uses a lot of bandwidth, which I'm running out of. I want to transfer the files fetched by apt-get to the VMs - could you please tell me where they are? I'm not talking about the packages themselves, which are stored in /var/cache/apt/archives; I want the files that store the list of available packages on the mirror I have selected. Thanks a lot. PS: I know this question has been asked somewhere, but I can't find it!

    Read the article

  • SQL Server Semantic Search to Find Text in External Files

    Sometimes it is necessary to search for specific content inside documents stored in a SQL Server database. Is it possible to do this in SQL Server? Can I run T-SQL queries and find content inside Microsoft Word files? Yes - with SQL Server 2012 you can do a semantic search.

    Read the article

  • Win7 playback of dvr-ms files stutters

    - by Jim Lynn
    I've just had to install Windows 7 on my Media Center machine because my Vista installation had a faulty drive. I've got the latest drivers that I can find - Intel 945GM integrated Graphics, Realtek audio drivers. Things are working OK with one exception. Playback of old recordings, from dvr-ms format files, is choppy. The picture freezes for a fraction of a second, then quickly catches up. The sound is uninterrupted and doesn't pause. These freezes happen once every 5 seconds or so. It's very regular. Playback of Live TV from the digital tuner is perfectly smooth. DVD playback is perfectly smooth.

    As an experiment, I used the MPEG editing package VideoReDo to create a small test file in three different formats. This program takes the raw MPEG streams and repackages them into the desired container. I took the same clip and created three files in three formats: dvr-ms (Microsoft's old recorded TV format); mpg (standard MPEG); and ts (raw MPEG transport stream of the kind often produced by PVRs). When these three files are played back under Windows 7, the mpg and ts files play smoothly, but the dvr-ms file stutters.

    The last piece of data I have is that two other Windows 7 machines can play back dvr-ms files smoothly with no stuttering. One is a netbook, with less grunt than the media centre. So there must be something specific about my Media Center machine that's causing the problem. Does anyone have any idea where I can look now? I don't know much about AV software, codecs, filter graphs etc. but I suspect that's where the problem lies. Rendering the video isn't the problem, but extracting the streams is. How would I go about diagnosing the problem?

    Edited to add: I just used the GraphStudio tool to look at the filter graph on the offending PC. The filter graph it uses by default for dvr-ms looks identical to the other machines, and, interestingly, when I play the files using GraphStudio they run smoothly. Under Windows Media Player and Windows Media Center they stutter. I'd like to see the filter graph for WMP but GraphStudio won't show it. It looks like WMP and WMC are using a different decoding path to GraphStudio.

    Edited again to add: Today I purchased a new HDTV. The same Media Center driving the TV at 1080p is now playing back the old Recorded TV files smoothly, without stuttering. So whatever the cause of the original problem, using a different resolution seems to have removed the problem. It might also explain why nobody else has had this problem. I doubt many people use Media Centre with a 14in portable TV.

    Read the article

  • Can someone provide an example of seeking, reading, and writing a >4GB file using boost iostreams

    - by Queueless
    I have read that Boost.Iostreams supposedly supports 64-bit access to large files in a semi-portable way. Its FAQ mentions 64-bit offset functions, but there are no examples of how to use them. Has anyone used this library for handling large files? A simple example of opening two files, seeking to their middles, and copying one to the other would be very helpful. Thanks.
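
    (The pattern the question asks about, sketched in Python - whose file objects take 64-bit offsets natively - with placeholder filenames: open two files, seek to the middle of the source, and copy the remainder across in chunks. A Boost.Iostreams version performs the same steps through its 64-bit stream_offset type.)

      import os

      SRC, DST = "big_input.bin", "big_output.bin"

      size = os.path.getsize(SRC)
      with open(SRC, "rb") as src, open(DST, "wb") as dst:
          src.seek(size // 2)  # offsets beyond 4 GB are fine here
          for chunk in iter(lambda: src.read(1 << 20), b""):  # 1 MiB chunks
              dst.write(chunk)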

    Read the article

  • Dreamweaver - template not recognising child files until they are opened

    - by Chris
    Hi. I've got a new section for a website, which I have generated from a data source, and it has markup for using a Dreamweaver template. When I add the new files and folders to the site and then update my template, it doesn't find the new files to update. If I open one of the new files and make a change in the template, it then recognises that the new file is using the template. So it's almost as if I have to touch all the files with Dreamweaver first. I've tried opening all the new files which need to use the template, but then Dreamweaver CS4 crashes, I presume because of the number of files it's opening. Anyway, does anyone know if there is a way to make Dreamweaver recognise that a block of new files belongs to the template? It doesn't seem to just work automatically. Thanks, Chris

    Read the article

  • Archiving (Ubuntu tar) hidden directories

    - by broiyan
    tar on a directory "mydir" will archive hidden files and hidden subdirectories, but tar from within "mydir" with a wildcard will not. Is this a longstanding and known inconsistency or bug, or is it that hardly anybody ever looks inside a lengthy tar log long enough to notice? Edit (additional information): tar from within "mydir" with a wildcard will not "see" or archive hidden files and hidden subdirectories in the immediate directory, with emphasis on "immediate". However, in the (obviously non-hidden) subdirectories of "mydir", hidden files and hidden subdirectories will be archived.
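
    (The behaviour comes from the shell rather than from tar: the wildcard is expanded by the shell before tar runs, and shell globs skip dotfiles in the directory being expanded, while everything inside the matched, non-hidden subdirectories is then archived recursively as normal. Python's glob module follows the same convention, which makes the effect easy to demonstrate:)

      import glob, os, tempfile

      # glob('*'), like a shell wildcard, skips top-level dotfiles;
      # os.listdir() sees them.
      d = tempfile.mkdtemp()
      for name in ("visible.txt", ".hidden.txt"):
          open(os.path.join(d, name), "w").close()

      print(sorted(os.listdir(d)))                    # both names listed
      print(sorted(glob.glob(os.path.join(d, "*"))))  # only visible.txt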

    Read the article

  • File Storage for Web Applications: Filesystem vs DB vs NoSQL engines

    - by El Yobo
    I have a web application that stores a lot of user-generated files. Currently these are all stored on the server filesystem, which has several downsides for me. When we move "folders" (as defined by our application) we also have to move the files on disk (although this is more due to strange design decisions on the part of the original developers than a requirement of storing things on the filesystem). It's hard to write tests for filesystem actions; I have a mock filesystem class that logs actions like move, delete etc. without performing them, which more or less does the job, but I don't have 100% confidence in the tests. I will be adding some other jobs which need to access the files from other services to perform additional tasks (e.g. indexing in Solr, generating thumbnails, movie format conversion), so I need to get at the files remotely. Doing this over network shares seems dodgy... Dealing with permissions on the filesystem has sometimes given us problems in the past, although now that we've moved to a pure Linux environment this should be less of an issue. What are the downsides of storing files as BLOBs in MySQL? I guess that it would massively increase the database size and reduce the effectiveness of caches, but are there other problems? Do the same problems exist with NoSQL systems like Cassandra? Does anyone have any other suggestions that might be appropriate?
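
    (For reference, the BLOB approach itself is simple; a self-contained sketch using Python's built-in sqlite3 - with MySQL the shape is the same against a LONGBLOB column, and the filename is a placeholder:)

      import sqlite3

      conn = sqlite3.connect("files.db")
      conn.execute("CREATE TABLE IF NOT EXISTS user_files "
                   "(id INTEGER PRIMARY KEY, name TEXT, body BLOB)")

      with open("example.pdf", "rb") as f:  # hypothetical user upload
          conn.execute("INSERT INTO user_files (name, body) VALUES (?, ?)",
                       ("example.pdf", f.read()))
      conn.commit()

      # Retrieval reads the whole blob into memory - one of the size and
      # caching trade-offs raised above.
      (body,) = conn.execute("SELECT body FROM user_files WHERE name = ?",
                             ("example.pdf",)).fetchone()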

    Read the article

  • Sorting files by name in Java differs from Windows Explorer

    - by Martyn Hopkins
    I have a simple Java program which reads a file directory and outputs a file list. I sort the files by name:

      String[] files = dirlist.list();
      files = sort(files);

    My problem is that it sorts by name in a different way than Windows Explorer does. For instance, suppose I have these files: abc1.doc, abc12.doc, abc2.doc. Java will sort them like this:

      abc1.doc
      abc12.doc
      abc2.doc

    When I open the folder in Explorer, my files are sorted like this:

      abc1.doc
      abc2.doc
      abc12.doc

    How can I make Java sort my files like Windows Explorer does? Is this a Windows trick?
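
    (Explorer sorts with a numeric-aware "natural" comparison - the shell's StrCmpLogicalW - so this is a different algorithm rather than a trick. The key idea, splitting names into digit and non-digit runs and comparing the digit runs as numbers, ports directly to a custom Java Comparator; a minimal sketch of the logic, shown here in Python for brevity:)

      import re

      def natural_key(name):
          # Split into digit/non-digit runs; compare digit runs as ints.
          return [int(tok) if tok.isdigit() else tok.lower()
                  for tok in re.split(r"(\d+)", name)]

      files = ["abc1.doc", "abc12.doc", "abc2.doc"]
      print(sorted(files))                   # abc1, abc12, abc2
      print(sorted(files, key=natural_key))  # abc1, abc2, abc12 (Explorer-like)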

    Read the article

  • Working with multiple input and output files in Python

    - by Morlock
    I need to open multiple files (2 input and 2 output files), do complex manipulations on the lines from the input files, and then append the results to the 2 output files. I am currently using the following approach:

      in_1 = open(input_1)
      in_2 = open(input_2)
      out_1 = open(output_1, "w")
      out_2 = open(output_2, "w")
      # Read one line from each 'in_' file
      # Do many operations on the DNA sequences included in the input files
      # Append one line to each 'out_' file
      in_1.close()
      in_2.close()
      out_1.close()
      out_2.close()

    The files are huge (each potentially approaching 1 GB), which is why I am reading through the input files one line at a time. I am guessing that this is not a very Pythonic way to do things. :) Would using the following form be good?

      with open("file1") as f1:
          with open("file2") as f2:
              # etc.

    If yes, could I do it while avoiding the highly indented code that would result? Thanks for the insights!
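
    (The nested form collapses into a single flat with statement on Python 2.7/3.1 or newer; a sketch with placeholder filenames, where the uppercasing stands in for the real sequence manipulation:)

      # One `with` manages all four files, so they are closed even if the
      # processing raises. In Python 2, use itertools.izip instead of zip
      # to keep the reading lazy for ~1 GB inputs.
      with open("input_1.txt") as in_1, open("input_2.txt") as in_2, \
           open("output_1.txt", "w") as out_1, open("output_2.txt", "w") as out_2:
          for line_1, line_2 in zip(in_1, in_2):
              out_1.write(line_1.upper())
              out_2.write(line_2.upper())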

    Read the article

  • Default permission for newly-created files/folders using ACLs not respected by commands like "unzip"

    - by Ngoc Pham
    I am having trouble setting up a system for multiple users accessing the same set of files. I've read tutorials and docs and played with ACLs, but haven't succeeded yet. MY SCENARIO: I have multiple users, for example user1 and user2, who belong to a group called sharedusers. They must all have WRITE permission to the same set of files and directories, living under /userdata/sharing/. I have the folder's group set to sharedusers and the SGID bit set so that all newly created files/dirs inside get the same group.

      ubuntu@home:/userdata$ ll
      drwxr-sr-x 2 ubuntu sharedusers 4096 Nov 24 03:51 sharing/

    I set ACLs for this directory so that sub-dirs/files inherit permissions from the parent:

      ubuntu@home:/userdata$ setfacl -m group:sharedusers:rwx sharing/
      ubuntu@home:/userdata$ setfacl -d -m group:sharedusers:rwx sharing/

    Here's what I've got:

      ubuntu@home:/userdata$ getfacl sharing/
      # file: sharing/
      # owner: ubuntu
      # group: sharedusers
      # flags: -s-
      user::rwx
      group::r-x
      group:sharedusers:rwx
      mask::rwx
      other::r-x
      default:user::rwx
      default:group::r-x
      default:group:sharedusers:rwx
      default:mask::rwx
      default:other::r-x

    Seems okay: when I create a new folder with new files inside, the permissions are correct.

      ubuntu@home:/userdata/sharing$ mkdir a && cd a
      ubuntu@home:/userdata/sharing/a$ touch a_test
      ubuntu@home:/userdata/sharing/a$ getfacl a_test
      # file: a_test
      # owner: ubuntu
      # group: sharedusers
      user::rw-
      group::r-x              #effective:r--
      group:sharedusers:rwx   #effective:rw-
      mask::rw-
      other::r--

    As you can see, the sharedusers group has an effective permission of rw-. HOWEVER, if I have a zip file and use the unzip -q command to extract it inside the sharing folder, the extracted folders don't have group write permission. Therefore, the users from group sharedusers cannot modify files under those extracted folders.

      ubuntu@home:/userdata/sharing$ unzip -q Joomla_3.0.2-Stable-Full_Package.zip
      ubuntu@home:/userdata/sharing$ ll
      drwxrwsr-x+  2 ubuntu sharedusers 4096 Nov 24 04:00 a/
      drwxr-xr-x+ 10 ubuntu sharedusers 4096 Nov  7 01:52 administrator/
      drwxr-xr-x+ 13 ubuntu sharedusers 4096 Nov  7 01:52 components/

    You can spot the difference in permissions between folder a (created before) and folder administrator (extracted by unzip). And the ACLs of a file inside administrator:

      ubuntu@home:/userdata/sharing$ getfacl administrator/index.php
      # file: administrator/index.php
      # owner: ubuntu
      # group: ubuntu
      user::rw-
      group::r-x              #effective:r--
      group:sharedusers:rwx   #effective:r--
      mask::r--
      other::r--

    It also has the ubuntu group, not the sharedusers group as expected. Could someone please explain the problem and give me advice? Thank you in advance!

    Read the article

  • Visual Studio 2010: Publish minified javascript files instead of the original ones

    - by salgiza
    I have a Scripts folder that includes all the .js files used in the project. Using the Ajax Minifier task, I generate a .min.js file for each one. Depending on whether the application is running in debug or release mode, I include the original .js file or the minified one. The Scripts folder looks like this:

      Scripts/script1.js
      Scripts/script1.min.js  // Outside the project, generated automatically on build
      Scripts/script2.js
      Scripts/script2.min.js  // Outside the project, generated automatically on build

    The .min.js files are outside the project (although in the same folder as the original files), and they are not copied into the destination folder when we publish the project. I have no experience whatsoever using build tasks (well, apart from including the minifier task), so I would appreciate it if anyone could advise me on the correct way to: 1) copy the .min.js files to the destination folder when I publish the app from Visual Studio; 2) delete / not copy the original .js files (this is not vital, but I'd rather not copy files that will not be used in the app). Thanks.

    Read the article

  • Sending files through a webservice

    - by Jay
    Hi, I have to send some files through a web service in C#. The files to be sent can be from different locations, i.e. there is one folder having 4 files and another folder having 5 files. Assuming I have a mechanism to select which files to send, what would be the best way to send them? Should I send them one by one and let the client figure out how to put them together, or zip all the files into a single file and send that zip file to the client? If there is any other way to implement this, I would be more than happy to look into that approach too. Thanks
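
    (The "bundle everything into one archive" option is only a few lines in most stacks; a sketch in Python with placeholder paths - in C# the same idea applies, building one zip server-side and returning it as a single payload:)

      import zipfile

      selected = ["folder_a/report.doc", "folder_a/data.csv",
                  "folder_b/notes.txt"]  # whatever the selection produces

      with zipfile.ZipFile("payload.zip", "w", zipfile.ZIP_DEFLATED) as bundle:
          for path in selected:
              bundle.write(path)  # each file stored under its relative path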

    Read the article

  • Is it possible to store only a checksum of a large file in git?

    - by Andrew Grimm
    I'm a bioinformatician currently extracting normal-sized sequences from genomic files. Some genomic files are large enough that I don't want to put them into the main git repository, whereas I'm putting the extracted sequences into git. Is it possible to tell git "Here's a large file - don't store the whole file, just take its checksum, and let me know if that file is missing or modified." If that's not possible, I guess I'll have to either git-ignore the large files, or, as suggested in this question, store them in a submodule.
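
    (One workaround along these lines - and roughly the idea behind tools like git-annex - is to git-ignore the large files and commit a small checksum manifest instead; git itself then won't notice changes, but a verification script will. A sketch with example paths:)

      import hashlib

      LARGE_FILES = ["genomes/hg19.fa", "genomes/mm10.fa"]  # examples

      def sha256_of(path, chunk=1 << 20):
          h = hashlib.sha256()
          with open(path, "rb") as f:
              for block in iter(lambda: f.read(chunk), b""):
                  h.update(block)
          return h.hexdigest()

      # Commit this manifest; re-run and diff it to detect missing or
      # modified genome files.
      with open("large-files.sha256", "w") as manifest:
          for path in LARGE_FILES:
              manifest.write("%s  %s\n" % (sha256_of(path), path))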

    Read the article

  • How to loop through all Illustrator files in a folder (CS6)

    - by Julian
    I have written some JavaScript to save .ai files to two separate locations with different resolutions, one of them cropped to a reduced-size artboard (courtesy of John Otterud / Articmill for the main part). There are other variables in the script that I am not using at present, but I want to leave the functionality there for a later date / additional layers to export / other resolutions etc. I can't get it to loop through all the files in a folder. I can't find a script that works - or the right place to insert it. I can get as far as selecting the folder, and I suppose creating an array, but after that what next? This is the create-array part of the script:

      // JavaScript Document
      // Set up variables
      var destDoc, sourceDoc, sourceFolder, newLayer;
      // Select the source folder.
      sourceFolder = Folder.selectDialog('Select the folder with Illustrator files that you want to merge into one', '~');
      destDoc = app.documents.add();
      // If a valid folder is selected
      if (sourceFolder != null) {
          files = new Array();
          // Get all files matching the pattern
          files = sourceFolder.getFiles();

    I have inserted this at the beginning of the main script (probably where I am going wrong, because I can select the folder but then nothing more):

      #target illustrator
      var docRef = app.activeDocument;
      with (docRef) {
          if (layers[i].name = 'HEADER') {
              layers[i].name = '#' + activeDocument.name;
              save()
          }
      }

      // *** Export Layers as PNG files (in multiple resolutions) ***
      var subFolderName = "For_PLMA";
      var subFolderTwoName = "For_VLP";
      var saveInMultipleResolutions = true;
      // ...
      // Note: only use one character!
      var exportLayersStartingWith = "%";
      var exportLayersWithArtboardClippingStartingWith = "#";
      // ...
      var normalResolutionFileAppend = "_VLP";
      var highResolutionFileAppend = "_PLMA";
      // ...
      var normalResolutionScale = 100;
      var highResolutionScale = 200;
      var veryhighResolutionScale = 300;

      // *** Start of script ***
      var doc = app.activeDocument;
      // Make sure we have saved the document
      if (doc.path != "") {

    Then the rest of the export script runs on from there.

    Read the article

  • Reading files provided via $_GET

    - by Max
    I have a PHP script which takes a relative pathname via $_GET, reads that file, and creates a thumbnail of it. I don't want the user to be able to read arbitrary files from the server. Only files from a certain directory should be allowed; otherwise the script should exit(). Here is my folder structure:

      files/     <-- all files from this folder are public
      my_stuff/  <-- this is the folder of my script that reads the files

    My script is accessed via mydomain.com/my_stuff/script.php?pathname=files/some.jpg. What should not be allowed, e.g.: mydomain.com/my_stuff/script.php?pathname=files/../db_login.php. So, here is the relevant part of the script in the my_stuff folder:

      ...
      $pathname = $_GET['pathname'];
      $pathname = realpath('../' . $_GET['pathname']);
      if (strpos($pathname, '/files/') === false) exit('Error');
      ...

    I am not really sure about this approach; it doesn't seem very safe to me. Anyone with a better idea?
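
    (A common hardening pattern, sketched below in Python for illustration - the directory names are placeholders, and PHP's realpath() plus an anchored prefix comparison has the same shape. Note that checking for '/files/' anywhere in the resolved path is weaker than requiring the path to start with the public directory, since any unrelated path that happens to contain a files/ component would pass.)

      import os

      # Resolve the user-supplied path, then require the result to sit
      # inside the one public directory. The trailing separator stops
      # e.g. "/srv/files_private" from passing a naive prefix test
      # against "/srv/files".
      ALLOWED_ROOT = os.path.realpath("../files")

      def safe_path(user_value):
          resolved = os.path.realpath(os.path.join("..", user_value))
          if not resolved.startswith(ALLOWED_ROOT + os.sep):
              raise ValueError("path outside the public directory")
          return resolved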

    Read the article

  • Coda 2 and SCP uploading files with the wrong permission

    - by Tom Black
    Currently I have a basic Ubuntu server running a website. The website is for a few students learning HTML/PHP, and each student has their own account with a symbolic link to the shared website folder. Since the students are working on the website together, each user needs to be able to modify all the files (index.html, for example). So I created a Webdev group containing all of the students, with a default umask of 0002 set in their .bashrc (this allows newly created files to be 774). The shared folder is owned by the group Webdev with chmod g+s, so that new files/folders also belong to the group Webdev. The problem is that the students are using an IDE (Coda 2), and when they create a new file or folder using the IDE, the file has permissions of 644 on the server (not group-writable). However, when I make a new file by connecting with Cyberduck (an SFTP client), the file permissions are 664 (as they should be). So I don't understand why Coda would be any different. After some trial and error, I believe that Coda first creates the file on the local disk and then uploads it to the server. On a Mac, a newly created file is 644 by default. When the client uploads a file that's already 644, it stays 644 on the server side (the umask is kind of useless in this situation). I've also tried creating ACL permissions for that folder, but a file uploaded from my Mac via SCP doesn't get the default ACL permissions. In Coda there is an option to change file permissions on a transfer. However, this option seems to apply a chmod to all files being uploaded or saved. When one of the students modifies a file created by someone else and tries to upload or save it, Coda also tries to do a chmod, but fails because that user isn't the owner of the file. My current solution is using bindfs... I mount the shared web folder and bindfs sets the permissions and group ownership of newly created files. However, bindfs seems to be a bit slow, and I'm sure there is a better solution. Even if the students ditched Coda 2 and used Mac vim with scp, the newly created files on the server would behave the same (644), which is the default on the Mac. Other options... 1) I teach the students to use ssh/chmod with their IDE to change their own file permissions when uploading. 2) I make all the students' Macs have a default umask of 0002, which would upload files with the right permissions. 3) Write a cron script to fix the file permissions every 5 to 15 minutes... (I think this option is the worst if students are working together at the same time.) Is there any way that I could make all files that are uploaded via SCP have default file permissions of 664, even though the uploaded file has a lower permission? (After hours of searching I don't think this is possible.) I guess a cron script is my best option for novice users. How do web developers work together on larger sites? Similar to this: http://serverfault.com/questions/283492/how-to-specify-file-permission-when-putting-a-file-using-openssh-sftp-command Also similar: http://serverfault.com/questions/395418/managing-linux-directory-permissions-sftp

    Read the article

  • Linux: prevent VNC from swapping like mad

    - by Weezy
    I'm accessing a Mac Mini (with Mac OS X 10.4) from my Linux machine using VNC, and there's an issue that is driving me crazy... My Linux machine has 4 GB of RAM and I run a lot of different apps on it with no issue at all: it's all snappy, and I don't hear the hard disk swapping/reading/writing too often. With VNC, however, the hard disk swaps like mad when I'm moving things around on the OS X desktop. So I was thinking of creating a ramdisk and forcing the temp VNC files to go into that ramdisk, but the problem is I can't find any temp files. I've attempted this:

      #!/bin/bash
      while [ true ]
      do
          lsof | grep vnc
      done

    and eyeball-parsed the output to try to find some temp file: no luck. The VNC version I'm using is this one:

      $ vncviewer -version
      VNC Viewer Free Edition 4.1.1 for X - built Jan 30 2009 19:33:16
      Copyright (C) 2002-2005 RealVNC Ltd.

    No matter how much data is coming from the Mac, there should be plenty of memory (4 GB of RAM), so there's really no reason to swap like crazy. This is driving me mad. Any help as to how I could solve this problem is most welcome, because this is literally driving me nuts.

    Read the article
