Search Results

Search found 72103 results on 2885 pages for 'file storage'.

Page 67 of 2885

  • How many disks should I use to meet the capacity and IOPS needs?

    - by facebook-100005613813158
    An application needs 1.6 TB of storage capacity and performs 1000 IOPS. How many disks are required to meet the application requirement and offer acceptable response time? The disk specifications are as follows: drive capacity = 100 GB, 15K RPM, and each disk can perform 50 IOPS. There are four candidate answers: 10, 12, 16, and 20. Which one is most likely correct? In my opinion, 16 disks can only meet the capacity need but cannot meet the IOPS need, so the right answer should be 20 disks. Right?
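    A worked version of that arithmetic, as a minimal Python sketch (assuming decimal units, so 1.6 TB = 1,600 GB): the required disk count is driven by whichever constraint, capacity or IOPS, needs more spindles.

        import math

        required_capacity_gb = 1600   # 1.6 TB, decimal units assumed
        required_iops = 1000
        disk_capacity_gb = 100
        disk_iops = 50

        disks_for_capacity = math.ceil(required_capacity_gb / disk_capacity_gb)  # 16
        disks_for_iops = math.ceil(required_iops / disk_iops)                    # 20

        # Both constraints must be satisfied, so take the larger count.
        print(max(disks_for_capacity, disks_for_iops))  # 20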

    Read the article

  • jungledisk fails with libnotify error

    - by Angelo
    Has anyone had success getting the jungledisk application to work under Ubuntu? I installed it from the .deb file provided by jungledisk. The install goes fine, but I can't get the "jungle disk desktop" app to launch. It appears in the dash search bar, but doesn't launch or do anything upon selecting it. When I try the command line, I get the following:
        $ jungledisk -V -f
        Verbose mode enabled
        Shutting down...
        $
    I get something more interesting with the following command, something about libnotify.so:
        $ junglediskdesktop -V -f
        junglediskdesktop: error while loading shared libraries: libnotify.so.1: cannot open shared object file: No such file or directory
    Does anyone have suggestions for what to try?

    Read the article

  • Efficient way to check for changes to the contents of folders

    - by MrVimes
    I am creating an application that maintains a database of files of a certain type in a given folder (and all subfolders). Initially the program will recurse through the folders and add any file it finds of that type to the database. I want the application to have the ability to re-scan the folder and add any files that were not there the last time the folders were scanned. It can't use the date-created property of the file, because there is a high chance of a file being added to the folders that isn't a new file. I am wondering what the most efficient way of doing this is, and whether there is a way that doesn't involve checking that each file is already in the database (which, if there are 5000 files, would mean 5000 queries against a list 5000 items in size, or 25 million 'checks' for the SQL engine to perform). I suppose a more specific question to achieve the same goal would be: is there a property of a file (in Microsoft Windows) that will reliably tell you when that file arrived in that folder?
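    One common pattern, sketched below in Python (an assumption: the already-indexed paths can be pulled from the database in a single query, and the file type is a placeholder extension): load the known paths into an in-memory set, walk the folder once, and collect anything not in the set. That replaces thousands of per-file queries with one O(1) set lookup per file.

        import os

        def find_new_files(root, known_paths, extension=".mp3"):
            """Return files of the given type under root that are not in known_paths.

            known_paths is a set built from one SELECT against the database,
            so each file costs only a set-membership test instead of a query.
            """
            new_files = []
            for dirpath, dirnames, filenames in os.walk(root):
                for name in filenames:
                    if name.lower().endswith(extension):
                        full_path = os.path.join(dirpath, name)
                        if full_path not in known_paths:
                            new_files.append(full_path)
            return new_files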

    Read the article

  • Photo sharing social platform [closed]

    - by user1696497
    I am working on a photo sharing social platform like Flickr or Photobucket. To start off with, I have half a million photos as of now. I want to convert all of these into a single format and compression ratio and use that as the original image. I will be storing the original image, a resized image according to layout, and a thumbnail. I started off with Ruby but didn't find supporting libraries. I am considering Python, as it has a good image processing library and Instagram is using it. I want some advice about how the images should be processed while uploading, the most efficient way of storing them (database or file system), image compression, and precautions to be taken. I will also have profile pictures; do I need to store them separately or along with the other images? If I store the images on a file system, which file system should I use, and should I store the URL or use an intermediate key-value store like Redis?
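    On the processing side, a minimal sketch of the upload step, assuming Python with Pillow (library choice, sizes, quality values and paths are all placeholders): normalise every upload to JPEG at a fixed quality, then derive the layout-sized copy and the thumbnail from the stored original.

        from PIL import Image

        def process_upload(src_path, dest_dir, photo_id):
            img = Image.open(src_path).convert("RGB")   # normalise mode for JPEG output
            # store the "original" in a single format / compression level
            img.save(f"{dest_dir}/{photo_id}_orig.jpg", "JPEG", quality=90)

            # layout-sized copy (1024 px long side is a placeholder value)
            resized = img.copy()
            resized.thumbnail((1024, 1024))
            resized.save(f"{dest_dir}/{photo_id}_display.jpg", "JPEG", quality=85)

            # thumbnail
            thumb = img.copy()
            thumb.thumbnail((200, 200))
            thumb.save(f"{dest_dir}/{photo_id}_thumb.jpg", "JPEG", quality=80)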

    Read the article

  • Best way to rename existing unique file names in database?

    - by Rajdeep Siddhapura
    I have a database table that contains id, filename, and userId. id is a unique identifier and filename should also be unique; the table may contain 10000 records. When a user uploads a file, it should be entered in the database with the following rules: if there is no record with the same filename, it should be added as-is (e.g. foobar.pdf); if there is a record with the same filename, it should be added as uploadedName(2).ext (foobar(2).pdf); if there are n records with the same base filename (foobar), it should be added as uploadedName(n+1).ext (e.g. foobar(20).pdf); and if foobar(2).pdf is now uploaded, it should be added as foobar(2)(2).pdf, and so on. This pattern needs to be followed because the file is already being uploaded on the client side using Ajax before the details are sent to the server, and the file hosting service follows the above rules to name the files. My solution: maintain a file that contains all the names and the number of times each has occurred; if a filename that already exists in that file is entered, increase its occurrence count and generate a new name, otherwise add it to the file; and if the generated name is already in the database, add it to the file as well and generate another new name.
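    A minimal in-memory sketch of that naming rule in Python (next_filename is a hypothetical helper; the asker's plan keeps the counts in a file or table rather than a dict, but the logic is the same). The full uploaded stem, including any existing "(n)", is treated as the base, which is what produces the foobar(2)(2).pdf behaviour.

        import os

        name_counts = {}  # base stem -> number of times it has been seen

        def next_filename(uploaded_name):
            stem, ext = os.path.splitext(uploaded_name)
            count = name_counts.get(stem, 0) + 1
            name_counts[stem] = count
            if count == 1:
                return uploaded_name
            new_name = "%s(%d)%s" % (stem, count, ext)
            # register the generated stem as well, so uploading exactly that
            # name later continues the pattern: foobar(2) -> foobar(2)(2)
            new_stem, _ = os.path.splitext(new_name)
            name_counts.setdefault(new_stem, 1)
            return new_name

        print(next_filename("foobar.pdf"))      # foobar.pdf
        print(next_filename("foobar.pdf"))      # foobar(2).pdf
        print(next_filename("foobar(2).pdf"))   # foobar(2)(2).pdf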

    Read the article

  • SFTP jail & Keeping file ownership the same / File owner per folder

    - by Dragonshadow
    I want to set up a jailed SFTP account for a subfolder of another user's home folder, but I want the owner of everything in that subfolder to stay the same, including new files and folders uploaded or created by the SFTP user, while still giving the SFTP user access to the files and folders of that subfolder as if it were the parent user. The users are rawny and bawb-sftp:
        /home/rawny       <- rawny owns this
        /home/rawny/sftp  <- rawny owns this too, but bawb-sftp can upload to it, edit files, etc.
    If bawb-sftp uploads a file /home/rawny/sftp/lol.txt, rawny should still own the file, as if he had made it in the first place, even though bawb-sftp was the one that uploaded it. Basically I guess I'm asking for an SFTP jail that acts as a highly limited passthrough/puppet for another user?

    Read the article

  • I started getting a weird message "Encrypting file system - Back up your file encryption key"

    - by user22559
    Hello. I started getting a strange message when I start my computer: an icon appears in the system tray, and a popup tells me "Encrypting file system - Back up your file encryption key". I know what EFS is, but I don't use it. To my knowledge, I don't have any encrypted files on my partition. I have searched with Total Commander on all the partitions for files that have the "encrypted" attribute, but I found nothing, so I don't have any encrypted files. Does anyone know what I did to get this message?

    Read the article

  • Pysdm has disabled my ability to write to my storage partition

    - by Atlas
    I have a dual-boot setup with Windows 7 and Mint 13 Cinnamon. As well as their respective partitions, I also have a large NTFS partition for storing all my music, videos, documents, etc. I downloaded pysdm as I was told it would let me configure Linux to auto-mount my storage partition. It has indeed been helpful in auto-mounting my storage. However, since installing it I can no longer write to the partition, which makes 500 GB of my hard drive utterly useless! I've tried to deselect the "Mount file system in read only mode" option, but the program keeps re-checking it after I close that window (and even when I click Apply). Why is it doing this, and how can I get it to recognise that I need to read AND write on that partition?

    Read the article

  • merge values from Excel file into .html file opened in Word 2007

    - by Kelbizzle
    I have this newsletter I've written. I want to be able to use the values in the rows and somehow have them merged into this HTML file I have open in Word, sort of like a mail merge. In the newsletter I have 3 URLs that look like: www.mydomain.com/php?id= I want to be able to replace all of the URLs for all 230 records in the Excel file with something like: www.mydomain.com/?id=$id where $id would be replaced with the id of the record. The same goes for the rest of the fields, like $firstname, $lastname, $email, $phone number. Is there a simple way to do this?
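    As one option outside Word, a minimal Python sketch of the merge (assumptions: the sheet has been saved as CSV, the file names and column names shown are placeholders, and the placeholders in the HTML are written as $id, $firstname, and so on): read each record, substitute it into the newsletter template, and write one personalised file per row.

        import csv
        from string import Template

        with open("newsletter.html", encoding="utf-8") as f:
            template = Template(f.read())

        with open("records.csv", newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):   # expects columns such as id, firstname, lastname, email, phone
                html = template.safe_substitute(row)
                with open(f"newsletter_{row['id']}.html", "w", encoding="utf-8") as out:
                    out.write(html)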

    Read the article

  • Ubuntu 12.04 Samba File Server timeout on large file

    - by phileaton
    I am a beginner with servers. I checked the error logs for Samba and it appears that Samba is timing out when I transfer large files. I can successfully add PDFs, for instance, to my file server. However, I tried to add a large 1.2 GB video file and it did not succeed. This is the error in the log:
        smbd/process.c:244(read_packet_remainder) read_fd_with_timeout failed for client 0.0.0.0 read error = NT_STATUS_CONNECT$
    Is there a way I can stop it from timing out? Any pointers would be great! Thanks!

    Read the article

  • Mountable online storage (no syncing)

    - by Sam
    I have a Linux VPS that I would like to turn into a media server. Like most cheap VPSes, it has a fairly small storage capacity. What I would like to do is attach the box to an online backup system such as SpiderOak or Dropbox, where the files would reside and be directly accessible to either a web server or media server software. Since the VPS disk is small, I do not want the files to be synced to it; I would like a storage system that is online only, ideally mountable like a network drive. Are there any services that suit my needs, or workarounds for services such as SpiderOak that do not require syncing?

    Read the article

  • Bacula Director and Storage in LAN

    - by B14D3
    I have two networks, LAN and DMZ. Machines in the DMZ are accessible from the internet (only over HTTP). In the LAN I have servers that see all LAN and all DMZ machines, but machines in the DMZ don't see any LAN servers. Machines in the LAN have access to all of the LAN and the DMZ, with no direct access to the internet and no access from the internet.
        DMZ <------ LAN
        DMZ ----X---> LAN
    I'm planning to configure Bacula as the main backup system. My plan is to install the Bacula Director and Storage daemon on the same server in the LAN, for safety reasons. So my question is: will this configuration work? Is it possible for a Bacula Director and Storage daemon installed on a server in the LAN to back up servers that are in my DMZ? Or should Bacula be in the DMZ in this network configuration? (And if so, will I be able to back up servers in the LAN with it?)

    Read the article

  • Have both domain and non-domain users use NetApp CIFS storage

    - by zladuric
    My specific use case is that I have a NetApp CIFS storage system that is in the domain, say, intranet. But I also have one Hyper-V host that's not in the domain. I can't allow it into the domain, but I need to create guests whose VHD is on the storage. How can I achieve this? When I simply connect to a CIFS share and create a VHD there, it works, but when I try to add that VHD to the virtual machine, I get a "Failed to set folder permissions" error - probably because the folder is owned by the domain admin. Is there a way around this, so that both domain and non-domain users can use the same directory?

    Read the article

  • Intel Rapid Storage Technology service always crashes

    - by Massimo
    I'm running Windows 7 x64 on a system based on an Asus Z87-Deluxe motherboard; the storage is configured for RAID mode; there is a single SSD drive for the OS and two 4-TB disks in a RAID 1 setup for the data. I've installed the latest version of Intel's Rapid Storage Technology drivers, 12.8.0.1016. The program complains about its service not running, and the service is indeed stopped; if I try to start it, it crashes. I've already tried reinstalling the package, but nothing changed. All the disks work correctly, but the RST program is unusable. How can I fix this?

    Read the article

  • Is there a way to force/manage file locking in Windows?

    - by JPbuntu
    I have two networked Windows machines and I am having trouble with simultaneous access to files. I would like only one user to be able to open a file at a time, which I thought was automatic through file locks - if the program used to access the file actually locks the file. I believe the problem is that some of the programs I use don't lock the file, which can therefore be modified simultaneously by multiple users, which is very much not desired. Currently I am having this problem with only two computers, although as soon as I can figure out a solution the network is going to be expanded to 6 computers, which will include Windows 7, Vista, and XP, as well as a central file server (Samba). Is there a way to ensure that all files opened in Windows get locked? Any suggestions are appreciated, thanks.
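    For background, a lock is something each program has to request when it opens the file, which is why applications that skip that step allow simultaneous edits. A minimal Windows-only sketch of what that request looks like, using Python's standard msvcrt module (the path is hypothetical):

        import msvcrt
        import os

        path = r"\\server\share\report.dat"   # hypothetical file on the shared drive
        f = open(path, "r+b")
        size = max(os.path.getsize(path), 1)

        f.seek(0)
        # LK_NBLCK raises OSError immediately if another process already holds the lock
        msvcrt.locking(f.fileno(), msvcrt.LK_NBLCK, size)
        try:
            pass  # read / modify the file while holding the lock
        finally:
            f.seek(0)
            msvcrt.locking(f.fileno(), msvcrt.LK_UNLCK, size)
            f.close()

    The practical implication for the expanded network is that whole-file exclusivity generally has to come either from applications that take locks themselves or from server-side settings on the file server.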

    Read the article

  • replaceAll() method using parameter from text file

    - by Herman Plani Ginting
    I have a collection of raw text in a database table, and I need to replace some words in this collection using a set of substitutions. I put all the terms to be replaced and their substitutes in a text file, one per line, like this:
        min=admin
        lelet=lambat
        lemot=lambat
        nii=nih
        ntu=itu
    and so on. I have successfully initialised a File and a Scanner to read the collection of terms and their substitutes. I loop over the dataset and save the raw text in a string; in the same loop I loop over the term collection, save each row to a string named 'pattern', and split the pattern into two strings named 'term' and 'replacer'; in this inner loop I initialise a new string whose value is the dataset string modified by replaceAll(term, replacer); end of the loop for the term collection. Then I insert the new string into another table in the database; end of the loop for the dataset. Doing it manually, as in replaceAll("min","admin"), works, but it is really something to code it by hand for almost 2000 terms to be replaced. Has anyone ever faced this kind of thing? I really need help now - desperate :(
        package sentimenrepo;

        import javax.swing.*;
        import java.sql.*;
        import java.io.*;
        //import java.util.HashMap;
        import java.util.Scanner;
        //import java.util.Map;

        /**
         *
         * @author herman
         */
        public class synonimReplaceV2 extends SwingWorker {

            protected Object doInBackground() throws Exception {
                new skripsisentimen.sentimenttwitter().setVisible(true);
                Integer row = 0;
                File synonimV2 = new File("synV2/catatan_kata_sinonim.txt");
                String newTweet = "";
                DB db = new DB();
                Connection conn = db.dbConnect("jdbc:mysql://localhost:3306/tweet", "root", "");
                try{
                    Statement select = conn.createStatement();
                    select.executeQuery("select * from synonimtweet");
                    ResultSet RS = select.getResultSet();
                    Scanner scSynV2 = new Scanner(synonimV2);
                    while(RS.next()){
                        row++;
                        String no = RS.getString("no");
                        String tweet = " "+ RS.getString("tweet");
                        String published = RS.getString("published");
                        String label = RS.getString("label");
                        clean2 cleanv2 = new clean2();
                        newTweet = cleanv2.cleanTweet(tweet);
                        try{
                            Statement insert = conn.createStatement();
                            insert.executeUpdate("INSERT INTO synonimtweet_v2(no,tweet,published,label) values('"
                                    +no+"','"+newTweet+"','"+published+"','"+label+"')");
                            String current = skripsisentimen.sentimenttwitter.txtAreaResult.getText();
                            skripsisentimen.sentimenttwitter.txtAreaResult.setText(current+"\n"+row+"original : "+tweet+"\n"+newTweet+"\n______________________\n");
                            skripsisentimen.sentimenttwitter.lblStat.setText(row+" tweet read");
                            skripsisentimen.sentimenttwitter.txtAreaResult.setCaretPosition(skripsisentimen.sentimenttwitter.txtAreaResult.getText().length() - 1);
                        }catch(Exception e){
                            skripsisentimen.sentimenttwitter.lblStat.setText(e.getMessage());
                        }
                        skripsisentimen.sentimenttwitter.lblStat.setText(e.getMessage());
                    }
                }catch(Exception e){
                    skripsisentimen.sentimenttwitter.lblStat.setText(e.getMessage());
                }
                return row;
            }

            class clean2{
                public clean2(){}

                public String cleanTweet(String tweet){
                    File synonimV2 = new File("synV2/catatan_kata_sinonim.txt");
                    String pattern = "";
                    String term = "";
                    String replacer = "";
                    String newTweet="";
                    try{
                        Scanner scSynV2 = new Scanner(synonimV2);
                        while(scSynV2.hasNext()){
                            pattern = scSynV2.next();
                            term = pattern.split("=")[0];
                            replacer = pattern.split("=")[1];
                            newTweet = tweet.replace(term, replacer);
                        }
                    }catch(Exception e){
                        e.printStackTrace();
                    }
                    System.out.println(newTweet+"\n"+tweet);
                    return newTweet;
                }
            }
        }

    Read the article

  • How to handle editing a large file for a non-technical user

    - by Luke
    I have a client who is given a tab-delimited .txt file containing hundreds of thousands of rows. I have a user story as follows: as a user, I want to take the text file and add a new value at the end of each line which contains the concatenated value of two of the columns. For example, if the file reads
        text_one    text_two
    I need to output the following (preferably to a .txt file):
        text_one    text_two    text_onetext_two
    My first approach was to ask the vendor supplying the file to do the concatenation before providing the file; the easiest way to solve a problem is to eliminate it, right? However, they are very uncooperative and have point-blank refused. I've looked at building a simple JavaScript application that does this client-side, so a non-technical user could select the file using a file selector. This approach has a few problems: the file could be over a GB in size and so can't be loaded straight into memory (I've tried, and the browser crashes), and there is no means to write a file in JavaScript, so I'd need to output the content to the screen and have the user save it (somehow). I was thinking that if I could get around the file-size limitations I could just output the edited content to the page and have the user save the page as a .txt file; however, I think there is a better way than using JavaScript that will still accommodate the user's lack of technical know-how. Please consider this question to be stack agnostic, but bear in mind that a nice little shell script or Python script would be deemed unsuitable for a non-technical user unless there is a way of "packaging" it nicely for them. Updates: the file is too large to open in Excel; the process needs to be run weekly, but it doesn't require scheduling or automation... (yet)
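    For what it's worth, the transformation itself is a few lines of streaming code. A minimal Python sketch (file names and the column positions 0 and 1 are placeholders) that reads and writes line by line, so a multi-GB file never has to fit in memory; packaging it as a one-file executable for a non-technical user is a separate step (PyInstaller is one option).

        import csv

        with open("input.txt", newline="", encoding="utf-8") as src, \
             open("output.txt", "w", newline="", encoding="utf-8") as dst:
            reader = csv.reader(src, delimiter="\t")
            writer = csv.writer(dst, delimiter="\t")
            for row in reader:
                row.append(row[0] + row[1])  # concatenation of the two chosen columns, appended at the end
                writer.writerow(row)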

    Read the article

  • File Adapter FileName Macros

    - by IntegrationOverload
    I can never find these when I need them... Macro name and its substitute value:
        %datetime% : Coordinated Universal Time (UTC) date time in the format YYYY-MM-DDThhmmss (for example, 1997-07-12T103508).
        %datetime_bts2000% : UTC date time in the format YYYYMMDDhhmmsss, where sss means seconds and milliseconds (for example, 199707121035234 means 1997/07/12, 10:35:23 and 400 milliseconds).
        %datetime.tz% : Local date time plus time zone from GMT in the format YYYY-MM-DDThhmmssTZD (for example, 1997-07-12T103508+800).
        %DestinationParty% : Name of the destination party. The value comes from the message context property BTS.DestinationParty.
        %DestinationPartyQualifier% : Qualifier of the destination party. The value comes from the message context property BTS.DestinationPartyQualifier.
        %MessageID% : Globally unique identifier (GUID) of the message in BizTalk Server. The value comes directly from the message context property BTS.MessageID.
        %SourceFileName% : Name of the file from where the File adapter read the message. The file name includes the extension and excludes the file path, for example, Sample.xml. When substituting this property, the File adapter extracts the file name from the absolute file path stored in the FILE.ReceivedFileName context property. If the context property does not have a value, for example, if a message was received on an adapter other than the File adapter, the macro will not be substituted and will remain in the file name as is (for example, C:\Drop\%SourceFileName%).
        %SourceParty% : Name of the source party from which the File adapter received the message.
        %SourcePartyQualifier% : Qualifier of the source party from which the File adapter received the message.
        %time% : UTC time in the format hhmmss.
        %time.tz% : Local time plus time zone from GMT in the format hhmmssTZD (for example, 124525+530).

    Read the article

  • Beginner - C# iteration through directory to produce a file list

    - by dassouki
    The end goal is to have some form of data structure that stores the hierarchical structure of a directory, to be written to a txt file. I'm using the following code so far, and I'm struggling with combining dirs, subdirs, and files.
        /// <summary>
        /// code based on http://msdn.microsoft.com/en-us/library/bb513869.aspx
        /// </summary>
        /// <param name="strFolder"></param>
        public static void TraverseTree ( string strFolder )
        {
            // Data structure to hold names of subfolders to be
            // examined for files.
            Stack<string> dirs = new Stack<string>( 20 );

            if ( !System.IO.Directory.Exists( strFolder ) )
            {
                throw new ArgumentException();
            }
            dirs.Push( strFolder );

            while ( dirs.Count > 0 )
            {
                string currentDir = dirs.Pop();
                string[] subDirs;
                try
                {
                    subDirs = System.IO.Directory.GetDirectories( currentDir );
                }
                catch ( UnauthorizedAccessException e )
                {
                    MessageBox.Show( "Error: " + e.Message );
                    continue;
                }
                catch ( System.IO.DirectoryNotFoundException e )
                {
                    MessageBox.Show( "Error: " + e.Message );
                    continue;
                }

                string[] files = null;
                try
                {
                    files = System.IO.Directory.GetFiles( currentDir );
                }
                catch ( UnauthorizedAccessException e )
                {
                    MessageBox.Show( "Error: " + e.Message );
                    continue;
                }
                catch ( System.IO.DirectoryNotFoundException e )
                {
                    MessageBox.Show( "Error: " + e.Message );
                    continue;
                }

                // Perform the required action on each file here.
                // Modify this block to perform your required task.
                /*
                foreach ( string file in files )
                {
                    try
                    {
                        // Perform whatever action is required in your scenario.
                        System.IO.FileInfo fi = new System.IO.FileInfo( file );
                        Console.WriteLine( "{0}: {1}, {2}", fi.Name, fi.Length, fi.CreationTime );
                    }
                    catch ( System.IO.FileNotFoundException e )
                    {
                        // If file was deleted by a separate application
                        // or thread since the call to TraverseTree()
                        // then just continue.
                        MessageBox.Show( "Error: " + e.Message );
                        continue;
                    }
                }
                */

                // Push the subdirectories onto the stack for traversal.
                // This could also be done before handing the files.
                foreach ( string str in subDirs )
                    dirs.Push( str );
                foreach ( string str in files )
                    MessageBox.Show( str );
            }
        }

    Read the article

  • load Javascript object from file

    - by megapool020
    Hi there, I asked a question in this thread on Stackoverflow, and it works perfectly, so thanks to all the users who gave me a reply. But now I have another question. I would like to have the object in a separate file, so I only need to update that file instead of the JS file (otherwise it will get very big). I'm using jQuery. It now looks like this (with all the info in the JS file); IBANInfo is used to fill a select box:
        var IBANInfo = {
            "ccAL":{ countryCode:"AL", countryName:"Albani&euml;", IBANlength:"28", bankFormCode:"0 8n 0 ", accountNum:"0 16 0 " },
            "ccAD":{ countryCode:"AD", countryName:"Andorra", IBANlength:"24", bankFormCode:"0 4n 4n", accountNum:"0 12 0 " },
            "ccBE":{ countryCode:"BE", countryName:"Belgi&euml;", IBANlength:"16", bankFormCode:"0 3n 0 ", accountNum:"0 7n 2n" }
        };

        //---- then this function is used to build the selectList
        function createSelect(){
            var selectList = '';
            var key;
            selectList += "<option value=''>Kies een land</option>\n";
            for (key in IBANInfo) {
                if (IBANInfo.hasOwnProperty(key)) {
                    var countryInfo = IBANInfo[key];
                    selectList += "<option value='"+countryInfo.countryCode+"'>"+countryInfo.countryName+"</option>\n";
                }
            }
            $('#selectBox').html(selectList);
        }
    I thought I could do it like this, but I get the message "undefined" in my select box:
        var IBANInfo = $.get('include/countryCodes.txt');
        // also tried: var IBANInfo = $.getJSON('include/countryCodes.txt');

        //---- then this function is used to build the selectList
        function createSelect(){
            var selectList = '';
            var key;
            selectList += "<option value=''>Kies een land</option>\n";
            for (key in IBANInfo) {
                if (IBANInfo.hasOwnProperty(key)) {
                    var countryInfo = IBANInfo[key];
                    selectList += "<option value='"+countryInfo.countryCode+"'>"+countryInfo.countryName+"</option>\n";
                }
            }
            $('#selectBox').html(selectList);
        }

        /* the countryCodes.txt file is like this:
        {
            "ccAL":{ countryCode:"AL", countryName:"Albani&euml;", IBANlength:"28", bankFormCode:"0 8n 0 ", accountNum:"0 16 0 " },
            "ccAD":{ countryCode:"AD", countryName:"Andorra", IBANlength:"24", bankFormCode:"0 4n 4n", accountNum:"0 12 0 " },
            "ccBE":{ countryCode:"BE", countryName:"Belgi&euml;", IBANlength:"16", bankFormCode:"0 3n 0 ", accountNum:"0 7n 2n" }
        }
        */
    What am I doing wrong? Thanks in advance.

    Read the article

  • File upload using HTML file type.

    - by vaibhav
    I want to upload a file on my aspx page. I am using
        <form id="frmId" method="post" enctype="Multipart/form-data">
            <input type="file" id="file1"/>
            <input type="submit" id="btnsubmit"/>
        </form>
    and in the code-behind I am trying to get this file. It's not letting me get the file unless I use a server-side input file control. I don't want to use the runat="server" attribute with my file control. Does anyone know how to do this?

    Read the article

  • Rename PHP Downloaded File in File Downloader/Accelerator Applications

    - by Joe
    I have the following "download" script in PHP which basically makes an address on my website download the file for the user, e.g. mysite.com/download.php?fileid=10. The question I have is: how can I send the file name to the user when the .php address is downloaded with a file downloader/accelerator application? With "Content-Disposition" as below, such downloaders call the file download.php, whereas I want it to be renamed to $downloadFileName as usual.
        // Set headers
        header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
        header("Content-Description: File Transfer");
        header("Content-Type: application/force-download");
        header("Content-Disposition: attachment; filename=\"".$downloadFileName."\"");
        header("Content-Transfer-Encoding: binary");
        // Read the file from disk
        readfile($downloadLocation);

    Read the article

  • How do I automatically delete an Excel file after creating it on a server and returning it to the user?

    - by David A Gibson
    Hello, I am creating an Excel file on a web server, using OleDb to connect to the physical (well, as physical as it can be) file and append records. I am then returning a FilePathResult to the user via MVC, and would like to delete the physical file afterwards due to data protection concerns over the appended records. I have tried using File.Delete in a finally clause, but I get a File Not Found error, which must mean the file has gone by the time MVC tries to send it to the user. I thought about creating the file as a MemoryStream, but I think OleDb needs a physical file to connect to, so this isn't an option. Any suggestions on how to delete the file after returning it, in one operation? Thanks

    Read the article
