Search Results

Search found 74849 results on 2994 pages for 'file folder'.


  • I have a new file type that I would like to handle in GNOME: how do I add a tab to the file properties dialog?

    - by Mark
    I have a new file type that I would like to handle in GNOME. Establishing a new MIME type, a new thumbnailer and a new application to display the file type is done, but I also need a new tab on the file properties page, analogous to the tab that shows EXIF information for JPG files or encoding information (such as duration) for video files. The files concerned are embroidery files, and the properties to be displayed are things like the physical dimensions of the design, how much thread will be used and how many colours. My belief is that with current GNOME 3 this is not possible; am I right? Or should I take the wider view that in Ubuntu anything is possible, it just may be a bit difficult?
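
    One possible route, offered only as a sketch and not as a confirmed answer, is a nautilus-python property-page extension for the file manager (it assumes the nautilus-python package is installed; the MIME type, labels and values below are placeholders, and a real version would parse the embroidery file):

        import gi
        gi.require_version('Nautilus', '3.0')
        gi.require_version('Gtk', '3.0')
        from gi.repository import Nautilus, GObject, Gtk

        # Drop this file into ~/.local/share/nautilus-python/extensions/
        class EmbroideryPropertyPage(GObject.GObject, Nautilus.PropertyPageProvider):
            def get_property_pages(self, files):
                # Only offer the tab for the (hypothetical) embroidery MIME type
                if len(files) != 1 or files[0].get_mime_type() != 'application/x-embroidery':
                    return []
                grid = Gtk.Grid(margin=12, row_spacing=6, column_spacing=12)
                # A real extension would read these values from the design file
                grid.attach(Gtk.Label(label='Design size:'), 0, 0, 1, 1)
                grid.attach(Gtk.Label(label='100 x 80 mm'), 1, 0, 1, 1)
                grid.show_all()
                return [Nautilus.PropertyPage(name='NautilusPython::embroidery',
                                              label=Gtk.Label(label='Embroidery'),
                                              page=grid)]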

    Read the article

  • Ubuntu 12.04 tilts when trying to open large excel file with libreoffice or matlab

    - by user1565754
    I have an xlsx file of 27.3 MB, and when I try to open it in either LibreOffice or MATLAB the whole system slows down. My processor is an AMD Sempron(tm) 140 (about 2.7 GHz) and I have about 1.7 GB of memory. Any ideas? I opened this file in Windows with no problem; of course it took a few seconds to load, but Ubuntu freezes completely with this file, while smaller files of 3 MB, 5 MB etc. open just fine. Thanks for the support =)

    Read the article

  • How to execute a "name.desktop" file? [duplicate]

    - by Pubudug
    This question already has an answer here: Running a .desktop file in the terminal (10 answers)

        #!/usr/bin/env xdg-open
        [Desktop Entry]
        Version=1.0
        Type=Link
        Name=ShareFolder
        Icon=/usr/share/icons/DPL/NetworkShare.png
        Name[en_US]=ShareFolder
        URL=smb://servername/sharefolder

    This is my .desktop file, which has a URL. How do I execute this desktop shortcut in the terminal? If I double-click it, it works perfectly, but I need to launch it from a terminal. I tried the approach from "Running a .desktop file in the terminal"; that didn't work for me either, although it does work for an "application" shortcut. What I am trying to execute here is a "link" .desktop file, i.e. one with Type=Link and URL=smb://servername/sharefolder.
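
    Tools such as gtk-launch are geared toward Type=Application entries, so one workaround for a Type=Link entry is to pull the URL out of the file yourself and hand it to xdg-open. A rough sketch in Python (the file name is just an example):

        import configparser
        import subprocess
        import sys

        desktop_file = sys.argv[1] if len(sys.argv) > 1 else 'ShareFolder.desktop'
        parser = configparser.ConfigParser(interpolation=None)
        parser.read(desktop_file)
        url = parser['Desktop Entry']['URL']           # e.g. smb://servername/sharefolder
        subprocess.run(['xdg-open', url], check=True)  # the file manager opens the share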

    Read the article

  • VCS for single user using file sync service

    - by StackUnder
    I'm trying to set up version control for my one-man project. My project files are kept in sync by Live Mesh (but I could just as well be using Dropbox) between my laptop, my home PC and my office PC. I'm currently using NetBeans with local file history, which sometimes helps to revert a single file to a previous state. But imagine a situation where multiple files have problems: correct me if I'm wrong, but I would have to go to every file and revert it to a previous "safe" state. I don't like this approach, so I'm considering a version control system, either SVN or Git. I have some previous experience with SVN (TortoiseSVN) and I know that I can create a file:// repository. So, what I want to do is set up a VCS inside my synced folder just to have the ability to "revert" to a previous version if something goes wrong. Since everything is synced to all computers, I would never need to run an update. The file tree organization would be the following: C:...\SyncedFolder\MyProject\ Inside the MyProject folder are all the project files plus a directory holding the SVN or Git information for my project (the repo/master). Which VCS is best for this situation, SVN or Git? Does SVN need to store all files from the HEAD revision, thus "duplicating" my whole project inside my synced folder? Does Git eliminate this problem? Is this the best approach?

    Read the article

  • implementing a download manager that supports resuming

    - by Idan K
    Hi, I intend to write a small download manager in C++ that supports resuming (and multiple connections per download). From the information I have gathered so far, when sending the HTTP request I need to add a header field with the key "Range" and the value "bytes=startoff-endoff"; the server then returns an HTTP response with the data between those offsets. So roughly what I have in mind is to split the file into the number of allowed connections and send one HTTP request per part with the appropriate "Range". For example, with a 4 MB file and 4 allowed connections, I'd split the file into 4 parts and have 4 HTTP requests going, each with its own "Range" field. Implementing the resume feature would involve remembering which offsets have already been downloaded and simply not requesting those. Is this the right way to do it? What if the web server doesn't support resuming? (My guess is it will ignore the "Range" header and just send the entire file.) When sending the HTTP requests, should I ask for the entire size of each part in the range, or request smaller pieces, say 1024 KB per request? When reading the data, should I write it immediately to the file or do some kind of buffering? I guess it could be wasteful to write small chunks. Should I use a memory-mapped file? If I remember correctly, those are recommended for frequent reads rather than writes (I could be wrong); is it memory-wise? What if I have several downloads running simultaneously? If I'm not using a memory-mapped file, should I open the file once per allowed connection, or simply seek when I need to write? (With a memory-mapped file this would be really easy, since I could simply keep several pointers.) Note: I'll probably be using Qt, but this is a general question so I left code out of it.
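
    The question is about C++, but the Range/resume handshake itself is language-agnostic; here is a rough sketch of it using Python's requests library (the URL, file name and chunk size are placeholders). A server that ignores Range answers 200 instead of 206 Partial Content, in which case the sketch falls back to a full download:

        import os
        import requests

        def download_with_resume(url, dest, chunk_size=64 * 1024):
            # Resume from whatever is already on disk
            offset = os.path.getsize(dest) if os.path.exists(dest) else 0
            headers = {'Range': 'bytes=%d-' % offset} if offset else {}
            with requests.get(url, headers=headers, stream=True, timeout=30) as resp:
                resp.raise_for_status()
                if offset and resp.status_code != 206:
                    offset = 0                     # server ignored Range: start over
                with open(dest, 'ab' if offset else 'wb') as f:
                    for chunk in resp.iter_content(chunk_size):
                        f.write(chunk)

        download_with_resume('http://example.com/big.bin', 'big.bin')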

    Read the article

  • Calculate total batch upload transfer percent with limited information

    - by GONeale
    Hi there, I have a system which uploads files to a server one by one and displays a progress bar for the current file, and underneath it a second progress bar which should indicate the percentage of the whole batch that is complete across all queued files. The information and algorithms I can work with are: Bytes Sent / Total Bytes To Send drives the first progress bar (e.g. 512 KB of 1024 KB = 50%). That works fine. However, suppose I have two more files left to upload but both of their sizes are unknown (the size is only known once a file is about to start uploading, at which point it is compressed and its size is determined): how would I go about driving my second progress bar? I didn't think this would be possible, since I would need "Total Bytes Sent" / "Total Bytes To Send" to replicate the logic of my first progress bar on a larger scale. I did get a version working with "current file number" / "total number of files to send", which gives the percentage of the way through the batch, but it obviously doesn't update incrementally and is pretty crude. On further thought I figured that if I could incorporate the current file's percentage into this algorithm I could get the correct progress percentage for the batch. I tried ("current file number" / "total number of files to send") * ("Bytes Sent" / "Total Bytes To Send"), but to no avail (sorry to any math heads, it's probably quite apparent why it won't work). For example, I thought I was on the right track when testing: 2/3 (2nd of 3 files) = 66%, which seemed right so far, but when I multiplied by 0.20 (to indicate that only 20% of the 2nd file has uploaded) I ended up back at 13%, when what I need is only a little over 33%. I did try the inverse with 0.80, and also (2/3 * (2/3 * 0.2)). Can this be done without knowing the total number of bytes in the batch? Please help! Thank you!
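
    One way to look at it, without knowing the batch's total byte count up front, is to treat every file as an equal share of the batch and add the fractional progress of the file currently uploading. A small sketch of that formula, written in Python just for illustration:

        def batch_progress(files_completed, total_files, bytes_sent, total_bytes_this_file):
            # Each file counts as 1/total_files of the batch; the in-flight file
            # contributes its own fraction of that share.
            current = bytes_sent / total_bytes_this_file if total_bytes_this_file else 0.0
            return (files_completed + current) / total_files

        # 1 file finished, 20% of the way through the 2nd of 3 files -> 0.4 (40%)
        print(batch_progress(1, 3, 20, 100))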

    Read the article

  • Force-download files are broken: are the headers wrong?

    - by Sinan
    if ($_POST['mode'] == "save") {
        $root = $_SERVER['DOCUMENT_ROOT'];
        $path = "/mcboeking/";
        $path = $root.$path;
        $file_path = $_POST['path'];
        $file = $path.$file_path;
        if (!file_exists($file)) {
            die('file not found');
        } else {
            header("Cache-Control: public");
            header("Content-Description: File Transfer");
            header('Content-Type: application/force-download');
            header("Content-Disposition: attachment; filename=\"".basename($file)."\";");
            header("Content-Length: ".filesize($file));
            readfile($file);
        }
    }

    As soon as I download the file and open it I get an error message. When I try to open a .doc I get the message "file structure is invalid", and when I try to open a JPG: "This file can not be opened. It may be corrupt or a file format that Preview does not recognize." But when I download PDF files, they open without any problem. Can someone help me? P.S. I tried different headers, including: header('Content-Type: application/octet-stream');

    Read the article

  • Read/Write Files from the Content Provider

    - by drum
    I want to be able to create a file from my content provider, but I get the following error:

        java.io.FileNotFoundException: /0: open file failed: EROFS (read-only file system)

    What I am trying to do is create a file whenever an application calls the insert method of my provider. This is the excerpt of the code that does the file creation:

        FileWriter fstream = new FileWriter(valueKey);
        BufferedWriter out = new BufferedWriter(fstream);
        out.write(valueContent);
        out.close();

    Originally I wanted to use openFileOutput(), but that function appears to be undefined. Does anyone have a workaround for this problem? EDIT: I found out that I had to specify the directory as well. Here is a more complete snippet of the code:

        File file = new File("/data/data/Project.Package.Structure/files/" + valueKey);
        file.createNewFile();
        FileWriter fstream = new FileWriter(file);
        BufferedWriter out = new BufferedWriter(fstream);
        out.write(valueContent);
        out.close();

    I also enabled the permission <uses-permission android:name="android.permission.WRITE_INTERNAL_STORAGE" />. This time I got an error saying:

        java.io.IOException: open failed: ENOENT (No such file or directory)

    Read the article

  • PHP: parse $_FILES[] data in multidimensional array

    - by superUntitled
    I have been looking around for an answer to this and have not found one anywhere; I am hoping someone has done this before! I have a form that allows dynamic duplication of its fields. The form allows both file uploads and text input, so the data is sent in both the $_POST and $_FILES arrays. The initial set of inputs looks like this:

        <input type="text" name="primary[1][text]" />
        <input type="file" name="primary[1][file]" />
        <input type="text" class="a" name="secondary[1][text][]" />
        <input type="file" name="secondary[1][file][]" />

    When duplicated, the fields are incremented and look like this:

        <input type="text" name="primary[2][text]" />
        <input type="file" name="primary[2][file]" />
        <input type="text" class="a" name="secondary[2][text][]" />
        <input type="file" name="secondary[2][file][]" />

    To complicate matters, the "secondary" form fields can also be duplicated (thus the [] at the end of the secondary name array). How can I parse the posted $_FILES array? I have tried something like this:

        foreach ($_FILES['question'] as $f_num) {
            echo $f['file']['name'];
        }

    but I get an "Undefined index: file... " error.

    Read the article

  • How to give INSTALLDIR folder permission in WIX?

    - by tete
    I am designing a WiX 3.6 installer project. During installation we need to grant the user create-file permission on the install folder (INSTALLDIR); with the default install folder under Program Files in particular, the user normally can't create files there, and we've seen some failures because of it. I guess this can be achieved with a Permission element that has the CreateFile attribute set. However, INSTALLDIR is a directory, and only elements such as CreateFolder, File, FileShare, Registry and ServiceInstall can have a Permission child. So could anyone tell me how to do that? My directory declaration is something like this:

        <Directory Id="TARGETDIR" Name="SourceDir">
          <Directory Id="ProgramFiles64Folder">
            <Directory Id='MANUFACTUREFOLDER' Name='$(var.ManufacturerName)'>
              <Directory Id="INSTALLDIR" Name="$(var.ProductName)">

    Thanks!

    Read the article

  • ASP.NET Load unmanaged dll from bin folder

    - by Quandary
    Question: I use an embedded Firebird database in ASP.NET. Firebird has a .NET wrapper around native DLLs. The problem is that during the .NET compilation and execution process the DLLs get shadow-copied to a temporary folder, but only the .NET DLLs, not the native DLL (see http://msdn.microsoft.com/en-us/library/ms366723.aspx for details). This makes it necessary to put the unmanaged DLL somewhere in the system32 directory (or any other directory on the PATH environment variable). I want to change the wrapper/native DLL (it's open source) so that it loads the native DLL even when it is only in the bin folder. So my problem is: how can I, in .NET, load an unmanaged DLL from an absolute path? The absolute path is determined at runtime, not at compile time...
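
    A common mechanism for this kind of problem is the Win32 pair SetDllDirectory/LoadLibrary, which a .NET wrapper would reach through P/Invoke. Purely to illustrate those two calls (not the .NET code itself), here is a sketch using Python's ctypes on Windows; the path and DLL name are placeholders:

        import ctypes

        bin_dir = r'C:\inetpub\myapp\bin'                  # placeholder path to the app's bin folder
        # Put the bin folder on the native DLL search path...
        ctypes.windll.kernel32.SetDllDirectoryW(bin_dir)
        # ...and/or load the unmanaged DLL directly by absolute path.
        native = ctypes.WinDLL(bin_dir + r'\fbembed.dll')  # raises OSError if it cannot be loaded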

    Read the article

  • Can't read from aspnet_client folder for crystal reports

    - by Hank Allen
    I created a little ASP.NET app to run Crystal Reports. It runs fine from VS 2008 but not when deployed to IIS on Windows 7: the toolbar images are not rendered. The problem seems to be that I can't read from the aspnet_client folder, even though I've made it into a virtual directory. I can't even read images I put in there just to check whether the folder can be read from an ASP page. I also made sure the IIS user can read from there. I'm stumped.

    Read the article

  • htaccess not called when the url points to an existing folder

    - by Eldad
    Hi, I'm running Zend Server on Windows 7. I'm using the .htaccess from Joomla:

        Options +FollowSymLinks
        RewriteEngine On
        RewriteCond %{QUERY_STRING} mosConfig_[a-zA-Z_]{1,21}(=|\%3D) [OR]
        RewriteCond %{QUERY_STRING} base64_encode.*\(.*\) [OR]
        RewriteCond %{QUERY_STRING} (\<|%3C).*script.*(\>|%3E) [NC,OR]
        RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
        RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
        RewriteRule ^(.*)$ index.php [F,L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_URI} !^/index.php
        RewriteCond %{REQUEST_URI} (/|\.php|\.html|\.htm|\.feed|\.pdf|\.raw|/[^.]*)$ [NC]
        RewriteRule (.*) index.php
        RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization},L]

    When I call the URL http://localhost/ABC/ the request is redirected to index.php, but if I create the folder ABC the server shows the contents of the ABC folder instead of redirecting the request back to index.php. How can I prevent that? I want all requests to be directed to index.php. Thanks

    Read the article

  • Getting a sent MailMessage into the "Sent Folder"

    - by Robert Reid
    I'm sending MailMessages with an SmtpClient (they are delivered successfully) through an Exchange server, but I would like my sent emails to also appear in the Sent folder of the address I'm sending them from, which isn't happening.

        using (var mailMessage = new MailMessage("[email protected]", "[email protected]", "subject", "body"))
        {
            var smtpClient = new SmtpClient("SmtpHost")
            {
                EnableSsl = false,
                DeliveryMethod = SmtpDeliveryMethod.Network
            };
            // Apply credentials
            smtpClient.Credentials = new NetworkCredential("smtpUsername", "smtpPassword");
            // Send
            smtpClient.Send(mailMessage);
        }

    Is there a configuration I'm missing that will ensure all of my sent emails from "[email protected]" arrive in their Sent folder?
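
    For context, SMTP itself has no notion of a Sent folder; a copy normally lands there because the sending client saves one into the mailbox afterwards (over IMAP or, for Exchange, EWS). A sketch of that extra step using Python's imaplib, just to show the idea; the host, credentials, folder name and message are placeholders:

        import imaplib
        import time

        raw_message = b"From: [email protected]\r\nTo: [email protected]\r\nSubject: subject\r\n\r\nbody"

        imap = imaplib.IMAP4_SSL('mail.example.com')
        imap.login('smtpUsername', 'smtpPassword')
        # APPEND the already-sent message into the Sent mailbox
        imap.append('Sent Items', '\\Seen', imaplib.Time2Internaldate(time.time()), raw_message)
        imap.logout()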

    Read the article

  • Change Check Out Folder for checked out files in SourceSafe

    - by Town
    I had to rebuild my machine and went from XP to Windows 7. I've now got a bit of an issue: I had files checked out in SourceSafe previously, and I still have copies of them in the local folder on my new install. However, SourceSafe still has them checked out to the old XP folder (c:\documents and settings etc.) whereas the files now reside under c:\Users. Pending Checkins in Visual Studio now thinks I have nothing checked out, and SourceSafe declares that the files are checked out to me under the c:\documents and settings\ path. Is there any way to tell SourceSafe to simply "look over there" for the files instead? It works if I individually undo and redo the checkout on each file, but that's a lengthy process and one I'd like to avoid if possible. If I simply check out the files individually, they are listed as checked out to me twice, once for each location. Any pointers would be very much appreciated!

    Read the article

  • problem showing pictures stored outside web root folder

    - by David
    On a website, users can upload pictures. For security reasons these are stored outside the webroot (public_html) folder. When I need to display a picture, I send the headers and have readfile() read and output the picture data, like so:

        header("Pragma: public");
        header("Expires: 0"); // set expiration time
        header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
        header('Content-type: image/jpg');
        header('Content-Length: ' . $filesize);
        readfile($path_url . '/' . $photo);

    This works great, but the site is growing and this is starting to be a burden on the server. Question: is there a way to send the picture data to the user without the server-side script first having to load the picture (with the picture obviously still being stored outside the webroot folder)? Thanks! David
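
    One widely used way to take the script out of the data path is to let the web server itself stream the file: the application only emits an X-Sendfile header (Apache with mod_xsendfile) or X-Accel-Redirect (nginx) pointing at the protected file. A sketch of the idea as a minimal Python WSGI app; the paths, and which header applies, depend entirely on the server setup:

        def application(environ, start_response):
            # The web server intercepts the header below and streams the file
            # itself; the application never reads the image bytes.
            start_response('200 OK', [
                ('Content-Type', 'image/jpeg'),
                ('X-Sendfile', '/var/uploads/photos/photo.jpg'),   # Apache + mod_xsendfile
                # ('X-Accel-Redirect', '/protected/photo.jpg'),    # nginx equivalent
            ])
            return [b'']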

    Read the article

  • Javascript - why do I sometimes fail to read file content with GDownloadUrl?

    - by Daj pan spokój
    Hi everybody. I try to read a file with Google's GDownloadUrl and it only works from time to time. Failure means fileRows == "blah blah"; success means fileRows == (the real file content). I've noticed, however, that when I pause execution (with Firebug) on line 3 for a couple of seconds, it succeeds more often. Maybe it is some kind of threading bug, then? Do you guys have any tip or idea?

        1 var fileContent = "blah blah";
        2 availabilityFile = "input/available/" + date + ".csv";
        3 GDownloadUrl(availabilityFile, function(fileData) {
        4     fileContent = fileData;
        5 });
        6 fileRows = fileContent.split("\n");

    Read the article

  • Android "gen" folder and SVN - bitter enemies.

    - by Benju
    It seems that I accidentally checked in the "gen" folder from an Android project (this folder contains the generated R.java class). When I realized this I deleted it from SVN and tried to ignore it. Now I am getting the error:

        Could not add gen to the ignore list!
        Working copy 'C:\code\guru' locked.

    When I try to run a cleanup command I get this:

        Cleanup failed to process the following paths:
        -C:\code\guru
        'C:\code\guru\gen' is not a working copy directory.

    When I try to run a resolve I get this:

        Working copy 'C:\code\guru' locked
        Please execute the 'Cleanup' command.

    We are currently on SVN 1.6 on the server.

    Read the article

  • Secondary Domain Adds Extra Folder in URL during Postbacks

    - by Joshua
    My ASP.NET website (C#, .NET 3.5, IIS 7) is hosted at GoDaddy, with multiple sites on the account. Currently, when I perform postbacks or Response.Redirects on a secondary website, the following URL appears in the address bar: www.mywebsite.com/webfolder/default.aspx, where "webfolder" is the sub-directory on the server where the website is hosted (i.e. ServerRoot/webfolder). The site seems to work with or without the folder in the URL. Is there a way to remove the folder from the URLs during postback? I think I have to use URL rewriting (which GoDaddy supports via Microsoft's URL Rewrite Module), but I'm not sure how.

    Read the article

  • iCloud + Storage of media in iPhone Documents folder

    - by Michael Morrison
    I, like many developers, recently got an email from Apple stating that we should move our data out of the Documents directory into another folder to permit more streamlined backup to iCloud:

        In recent testing it appears that [your app] stores a fair amount of data in its Documents folder. Since iCloud backups are performed daily over Wi-Fi for each user's iOS device, it's important to ensure the best possible user experience by minimizing the amount of data being stored by your app.

    Marco Arment, of Instapaper fame, has a good take on the issue: the recommended location for storing downloadable files is /Library/Caches. However, the problem is that both /tmp and /Caches can be 'cleaned' any time the OS decides that the device is running low on storage. If your app is cleaned, then the data downloaded by your app and stored by your user is gone. Naturally, the user will blame you and not Apple. What to do?

    Read the article

  • Selecting the App Pool for a web custom folder in a Web Setup Project (Visual Studio)

    - by Oobertom
    I've got a Web Setup Project in VS 2008 that takes the files for two web applications and turns them into a single setup package. This works, and I have it asking the user to select the application pool, but the application pool is only applied to the project sitting in the Web Application Folder and not to the one in the Web Custom Folder that I added for the second project. How do I force it to set both applications to the same app pool? Thanks in advance for any help on this; it seems like it should be simple, but I've been mucking around with it for ages to no avail.

    Read the article
