Search Results

Search found 89549 results on 3582 pages for 'large file support'.

  • Implementing Qt File Dialog with a Different File System Library (boost)

    - by knight
    Hi, I am writing an application which requires me to use different file system and file engine handlers instead of Qt's default ones. Basically, what I want is to use Qt's file dialog but have an underlying file system handler of my own (for example, one built on the Boost Filesystem library) performing all the file and directory operations within that dialog. I have already written a custom file engine which handles some of the operations, but I am now stuck with Qt's file system model and the file system watcher engine, as I need their signals transmitted for this custom file engine. It seems I have a daunting task ahead. Am I heading in the right direction? Is there any other, simpler way I could implement this? Can anyone give me an idea of how to proceed? I was thinking of looking into proxy models, but I'm not sure if that would work. Thanks in advance for any help.

  • How can I set 'Print to File' as my default printing option?

    - by edm
    At the moment when I print, my DeskJet 3050 is selected as the default printer. I would like 'Print to File' to be the default 'printer', without using cups-pdf. I specifically do not want to use cups-pdf because of the way it renders text (see below). I am not entirely sure what it is doing, but it seems to render the text as bitmaps and embed those in the PDF: I am not able to highlight, copy or search the text, as I am with a PDF produced by the standard Print to File. N.B. this is not a dupe of: Can I make PDF the default for 'print to file'

  • Is it possible to save the output of a command to a file after the command has already been executed?

    - by NES
    Does an elegant way exist to save the output of a command to a file after the fact, as long as the terminal window is still open? I mean once the command has already been executed in the terminal and its output is still there. I could copy and paste all the lines and save them to a file, but perhaps a method exists to write the output buffer of a terminal window to a file, or better still, the output of an already executed command?

  • Android: read in binary data and write it to a file

    - by Shpongle
    Hi all, I'm trying to read an image file in from a server with the code below. It keeps going into the exception. I know the correct number of bytes is being sent, as I print it out when received. I'm sending the image file from Python like so:

        # open the image file and read it into an object
        imgfile = open(marked_image, 'rb')
        obj = imgfile.read()
        # get the number of bytes in the image and convert it to a string
        bytes = str(len(obj))
        # send the number of bytes
        self.conn.send(bytes + '\n')
        if self.conn.sendall(obj) == None:
            imgfile.flush()
            imgfile.close()
            print 'Image Sent'
        else:
            print 'Error'

    Here is the Android part; this is where I'm having the problem. Any suggestions on the best way to go about receiving the image and writing it to a file?

        // read the number of bytes in the image
        String noOfBytes = in.readLine();
        Toast.makeText(this, noOfBytes, 5).show();
        byte bytes[] = new byte[Integer.parseInt(noOfBytes)];
        // create a file to store the retrieved image
        File photo = new File(Environment.getExternalStorageDirectory(), "PostKey.jpg");
        DataInputStream dis = new DataInputStream(link.getInputStream());
        try {
            os = new FileOutputStream(photo);
            byte buf[] = new byte[1024];
            int len;
            while ((len = dis.read(buf)) > 0)
                os.write(buf, 0, len);
            Toast.makeText(this, "File received", 5).show();
            os.close();
            dis.close();
        } catch (IOException e) {
            Toast.makeText(this, "An IO error occurred", 5).show();
        }

    EDIT: I still can't seem to get it working. I have been at it since, and the result of all my efforts has been either a file that is not the full size or the app crashing. I know the file is not corrupt before sending, server side. As far as I can tell it's definitely sending too, as the sendall method in Python sends everything or throws an exception in the event of an error, and so far it has never thrown an exception. So the client side is messed up. I have to send the file from the server, so I can't use the suggestion suggested by Brian.
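
    One likely culprit here (not stated in the thread, so treat it as an assumption): the code above wraps the socket in both a buffered reader (in.readLine()) and a separate DataInputStream, so the reader's internal buffer can swallow the first chunk of the image, and the allocated bytes array is never actually used. A minimal sketch of reading the length header and then exactly that many bytes from a single stream, using DataInputStream.readFully, might look like this (the class and method names are hypothetical, not the poster's code):

        import java.io.DataInputStream;
        import java.io.IOException;
        import java.io.InputStream;

        public final class ImageReceiver {
            // Reads a '\n'-terminated decimal length header, then exactly that
            // many payload bytes from the same stream, so no bytes are lost in
            // a second buffering layer.
            public static byte[] readImage(InputStream raw) throws IOException {
                DataInputStream dis = new DataInputStream(raw);
                StringBuilder header = new StringBuilder();
                int c;
                while ((c = dis.read()) != -1 && c != '\n') {
                    header.append((char) c); // accumulate the ASCII length header
                }
                int size = Integer.parseInt(header.toString().trim());
                byte[] data = new byte[size];
                dis.readFully(data); // blocks until all 'size' bytes have arrived
                return data;
            }
        }

    Calling readImage(link.getInputStream()) and writing the returned array with a single FileOutputStream.write(data) sidesteps the short-read problem: readFully either fills the buffer completely or throws an EOFException, whereas a plain read loop can return early.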

  • Reading from a very large table using multiple threads (Java) and writing the rows to a single file

    - by user2534926
    I am currently facing a situation where I have a table with almost 80 million rows, and I have to take a dump of that table and store it in a CSV file. Currently I am using a not-so-professional approach (a Perl script with the DBI interface, printing the values to stdout and redirecting that to a CSV file). Now I am planning to use a Java threading approach. Can you suggest a way forward? Thanks in advance.
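
    One common shape for this, as a sketch under stated assumptions rather than a drop-in answer: several reader threads each scan a disjoint primary-key range over their own JDBC connection and push formatted lines onto a bounded queue, while a single writer thread is the only code that touches the file. Everything specific below is a placeholder: the JDBC URL, credentials, and the big_table/id/payload names are invented, the key is assumed numeric and roughly contiguous, and CSV escaping is elided.

        import java.io.BufferedWriter;
        import java.io.FileWriter;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.util.concurrent.BlockingQueue;
        import java.util.concurrent.LinkedBlockingQueue;

        public final class ParallelTableDump {
            private static final String POISON = "\u0000EOF"; // sentinel: one per reader
            private static final String URL = "jdbc:mysql://localhost/mydb"; // placeholder DSN

            public static void main(String[] args) throws Exception {
                final long maxId = 80_000_000L; // assumed numeric, roughly contiguous key
                final int readers = 4;
                final BlockingQueue<String> queue = new LinkedBlockingQueue<>(10_000);

                // Single writer thread: the only code that touches the output file.
                Thread writer = new Thread(() -> {
                    try (BufferedWriter out = new BufferedWriter(new FileWriter("dump.csv"))) {
                        int finished = 0;
                        while (finished < readers) {
                            String line = queue.take();
                            if (POISON.equals(line)) { finished++; continue; }
                            out.write(line);
                            out.newLine();
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                });
                writer.start();

                // Reader threads: each scans a disjoint id range on its own connection.
                Thread[] pool = new Thread[readers];
                final long slice = maxId / readers + 1;
                for (int i = 0; i < readers; i++) {
                    final long lo = i * slice;
                    final long hi = lo + slice;
                    pool[i] = new Thread(() -> {
                        try (Connection c = DriverManager.getConnection(URL, "user", "pass");
                             PreparedStatement ps = c.prepareStatement(
                                 "SELECT id, payload FROM big_table WHERE id >= ? AND id < ?")) {
                            ps.setFetchSize(1000); // hint to stream rows; driver-dependent
                            ps.setLong(1, lo);
                            ps.setLong(2, hi);
                            try (ResultSet rs = ps.executeQuery()) {
                                while (rs.next()) {
                                    // CSV escaping elided for brevity
                                    queue.put(rs.getLong(1) + "," + rs.getString(2));
                                }
                            }
                        } catch (Exception e) {
                            e.printStackTrace();
                        } finally {
                            try {
                                queue.put(POISON); // always tell the writer we are done
                            } catch (InterruptedException ignored) {
                                Thread.currentThread().interrupt();
                            }
                        }
                    });
                    pool[i].start();
                }
                for (Thread t : pool) t.join();
                writer.join();
            }
        }

    The single writer removes any need to synchronize file access, and the bounded queue keeps memory flat if the readers outpace the disk. Whether parallel reads actually beat one streaming scan depends on the database and its storage, so it is worth benchmarking against a single-threaded SELECT first.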

  • What file format/database format does Picasa use?

    - by Raymond
    I am trying to figure out what file formats the .db and .pmp files are. I tried using db_dump (Berkeley DB) on the .db files, but it seems that they are not Berkeley DB, or are from an older version. I have no idea what the .pmp files are.

        Directory of C:\Users\me\AppData\Local\Google\Picasa2\db3

        6/09/2010  08:07 PM           303,748 imagedata_uid64.pmp
        1/18/2010  10:34 PM             4,885 imagedata_unification_lhlist.pmp
        6/09/2010  10:55 PM           155,752 imagedata_width.pmp
        6/09/2010  10:55 PM     1,286,346,614 previews_0.db
        6/10/2010  10:06 AM           467,168 previews_index.db

    Any help appreciated.

  • robocopy: transfer a file, not a folder

    - by Bill McKay
    I'm trying to use robocopy to transfer a single file from one location to another, but robocopy seems to think I'm always specifying a folder. Here is an example:

        robocopy "c:\transfer_this.txt" "z:\transferred.txt"

    But I get this error instead:

        2009/08/11 15:21:57 ERROR 123 (0x0000007B) Accessing Source Directory c:\transfer_this.txt\

    (Note the '\' at the end of transfer_this.txt.) If I treat it like an entire folder:

        robocopy "c:\folder" "z:\folder"

    it works, but then I have to transfer everything in the folder. How can I transfer only a single file with robocopy?

  • Digital Asset Management, iPhoto / Aperture server... alternative

    - by Sisyphus
    Afternoon. Clients, 10: all Apples running either Leopard or Snow Leopard. Server: Snow Leopard Server (and I have an old Dell PowerEdge 650 at home running Gentoo 2.6, if anybody has a Linux solution).

    The situation: I work in a small design company with 8 people. At present we are looking to consolidate all our image files in one location; right now we each use our preferred single-user DAM solution, be it Adobe Bridge or iPhoto/Aperture (some don't bother at all). The file types commonly used are .psd, .pdf, .eps, .tiff, .jpg and RAW image files. Ideally, what is needed:

    - Centralised on one server, but allowing us to search via Spotlight (not essential, but would be nice)
    - Searchable metadata such as date, location and title
    - Open source, or as low-cost as possible
    - Simultaneous users importing files

    So far I have looked at a few open-source DAM systems, such as Razuna, Gallery (not strictly DAM), ResourceSpace and Notre-DAM; while these are brilliant and open source, they don't integrate as smoothly with the desktop as iPhoto and Aperture do. For iPhoto and Aperture, I have tried creating a shared library on the server (a tad laggy), and also putting a library on a drive with no permissions and letting each client read from it; however, if anyone wants to put images into the library, it only supports one user at a time writing to it... Any ideas what could fulfil our needs? Or is it time to bite the bullet for Final Cut Server? Thanks in advance.

  • Atlassian Crucible very slow on large repository

    - by Mitch Lindgren
    Hi everyone, my company has been running a trial of Atlassian Crucible for some months now. For repositories where it's working properly, users have given very positive feedback about the tool. The problem I'm having is that we have several different projects, each with its own repository, and some of those repositories are very large. One repository in particular has a large number of branches and probably around 9,000 files per branch. Browsing that repository in Crucible is extremely slow.

    Crucible is running on a CentOS VM. The VM has 4 GB of RAM, and I've set Crucible's maximum at 3 GB, of which it is currently using 2 GB. I've brought this up in a support ticket with Atlassian, and they suggested the following: "In particular, because you have a rather large SVN repository, you will likely find that FishEye will be creating a large index file on disk. To help improve performance a few things you can try are:

    - Increasing the memory available to FishEye (see the document above).
    - Migrating to an external database: confluence.atlassian.com/display/FISHEYE/Migrating+to+an+External+Database
    - Excluding files and directories that aren't needed from your index: confluence.atlassian.com/display/FISHEYE/Allow+(Process)"

    (Sorry for not hyperlinking; I don't have the rep.) I've tried all of these things to an extent, but so far none have helped greatly. I was originally running Crucible on a Windows box with 2 GB of RAM, using the built-in HSQL DB. Moving to MySQL on CentOS brought a performance increase for some repositories and made Crucible much more stable, but did not seem to help much with our biggest repository. There are only so many files/branches I can exclude from indexing while maintaining the tool's usefulness. That being the case, does anyone have any tips on how to speed up Crucible on large repositories, without investing in insanely powerful hardware? Thanks!

    Edit: To clarify, since I didn't mention it explicitly above, I am using FishEye.

  • How to make ClearType look OK even with large fonts

    - by Sorin Sbarnea
    I discovered that ClearType has a negative impact on large/huge fonts. Just take a look at the http://patterns.dataincubator.org/book/ page (compare the title in huge fonts with the normal text). If you are on Windows 7, you can use Win+Plus/Minus to zoom in and out. I'm looking for a configuration that would make both small fonts and large fonts look good. The system is Windows 7, but I suppose you could replicate the problem on Vista and even XP if you activate ClearType. Current results:

    - ClearType on and tuned: small fonts look good, large ones look bad
    - ClearType off: small fonts look bad, large ones look OK (smoothed)

  • Extract large zip file (50 GB) on Mac OS X

    - by chingjun
    I was trying to move some files to another hard drive, so I archived all my photos in one large ZIP file using the Mac OS X built-in compress function. But the file fails to extract. I've tried many programs, but none of them were able to extract the file: Mac OS X's extract utility, StuffIt Expander and 7-Zip (command line) all failed. Mac's Archive Utility and StuffIt don't seem to support large files, and 7-Zip's command-line version gave an error stating the archive was unsupported. I have no luck in Windows either, as many of my files have Chinese filenames and wouldn't extract to the correct names under Windows. Are there any programs that support large files, can handle files compressed with Mac OS X's compress function, and support UTF-8 filenames? With or without a GUI is fine.

    Update: Well, I had made the wrong decision in compressing the files, and it's already too late. I thought I would be able to extract the file if I could compress it. It's too late: the original copies are gone, and only a large ZIP file is left here. I have tried using 'unzip', but it says "End-of-central-directory signature not found". I guess it doesn't have large file support either. I would try the Windows Vista method as stated by SuperMagic, but I need to borrow a computer for that. Anyway, thank you everyone, but please provide more suggestions on what software could possibly extract that file.

  • Free-as-in-beer binary file format inspector

    - by fbrereto
    I am looking for a utility that gives me the ability to specify a binary file format and then interpret a file of bytes according to that format (something along the lines of the 010 Editor, but infinitely more cost-effective). Something that runs on Mac OS X would be preferred, but I'm interested to see what all is out there in general (while more of a hassle, I'd be willing to run a tool on Windows if it were superior). What's your preference?

  • Utility to unmap a network drive when the screen saver starts

    - by JimR
    I'm looking for a way to unmap network drives when the screen saver turns on. I have a few users who share an external, encrypted drive (a Samba share, not Windows), and they have a requirement to disconnect the drive mapping when the local machine is idle. I'd also like it to warn them if there are open files on the mapped drive, if possible. There is also a requirement to force the password to be re-entered before mapping the drive again when the machine comes back from idle. Is there a Windows setting or utility out there in the wild that meets these requirements?

  • Copying large files from USB devices to the internal hard drive fails on Mac OS

    - by John M. P. Knox
    I have a second-generation 13" MacBook Air running Mac OS X 10.6.6, with a 2.13 GHz processor, 4 GB of RAM, and a 256 GB SSD. I often get failures when I attempt to copy a large file, or a large collection of files, from an external USB drive (typically a "FireWire"-generation Drobo) to the internal drive. The failure behaves almost exactly as if I had pulled the USB cable from the computer in mid-transfer: I get a warning that I have removed the hard disk improperly, and after this the drive no longer appears mounted in the Finder, so I have to unplug and reinsert the USB cable to mount it again.

    I have also seen a similar problem when using Aperture 3 to import a large number of photos and videos from a USB CompactFlash card reader: the import will fail and I have to unplug the card reader and import the missing items. Oddly, reversing the direction of the copy seems pretty reliable; I've never had a problem copying a large file to a USB device, which means I have quite a few large files stranded on my Drobo.

        Model Identifier: MacBookAir3,2
        Boot ROM Version: MBA31.0061.B01

    I have seen a similar issue reported on Apple's website: http://discussions.apple.com/thread.jspa?threadID=2648590&tstart=0 The only suggested resolutions there seem to be switching to another form of connectivity (e.g. FireWire, which does not exist on the MacBook Air), downgrading to Mac OS 10.6.4, or reverting the USB kernel extensions to the 10.6.4 versions: http://discussions.apple.com/message.jspa?messageID=12566073#12582956 I'm not too keen on the idea of downgrading kernel extensions. Does anyone know of a hardware revision without this issue that I can trade up to? Are there any other potential solutions out there?

  • SQL Server 2005, huge LDF file

    - by Scott Jackson
    Hi, I have a database running on SQL Server 2005. The database is 20 GB and the LDF file is 35 GB! I'm now running low on disk space and want to shrink the log file. How can I do this, and how can I stop it from happening again? Many thanks, Scott

  • What OS would you use for a simple fileserver and why?

    - by barfoon
    Hey everyone, I'm looking to set up a simple fileserver:

    - 5-7 clients: mixed Windows, Linux and Mac OS X, connecting over wireless and wired
    - Serving ~200 GB of content: photos, MP3s, ISOs, etc.

    What OS would you recommend for this fileserver? I understand XP limits the number of concurrent connections when connecting to different shares, so it probably isn't the best choice. Any recommendations are appreciated. Thank you.

  • Postfix won't pipe to PHP file through aliases file

    - by jfreak53
    I'm trying to pipe from Postfix to a command. According to the Postfix logs it worked, but when I check the command it didn't. This is a fresh Postfix install. This is my aliases file:

        # See man 5 aliases for format
        postmaster: root
        support: "| /usr/bin/php -q /var/www/pipe/pipe.php"

    I run sendmail [email protected], then type the message, and then on a separate line type . and it goes. I check the Postfix log /var/log/mail.log and this is what it states:

        Nov 2 15:32:33 server3 postfix/local[13284]: 42C429E0B5: to=<[email protected]>, relay=local, delay=156, delays=156/0.01/0/0.05, dsn=2.0.0, status=sent (delivered to command: /usr/bin/php -q /var/www/pipe/pipe.php)

    So according to that it worked, but it doesn't. If I run

        echo 'text' | /usr/bin/php -q /var/www/pipe/pipe.php

    it does work just fine. Any ideas what I did wrong? I know piping is working; I originally checked it by running the alias above WITHOUT the quotes, so just:

        support: | /usr/bin/php -q /var/www/pipe/pipe.php

    What that did was append my email, header and all, to the file pipe.php, so I know Postfix was piping it. But when I put in the quotes, the log says the mail is delivered to the command, yet nothing reaches my script.

  • Mac keyboard shortcut to rm file

    - by MattDiPasquale
    What's the Mac keyboard shortcut to rm a file? I know Command+Delete sends it to the Trash, but I want to permanently delete it, say with Command+Fn+Delete.

    UPDATE: It doesn't look like there is one, so I want to create a service with Automator and then assign a keyboard shortcut to it in System Preferences. I can get as far as Automator - Service - "Service receives selected files or folders in Finder.app", but how do I write the script that then runs rm -rf #{file/folder name}?
