Search Results

Search found 37883 results on 1516 pages for 'sparse files'.

Page 27 of 1516

  • Uploading Files Using ASP.NET Web Forms, Generic Handler and jQuery

    - by bipinjoshi
    In order to upload files from the client machine to the server, ASP.NET developers use the FileUpload server control. The FileUpload server control essentially renders an INPUT element with its type set to file and allows you to select one or more files. The actual upload operation is performed only when the form is posted to the server. Instead of making a full page postback, you can use jQuery to make an Ajax call to the server and POST the selected files to a generic handler (.ashx). The generic handler can then save the files to a specified folder. The remainder of this post shows how this can be accomplished. http://www.bipinjoshi.net/articles/f2a2f1ee-e18a-416b-893e-883c800f83f4.aspx

    Read the article

  • Approach to retrieve files from server

    - by Aerus
    I'm in the process of making a Java application with a corresponding update application. At any given time the user may want to update the application, and the updater will ask for a list of files of the latest release. Based on this list, the updater can determine which files need to be downloaded to complete the update. I now have two approaches to solve this, but I would like to know which approach will put the least stress on my application and server:
    1. I send a list of the files I want to download to my server; the server zips those files and simply returns the compressed file to the application.
    2. The updater sends a request for each separate file to the server, which simply returns the file.
    The application will be used mainly in Belgium and The Netherlands, and connections/bandwidth tend to be pretty decent here. The average size of a single file should be around 100 KB and at most 1 MB. I expect an update to have anywhere between 10 and 50 new files, and at most 100 people per day to update the application, i.e. in the week when a new version is released. I hope this is enough information to sketch my problem, and any advice is welcome. If there is another common way to tackle this, I'd be glad to hear it.
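
    For the first (zip) approach, here is a minimal client-side sketch, assuming a hypothetical /update endpoint (the example.com URL is a placeholder) that accepts a newline-separated list of file names and streams back a ZIP of exactly those files:

        import java.io.IOException;
        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.nio.charset.StandardCharsets;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.StandardCopyOption;
        import java.util.List;
        import java.util.zip.ZipEntry;
        import java.util.zip.ZipInputStream;

        public class UpdateDownloader {

            // POSTs the list of wanted files and unpacks the ZIP the server sends back.
            public static void downloadUpdate(List<String> wantedFiles, Path targetDir) throws IOException {
                URL endpoint = new URL("http://example.com/update"); // placeholder URL
                HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true);

                // One file name per line; the server is assumed to zip exactly these files.
                try (OutputStream out = conn.getOutputStream()) {
                    out.write(String.join("\n", wantedFiles).getBytes(StandardCharsets.UTF_8));
                }

                // Stream the ZIP response and extract each entry into the target directory.
                try (ZipInputStream zip = new ZipInputStream(conn.getInputStream())) {
                    ZipEntry entry;
                    while ((entry = zip.getNextEntry()) != null) {
                        if (entry.isDirectory()) continue;
                        Path dest = targetDir.resolve(entry.getName()).normalize();
                        Path parent = dest.getParent();
                        if (parent != null) Files.createDirectories(parent);
                        Files.copy(zip, dest, StandardCopyOption.REPLACE_EXISTING);
                    }
                }
            }
        }

    A single zipped response avoids 10-50 separate HTTP round-trips and compresses reasonably well at these file sizes, at the cost of a little CPU on the server; for these volumes (at most ~100 updates a day) either approach should be light on the server.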

    Read the article

  • No inodes left error, df -i command says contrary

    - by abhinavkulkarni
    I copied a lot of files to my mounted Windows drive from Ubuntu, and subsequently ran into the error Error opening file '/media/windows/<some-file-path>': No space left on device. I checked the output of the df -i command to see if I had run out of inodes for the mounted Windows drive:

        Filesystem        Inodes    IUsed      IFree IUse% Mounted on
        /dev/sda5        2363904   504119    1859785   22% /
        udev              207621      522     207099    1% /dev
        tmpfs             211487      450     211037    1% /run
        none              211487        3     211484    1% /run/lock
        none              211487        7     211480    1% /run/shm
        none              211487       19     211468    1% /run/user
        /dev/sda2      458686680  2588876  456097804    1% /media/windows

    As the above output shows, lots of inodes are available for the /media/windows drive, and I have plenty of disk space left (around 500 GB). What's the problem then?

    Read the article

  • Converting MOD files to QuickTime or MPEG for Adobe Premiere Pro

    I've been editing lots of videos lately. My company got a video camera, a Canon Legria FS200. It saves the movies in a digital format as MOD files. Unfortunately, Adobe Premiere doesn't work with these files, so I needed software to convert MOD files to QuickTime or MPEG files. I found a good free one: it's called MPEG Streamclip. It works well, it's pretty fast, and it's free. What's not to like?

    Read the article

  • Preferred apache permissions for www files with several authors

    - by user1316464
    I can't for the life of me figure out how to design the permissions scheme for my Apache files. My requirements seem pretty simple:
    - Apache should have standard permissions: RX on directories and R on files
    - Web authors should have RWX on directories and RW on files
    - I don't want to give any access to "other"
    - New files/folders should inherit the proper permissions
    Here are the schemes I've tried:
    1. 570 for directories and 460 for files, owner Apache, group Webdev. The problem here is that new files created by users in the Webdev group are owned by user:Webdev, so Apache can't read them. If Apache were in the Webdev group it would also have the wrong permissions (i.e. it would have write permission on files).
    2. 750 for directories and 640 for files, owner Webdev, group Apache (Webdev is a member of Apache). The problem here is that there is only one Webdev account and I have multiple people who need access to contribute. In theory this would work with only one developer if Webdev were also a member of the Apache group.
    Any ideas?

    Read the article

  • Google Site Search (commercial) not indexing files in sitemap

    - by melat0nin
    I have a client for whom we have purchased Google Site Search. It works well for HTML pages served by the CMS, but files aren't being reliably indexed. I wrote a script to generate an XML feed (sitemap) of all the files in the CMS, which I've plugged into Google Webmaster Tools for the site. It says that 923 URLs have been submitted for that sitemap, but only 26 have been indexed. The client relies heavily on searching within files, which is why we decided to use Google search, so this is a bit of a problem. Many of the files aren't linked to from any page on the site, as they are old and therefore don't merit having a page of their own, but they still need to be accessible through search for archiving purposes. The file archive XML can be found at www.sniffer.org.uk/file-archive and the standard XML sitemap (of pages) can be found at www.sniffer.org.uk/sitemap.xml. Any thoughts would be much appreciated!

    Read the article

  • MacMini (running Ubuntu 14.04) loses WLAN connection when uploading larger files (several 100 MB) via ownCloud

    - by ManekenT
    I installed Ubuntu 14.04 on an old Mac Mini with the intention of running it as a home server. Additionally I installed ownCloud and tried to sync some files, both from a laptop running elementary OS and a desktop running Windows 7. Syncing smaller files works like a charm (4000 files at <10 MB each), but when it comes to bigger files (e.g. a 1 GB Ubuntu ISO) the upload fails after 20-100 MB. I then can't ping the server any more and the server can't ping me, although it still shows up in our router as connected. Disconnecting and reconnecting the WLAN connection fixes the issue until the next sync attempt. Edit: I also had to install the WLAN driver following this guide: https://help.ubuntu.com/community/MacBookPro8-2#Wireless

    Read the article

  • Can rsyslog transfer files present in a directory

    - by Tarun
    I have configured rsyslog and it's working fine, as intended. These are the conf files:
    Server side:

        $template OTHERS,"/rsyslog/test/log/%fromhost-ip%/others-log.log"
        $template APACHEACCESS,"/rsyslog/test/log/%fromhost-ip%/apache-access.log"
        $template APACHEERROR,"/rsyslog/test/log/%fromhost-ip%/apache-error.log"

        if $programname == 'apache-access' then ?APACHEACCESS
        & ~
        if $programname == 'apache-error' then ?APACHEERROR
        & ~
        *.* ?OTHERS

    Client side:

        # Apache default access file:
        $ModLoad imfile
        $InputFileName /var/log/apache2/access.log
        $InputFileTag apache-access:
        $InputFileStateFile stat-apache-access
        $InputFileSeverity info
        $InputRunFileMonitor

        # Apache default Error file:
        $ModLoad imfile
        $InputFileName /var/log/apache2/error.log
        $InputFileTag apache-error:
        $InputFileStateFile stat-apache-error
        $InputFileSeverity error
        $InputRunFileMonitor

        if $programname == 'apache-access' then @10.134.125.179:514
        & ~
        if $programname == 'apache-error' then @10.134.125.179:514
        & ~
        *.* @10.134.125.179:514

    Now, instead of defining each file separately, can I give rsyslog the complete directory so that the client automatically sends all the log files present in /var/log/apache2, and on the syslog server side these files automatically get stored under different filenames?
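
    For what it's worth, a minimal client-side sketch of the directory-monitoring idea, assuming a newer rsyslog (8.25 or later) where imfile's File parameter accepts wildcards via the input() syntax (the legacy $InputFile* directives used above do not); the tag name here is an illustrative choice:

        module(load="imfile")

        # Watch every .log file in the Apache log directory; files matching the
        # wildcard are picked up automatically, including ones created later.
        input(type="imfile"
              File="/var/log/apache2/*.log"
              Tag="apache:"
              Severity="info")

        # Forward everything tagged by imfile to the central server, then discard it locally.
        if $programname == 'apache' then @10.134.125.179:514
        & stop

    Note that all wildcard-matched files share the one tag in this sketch, so splitting them back into per-source files on the server would still need a template keyed on something other than the tag.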

    Read the article

  • Automatically minify and combine JavaScript and CSS files in any web site

    This article describes a complete package that speeds up the loading of JavaScript, CSS, and images in an ASP.NET web site. It accomplishes this by minifying JavaScript and CSS files on the fly (and caching the minified content), combining JavaScript and CSS files, making it easy to load files from cookieless domains, and much more. The article has full step-by-step installation instructions for both IIS 7 and IIS 6 (each version requires different additions to web.config), and it also details how to use all the features, complete with short examples.

    Read the article

  • How to get files that have been added/modified in a batch file

    - by Chris L
    I have the following batch file, which concatenates all of the files in a folder that end in .sql:

        set func=%~dp0%Stored Procedures\*.sql
        for %%i in (%func%) do type "%%i" >>InstallScript.sql

    We use SVN as our repository, and we're using branching. Currently the script concatenates all the .sql files, even the ones that haven't changed. I'd like to change it so it only concatenates files that have been modified and/or created after the branch was created. We could do that by looking at the datetime on the .svn folder in each folder (there are Stored Procedure, View, and Function subfolders), but I don't know how to do that with batch files. Ideally something like this (pseudocode):

        set func=%~dp0%Stored Procedures\*.sql
        set branchDateTime=GetDateTime(%~dp0%.svn)  <- gets the datetime when the .svn folder was created
        for %%i in (%func%)
        {
            if(%%i.LastModifiedOrCreated > branchDateTime) do type "%%i" >> InstallScript.sql
        }

    Read the article

  • Ubuntu 12.04 External HD Issues

    - by Anthie Georgiadi
    I am experiencing one more problem with my Ubuntu 12.04. I have an external HD (500 GB) which is NTFS-formatted. I connected this HD to my Ubuntu 12.04 machine and copied some files. When I then connected the HD to a Windows 7 machine I could find the folders (they were visible) but I wasn't able to copy/cut/delete them. However, when I opened the folders I could handle the material (MP3s) in them. Does anybody know how I can fix this? How can I fully access and modify folders copied from Ubuntu from a Windows machine? Thanks in advance!

    Read the article

  • Why is MediaWiki auto-linking the word “files”

    - by dfrankow
    Our MediaWiki installation is auto-linking the word "files". So "Here are some files: a, b, c" would result in the word "files" being linked to http://ourhost/mediawiki/files. Why is that happening, and how do I make it stop? I can use the nowiki tag, but perhaps it does not surprise you that the word "files" appears often, and it is aggravating to use that tag all the time. Here is some info on our MediaWiki installation from Special:Version. Yes, it's old.
    Installed software:

        MediaWiki 1.16.5
        PHP 5.2.14-pl0-gentoo (apache2handler)
        MySQL 5.0.84

    Installed extensions (parser hooks):

        GoogleDocs4MW (Version 1.1) - Adds a tag for Google Docs' spreadsheets display - Jack Phoenix
        SyntaxHighlight (Version 1.0.8.6) - Provides syntax highlighting using GeSHi Highlighter - Brion Vibber, Tim Starling, Rob Church and Niklas Laxström
        WebServiceSequenceDiagram (Version 1.0) - Render inline sequence diagrams using websequencediagrams.com - Eddie Olsson

    Other:

        MWSearch - MWSearch plugin - Kate Turner and Brion Vibber

    Extension functions: efLucenePrefixSetup
    Parser extension tags: gallery, googlespreadsheet, html, nowiki, pre, sequencediagram, source and syntaxhighlight
    Parser function hooks: anchorencode, basepagename, basepagenamee, defaultsort, displaytitle, filepath, formatdate, formatnum, fullpagename, fullpagenamee, fullurl, fullurle, gender, grammar, int, language, lc, lcfirst, localurl, localurle, namespace, namespacee, ns, nse, numberingroup, numberofactiveusers, numberofadmins, numberofarticles, numberofedits, numberoffiles, numberofpages, numberofusers, numberofviews, padleft, padright, pagename, pagenamee, pagesincategory, pagesize, plural, protectionlevel, special, subjectpagename, subjectpagenamee, subjectspace, subjectspacee, subpagename, subpagenamee, tag, talkpagename, talkpagenamee, talkspace, talkspacee, uc, ucfirst and urlencode

    Read the article

  • Deleted files not increasing available free space on Ubuntu (as reported by df -h)

    - by Homunculus Reticulli
    I am writing scripts (Python and bash) to munge data and import large quantities of text files into a database. I am currently in the test phase, so I am generating several thousand files and deleting them (the files consume about 20 GB of space). After a test run, I delete the files (sometimes without having imported them into the database). I notice that there is a steady decrease in the amount of free space on my disk (as reported by df -h). I don't understand this, as I use rm * (in the data directory), and in the cases where I use Nautilus, I empty the Trash bin as well. Similarly, I notice that when I import the data into the (PostgreSQL) database and then delete the data from the tables using DELETE FROM tablename;, the size consumed in the PostgreSQL data directory does not go down either. Currently, I have lost approximately 200 GB from the hard drive, and I need to reclaim that space, but I don't know how. Any ideas? I am running Ubuntu 10.04 LTS + PostgreSQL 8.4.

    Read the article

  • XNA stopped compiling my model x files

    - by HuseyinUslu
    So I have a 3D game project I'm working on, and I'm using two model files (SkyBlock.x and AimedBlock.x). Until now everything was fine: my model files compiled okay and I was able to use them within my game. With the latest changes (I don't really know what caused it), XNA stopped compiling my model files and instead only outputs these files:

        AimedBlock.xnb - 1 KB
        SkyDome.xnb - 1 KB
        SkyDomeTexture.xnb - 1389 KB
        SkyDomeTexture_0.xnb - 419 KB

    So I created a test XNA game project, moved all my assets to the new solution's content project, tried compiling them, and saw that they're all good:

        AimedBlock.xnb - 2 KB
        SkyDome.xnb - 13 KB
        SkyDomeTexture.xnb - 4097 KB
        SkyDomeTexture_0.xnb - 683 KB

    So I guess the problem is in my main project, but I couldn't come up with a solution. I even tried overwriting my game's content project with the new game's content project (which was all okay), but it didn't work. Has anybody had similar issues?

    Read the article

  • chown select files only

    - by user114642
    I use the (excellent) unison to sync two file servers, and I've just realised I've synced a number of files without using the switch in unison that maintains file user ownership. These files now have an owner of root (because I have to run unison as root). Can I chown to a specified user, but only change the files that now have the owner root, and do so recursively in the directory in question? I'm sure I can, but I'm not sure of the arguments to "find files with owner 0 and change them to owner xxxx". Thanks for any help.
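
    For reference, a minimal sketch of the kind of command being asked about, assuming the directory is /data and the target user is xxxx (both placeholders):

        # Recursively find everything under /data still owned by root (uid 0)
        # and chown it to the specified user; adjust the path and user as needed.
        find /data -user root -exec chown xxxx {} +

    find's -user test matches by user name or numeric UID, and the {} + form batches many files into each chown invocation.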

    Read the article

  • Would md5 hashes allow detection of synced files?

    - by codpursue
    We have to develop our own file management system as a Java web application. We need to sync files between our main server and client servers and find out whether every client server has the latest version of the files. Our files are in PDF, DOC and XLS formats, and they change every now and then as required. What we are thinking is to use an MD5 checksum to compute a hash of each file on the main server and store it in the database; the same would be stored in each client server's database. After comparing the records in the databases we would know whether the client servers are in sync or not. Please suggest if there are better ways to do this.
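
    As a rough illustration of the checksum idea (a sketch, not the only way to do it), here is how a per-file MD5 hash could be computed in Java; the file path is just an example:

        import java.io.InputStream;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.security.MessageDigest;

        public class FileChecksum {

            // Streams the file through MessageDigest so large files never need to fit in memory.
            public static String md5Of(Path file) throws Exception {
                MessageDigest md = MessageDigest.getInstance("MD5");
                byte[] buffer = new byte[8192];
                try (InputStream in = Files.newInputStream(file)) {
                    int read;
                    while ((read = in.read(buffer)) != -1) {
                        md.update(buffer, 0, read);
                    }
                }
                StringBuilder hex = new StringBuilder();
                for (byte b : md.digest()) {
                    hex.append(String.format("%02x", b));
                }
                return hex.toString();
            }

            public static void main(String[] args) throws Exception {
                System.out.println(md5Of(Paths.get("docs/report.pdf"))); // example path
            }
        }

    Storing these hashes on the main server and on each client, then diffing the two sets, is enough to tell which files a client is missing or has out of date; MD5 is fine for change detection like this, though it shouldn't be relied on where deliberate tampering is a concern.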

    Read the article

  • Tomcat directly serve static (css, js) files shared by multiple applications

    - by Josvic Zammit
    I'm using the ExtJS framework, which has a bulk of JS and CSS files that are used for all apps. I intend to share these between a number of web applications (different WAR files). For this reason I would like to serve the ExtJS JS and CSS directly from the web server, in my case Tomcat 6, which can be used to serve static files, as in this helpful link. Therefore I put my files under /var/lib/tomcat6/webapps/ROOT/extjs/. The static files that are directly under that directory are served correctly, e.g. /extjs/ext.js correctly serves the file at /var/lib/tomcat6/webapps/ROOT/extjs/ext.js. However, files in lower-level directories, for example /extjs/welcome/css/welcome.css, which should serve the file at /var/lib/tomcat6/webapps/ROOT/extjs/welcome/css/welcome.css, return a 404. TL;DR: Tomcat serves static files only at the top-level directory; a 404 is returned for files deeper in the hierarchy. Config file contents: server.xml, application's web.xml.

    Read the article

  • Best Way for Developers to Upload Files to Production Server

    - by ultrajohn
    We are a small team of developers doing their work here and there. We have a team leader who is solely responsible for uploading updated source files from the development server to the production server. So, if an updated file needs to be uploaded to the prod server, the developer concerned notifies the team lead, and the team lead then uploads the files to the prod server. No developer has access to the prod server except for the team lead. That's our current setup. Now, what we want to do is to give developers a way to upload their updated files to the server without the team lead intervening in the process. What do you think is the best way to go about this?

    Read the article

  • Cloud just for hosting big files?

    - by yes123
    I need a solution for storing my big files (50 MB+ each). Currently I am using a European dedicated server (100 Mbit/s) with 8000 GB/month at 60 USD. I would like to use a cloud service that automatically fetches my files from my server the first time users request them (like a classic CDN), so I can keep all files stored on one server. I was looking at Amazon CloudFront and, to get the same 8000 GB/month of bandwidth, I would have to pay something like 2000 USD versus the 60 USD of my dedicated server. Is there a cheaper alternative?

    Read the article

  • C++ Without Source Files

    - by Snowman
    Bjarne Stroustrup mentions in his book "The C++ Programming Language, 4th Edition" that not all C++ implementations use files to store and compile code: There are systems that do not store, compile, and present C++ programs to the programmer as sets of files. (Chapter 15, page 419) Later in the chapter, he reiterates that certain implementations do not use files but he does not give any examples. How would such an environment function compared to a more common file-based environment?

    Read the article

  • Git ignore deleted files

    - by Petah
    OK, here's my situation. I have a website project that has more than 50,000 files (unimportant to development) in some directories:

        /website.com/files/1.txt
        /website.com/files/2.txt
        /website.com/files/3.txt
        /website.com/files/etc.txt

    The stuff in /files is already in the repo. I want to delete all the files in /files on my local copy, but I want git to ignore this so it doesn't delete them when I do a pull on the web server. Any ideas?

    Read the article
