Search Results

Search found 68296 results on 2732 pages for 'utl file'.


  • GitHub + keep file but don't track changes

    - by Mike
    I have a CodeIgniter application that's using GitHub. Within this application I have several files that I want in the repo but without tracking any changes on them. For example: I deploy a new installation of this framework to a new client, I want the following files to be downloaded (they have default values of 'CHANGEME'), and I just have to make changes specific to that client (database credentials, email address info, custom CSS styling).

    // the production config files: I want the files, but they need to be updated to the specific client's needs
    application/config/production/config.php
    application/config/production/database.php
    application/config/production/tank_auth.php
    // index page, defines the environment (production|development)
    /index.php
    // all of the css/js cache (keep the folder but not the contents)
    /assets/cache/*
    // production user-based styling (color, fonts etc.) needs to be updated to the specific client's needs
    /assets/frontend/css/user/frontend-user.css

    Currently, if I run git clone git@github.com:user123/myRepo.git httpdocs and then edit the files above, all is great... until I release a hotfix or patch and run git pull. All of my changes are then overwritten.
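    One commonly used pattern for this (not from the question above) is to commit template copies of the client-specific files, for example config.php.dist, add the real file names to .gitignore, and have a deploy step create the real files only if they are missing, so git pull never touches them; another common option is git update-index --skip-worktree <file>, which tells git to ignore local modifications to a tracked file. A rough sketch of the template approach in Python, assuming the hypothetical *.dist naming convention:

        import os
        import shutil

        # real configs are ignored by git; only the *.dist templates are tracked
        templates = [
            "application/config/production/config.php",
            "application/config/production/database.php",
            "application/config/production/tank_auth.php",
        ]

        for path in templates:
            if not os.path.exists(path):        # never overwrite client-specific edits
                shutil.copyfile(path + ".dist", path)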

    Read the article

  • How to create a batch file that deletes all the folders named `bin` or `obj` recursively?

    - by Nam Gi VU
    I need to delete all bin and obj folders under a folder on my PC, so I'm thinking of a batch file to do that, but I'm not familiar with batch files in Windows. Please help. [Edit] After discussion with user DMA57361, I got to the current solution (still having problems though, see our comments). Create a .bat file and paste in one of the commands below:

    start for /d /r . %%d in (bin,obj) do @if exist "%%d" rd /s/q "%%d"

    OR

    start for /d /r . %%d in (bin,obj) do @if exist "%%d" rd /s "%%d"

    @DMA57361: When I run your script, I get the below error. Any idea?
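    If a batch file turns out to be awkward, the same cleanup can also be done with a short script; here is a rough Python equivalent, shown only as an alternative sketch and assumed to be run from the folder you want to clean:

        import os
        import shutil

        for root, dirs, files in os.walk(".", topdown=True):
            for name in list(dirs):
                if name.lower() in ("bin", "obj"):
                    shutil.rmtree(os.path.join(root, name), ignore_errors=True)
                    dirs.remove(name)   # don't descend into the folder we just removed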

    Read the article

  • Maintaining file permissions across SVN updates?

    - by Mark Mayo
    I have a series of python scripts with execute permissions in Linux. They are stored in SVN. If I then run svn up to update them, the overwritten files are back to 644 - ie no execute permissions for anyone. Yes I could just script it to chmod +x * afterwards, but surely there's a way to store permissions in SVN or to maintain them when you update? Any suggestions appreciated.

    Read the article

  • File permissions question

    - by Camran
    If I need a php-page to have access to upload images and move images from a folder on the server, what permissions would I set on the folder then? I know we have "Owner", "Group", and "Others". But what does the server (or the php-code itself) count as? Owner? Also, would I need to set the php-page with the upload script with some specific permissions to be able to upload the files? Is it execute permissions? Thanks

    Read the article

  • Creating a file path in C#

    - by Jason
    So I'm trying to create a path in C#. I use Environment.MachineName and store it in a variable serverName. Then I create another string variable with some other path extension in it. Here is my code so far: string serverName = Environment.MachineName; string folderName = "\\AlarmLogger"; No matter what I do I can't seem to obtain only one backslash prior to AlarmLogger. Any ideas how I can specify a path in C#? Edit: I'm wondering if my code doesn't seem to want to paste correctly. Anyway, when I paste it I only see one backslash but my code has two, because of the escape character sequence. But something like string test = @"\" + serverName + folderName doesn't seem to want to work for me.
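    The usual fix for this kind of problem is to let a path-joining helper insert the separator instead of concatenating backslashes by hand (in .NET, Path.Combine plays that role). For comparison, a minimal sketch of the same idea in Python; the environment-variable fallback is only an illustration:

        import os

        # stand-in for Environment.MachineName (illustrative only)
        server_name = os.environ.get("COMPUTERNAME", "MYSERVER")

        # the helper inserts exactly one separator, so no escaping or backslash counting
        path = os.path.join(server_name, "AlarmLogger")
        print(path)   # e.g. MYSERVER\AlarmLogger on Windows, MYSERVER/AlarmLogger elsewhere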

    Read the article

  • An XML file or Database?

    - by webnoob
    I am re-writing a section of my site and am trying to decide how much of a rewrite this will be. At the moment I have a web service feed that generates an XML file once per day. I then use this XML file on my website to generate the general structure. I am trying to decide if this information should be located in the database or stay in the XML file. The file can range from 4 MB to 12 MB. The file's depth can go on and on, so I have to recurse to find the data I want. I use the .NET serializer classes and store the serialized file in a global variable to avoid re-serializing it each time the page is loaded. My reasons for thinking a database would be better are:

    I would know exactly where I am in the file by using an internal ID, so I wouldn't have to recurse the file to get information.
    I wouldn't have to load / serialize the XML and could just use my already open database connections.
    Searching for the data in the file would be quicker(?) as I would just perform an SQL query rather than recursing the file.

    Has anyone got any ideas as to which is better, and which option uses more resources on the server or would be quicker? EDIT: The file is read every time the web page is loaded (although only serialized once). It isn't written to by standard users (only by an admin task that runs in the middle of the night). This is my initial investigation before mocking up.
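    To illustrate the "find data by an internal ID instead of recursing the file each time" point, here is a rough sketch using Python's standard-library ElementTree; the element and attribute names are hypothetical:

        import xml.etree.ElementTree as ET

        tree = ET.parse("feed.xml")                  # the daily 4-12 MB feed

        # build the index once (e.g. at application start), then reuse it per request
        by_id = {node.get("id"): node for node in tree.iter() if node.get("id") is not None}

        def find(node_id):
            return by_id.get(node_id)                # O(1) lookup, no recursion per page load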

    Read the article

  • Copying files to Truecrypt file container hangs

    - by Wagner Maestrelli
    I have a dual-boot installation with Windows 7 Ultimate (32-bit, NTFS file system) and Ubuntu 10.10 (32-bit, ext4 file system). I have installed version 7.0a of TrueCrypt in both operating systems. Located on the Windows 7 HDD I have a 150 GB encrypted file container. It is a standard and dynamic file container, which means it's not hidden and uses a sparse file. This file was created using the Windows version of the TrueCrypt program. When I log on in Windows the container is mounted as the drive E: and everything works fine! In Ubuntu the Windows NTFS file system is automatically mounted after I log on. I've configured that using the ntfs-config package. In my ~/.profile I have this line to mount the TrueCrypt file container: truecrypt /media/7EDEBCFADEBCABB1/Users/Wagner/hd/hd.tc /media/truecrypt1 The file container is mounted after logon without any problem. I can access it, copy files to/from it, etc. But when I try to copy relatively large amounts of data (~50 MB) to it via nautilus or cp -R, it starts the copy, copies some data up to a certain point, and then it just hangs! The progress bar does not move anymore and nothing happens. There is no error; it just hangs and that's it. I have to kill the process myself. This problem does not happen in Windows: I can copy very large amounts of data to the container and it works great. But in Ubuntu the problem always happens! I mean, whenever I try to copy a bunch of files together, the copy process hangs. Has anyone ever faced this problem? Can anyone help? Thanks!

    Read the article

  • What is a widely accepted term for a string variable that would probably contain a file path and file name?

    - by Peter Turner
    For functions that need to index files in a directory and rename them FileName0001, FileName0002, etc., I often need to write a function that splits the file name from the file path and renames the file. When I put the file name and file path back together, I don't have a very good name for the variable that contains both of them, and I usually just wind up concatenating them every time I want to use them (usually as parameters for functions labeled either filename or filepath), so I never really know what I'm doing until I notice a lot of files being written in the same directory as my binaries. Anyway, what do I call a file name plus a file path? I don't want to call it File, because that usually means the binary information behind the file. I don't want to call it URI because that usually means I've got some sort of protocol, which I don't. I just want a good way to denote "c:\somedir\somedir\somedir\somefile.txt" so as to deconfuse this mess I've just realized I'm in. Please don't just list your personal preference. I think an excellent answer should "cite its sources" (as in, provide a link to a repository with a good example of the code being used as I described).

    Read the article

  • How do I prevent Excel from locking files by default?

    - by Andrzej Doyle
    When I double-click on a CSV file on a network share, the file is opened in Microsoft Excel (which is what I want). However, Excel assumes that I'm going to modify the file, and that everyone else is too, and so puts a lock on it. In practice I very rarely actually want to modify these files, merely read from them. And if I have the file open in an Excel window in the background, it stops anyone else from opening the same file. I am aware that I can manually open a file as read-only from the File - Open dialog within Excel. However I almost always open files by double-clicking on them in Explorer (or Outlook, for attachments). Is it possible to change the file association so that the default handler for CSV files is "Excel in read-only mode"? Is there a command-line argument that I can use in the Open With... dialog to achieve this? Or more bluntly - when I am looking at a CSV file in Windows Explorer, is there an easier way to open it read-only than starting up Excel myself, selecting File - Open, choosing "read only" from the dropdown, manually navigating to the same folder in the hierarchy, and then opening the file? (I am happy to have to jump through hoops on the rare occasions that I want to modify and save a file.)

    Read the article

  • SQL SERVER – Windows File/Folder and Share Permissions – Notes from the Field #029

    - by Pinal Dave
    [Note from Pinal]: This is the 29th episode of the Notes from the Field series. Security is a task we should give to the experts. If there is a small oversight or misstep, there is a good chance that the security of the organization is compromised. This is very true, but there are always devil's advocates who believe everyone should know security. As a DBA and administrator, I often see people not taking an interest in Windows security, hiding behind the reason of not being an expert in Windows Server. We all often miss the important mission statement for the success of any organization: teamwork. In this blog post Brian tells the story in very interesting, lucid language. Read on! In this episode of the Notes from the Field series, database expert Brian Kelley explains a very crucial issue DBAs and developers face on their production servers. Linchpin People are database coaches and wellness experts for a data-driven world. Read the experience of Brian in his own words.

    When I talk security among database professionals, I find that most have at least a working knowledge of how to apply security within a database. When I talk with DBAs in particular, I find that most have at least a working knowledge of security at the server level, if we're speaking of SQL Server. One area I continually see as weak is Windows file/folder (NTFS) and share permissions. The typical response is, "I'm a database developer and the Windows system administrator is responsible for that." That may very well be true: the system administrator may have the primary responsibility and accountability for file/folder and share security for the server. However, if you're involved in the typical activities surrounding databases and moving data around, you should know these permissions, too. Otherwise, you could be setting yourself up where someone is able to get to data he or she shouldn't, or you could be opening the door where human error puts bad data in your production system.

    File/Folder Permission Basics: I wrote about file/folder permissions a few years ago to give the basic permissions that are most often seen. Here's what you must know as a minimum at the file/folder level:
    Read - Allows you to read the contents of the file or folder. Having read permissions allows you to copy the file or folder.
    Write - Again, as the name implies, it allows you to write to the file or folder. This doesn't include the ability to delete; however, nothing stops a person with this access from writing an empty file.
    Delete - Allows the file/folder to be deleted. If you overwrite files, you may need this permission.
    Modify - Allows read, write, and delete.
    Full Control - Same as Modify plus the ability to assign permissions.
    File/folder permissions aggregate, unless there is a DENY (where it trumps, just like within SQL Server), meaning if a person is in one group that gives Read and another group that gives Write, that person has both Read and Write permissions. As you might expect me to say, always apply the Principle of Least Privilege. This likely means that any additional permission you might add does not need Full Control.

    Share Permission Basics: At the share level, here are the permissions:
    Read - Allows you to read the contents on the share.
    Change - Allows you to read, write, and delete contents on the share.
    Full Control - Change plus the ability to modify permissions.
    Like with file/folder permissions, these permissions aggregate, and DENY trumps.

    So What Access Does a Person / Process Have? Figuring out what someone or some process has depends on how the location is being accessed:
    Access comes through the share (\\ServerName\Share) - a combination of permissions is considered.
    Access is through a drive letter (C:\, E:\, S:\, etc.) - only the file/folder permissions are considered.
    The only complicated one here is access through the share. Here's what Windows does:
    Figures out what the aggregated permissions are at the file/folder level.
    Figures out what the aggregated permissions are at the share level.
    Takes the most restrictive of the two sets of permissions.
    You can test this by granting Full Control over a folder (this is likely already in place for the Users local group) and then setting up a share. Give only Read access through the share, and that includes to Administrators (if you're creating a share, you likely have membership in the Administrators group). Try to read a file through the share. Now try to modify it. The most restrictive permission is the share-level permission, which is set to only allow Read. Therefore, if you come through the share, it's the most restrictive.

    Does This Knowledge Really Help Me? In my experience, it does. I've seen cases where sensitive files were accessible by every authenticated user through a share. Auditors, as you might expect, have a real problem with that. I've also seen cases where files to be imported as part of the nightly processing were overwritten by files intended for development. And I've seen cases where a process can't get to the files it needs because someone changed the permissions. If you know file/folder and share permissions, you can spot and correct these types of security flaws. Given that there are a lot of database professionals that don't understand these permissions, if you know them, you set yourself apart. And if you're able to help on critical processes, you begin to set yourself up as a linchpin (link to .pdf) for your organization. If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server. Reference: Pinal Dave (http://blog.sqlauthority.com). Filed under: Notes from the Field, PostADay, SQL, SQL Authority, SQL Query, SQL Security, SQL Server, SQL Tips and Tricks, T SQL
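    As a small illustration (not from the original post) of the rule above, the effective access through a share can be modelled as the intersection of the aggregated NTFS permissions and the aggregated share permissions, with DENY removed first:

        def effective_share_access(ntfs_allow, ntfs_deny, share_allow, share_deny):
            # aggregate each side, let DENY trump, then take the most restrictive of the two
            ntfs = set(ntfs_allow) - set(ntfs_deny)
            share = set(share_allow) - set(share_deny)
            return ntfs & share

        # Full Control on the folder but only Read on the share -> Read through the share
        print(effective_share_access({"Read", "Write", "Delete"}, set(), {"Read"}, set()))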

    Read the article

  • Using the HTML5 <input type="file" multiple="multiple"> Tag in ASP.NET

    - by Rick Strahl
    Per HTML5 spec the <input type="file" /> tag allows for multiple files to be picked from a single File upload button. This is actually a very subtle change that's very useful as it makes it much easier to send multiple files to the server without using complex uploader controls. Please understand though, that even though you can send multiple files using the <input type="file" /> tag, the process of how those files are sent hasn't really changed - there's still no progress information or other hooks that allow you to automatically make for a nicer upload experience without additional libraries or code. For that you will still need some sort of library (I'll post an example in my next blog post using plUpload). All the new features allow for is to make it easier to select multiple images from disk in one operation. Where you might have required many file upload controls before to upload several files, one File control can potentially do the job. How it works To create a file input box that allows with multiple file support you can simply do:<form method="post" enctype="multipart/form-data"> <label>Upload Images:</label> <input type="file" multiple="multiple" name="File1" id="File1" accept="image/*" /> <hr /> <input type="submit" id="btnUpload" value="Upload Images" /> </form> Now when the file open dialog pops up - depending on the browser and whether the browser supports it - you can pick multiple files. Here I'm using Firefox using the thumbnail preview I can easily pick images to upload on a form: Note that I can select multiple images in the dialog all of which get stored in the file textbox. The UI for this can be different in some browsers. For example Chrome displays 3 files selected as text next to the Browse… button when I choose three rather than showing any files in the textbox. Most other browsers display the standard file input box and display the multiple filenames as a comma delimited list in the textbox. Note that you can also specify the accept attribute in the <input> tag, which specifies a mime-type to specify what type of content to allow.Here I'm only allowing images (image/*) and the browser complies by just showing me image files to display. Likewise I could use text/* for all text formats registered on the machine or text/xml to only show XML files (which would include xml,xst,xsd etc.). Capturing Files on the Server with ASP.NET When you upload files to an ASP.NET server there are a couple of things to be aware of. When multiple files are uploaded from a single file control, they are assigned the same name. In other words if I select 3 files to upload on the File1 control shown above I get three file form variables named File1. This means I can't easily retrieve files by their name:HttpPostedFileBase file = Request.Files["File1"]; because there will be multiple files for a given name. The above only selects the first file. Instead you can only reliably retrieve files by their index. Below is an example I use in app to capture a number of images uploaded and store them into a database using a business object and EF 4.2.for (int i = 0; i < Request.Files.Count; i++) { HttpPostedFileBase file = Request.Files[i]; if (file.ContentLength == 0) continue; if (file.ContentLength > App.Configuration.MaxImageUploadSize) { ErrorDisplay.ShowError("File " + file.FileName + " is too large. 
Max upload size is: " + App.Configuration.MaxImageUploadSize); return View("UploadClassic",model); } var image = new ClassifiedsBusiness.Image(); var ms = new MemoryStream(16498); file.InputStream.CopyTo(ms); image.Entered = DateTime.Now; image.EntryId = model.Entry.Id; image.ContentType = "image/jpeg"; image.ImageData = ms.ToArray(); ms.Seek(0, SeekOrigin.Begin); // resize image if necessary and turn into jpeg Bitmap bmp = Imaging.ResizeImage(ms.ToArray(), App.Configuration.MaxImageWidth, App.Configuration.MaxImageHeight); ms.Close(); ms = new MemoryStream(); bmp.Save(ms,ImageFormat.Jpeg); image.ImageData = ms.ToArray(); bmp.Dispose(); ms.Close(); model.Entry.Images.Add(image); } This works great and also allows you to capture input from multiple input controls if you are dealing with browsers that don't support multiple file selections in the file upload control. The important thing here is that I iterate over the files by index, rather than using a foreach loop over the Request.Files collection. The files collection returns key name strings, rather than the actual files (who thought that was good idea at Microsoft?), and so that isn't going to work since you end up getting multiple keys with the same name. Instead a plain for loop has to be used to loop over all files. Another Option in ASP.NET MVC If you're using ASP.NET MVC you can use the code above as well, but you have yet another option to capture multiple uploaded files by using a parameter for your post action method.public ActionResult Save(HttpPostedFileBase[] file1) { foreach (var file in file1) { if (file.ContentLength < 0) continue; // do something with the file }} Note that in order for this to work you have to specify each posted file variable individually in the parameter list. This works great if you have a single file upload to deal with. You can also pass this in addition to your main model to separate out a ViewModel and a set of uploaded files:public ActionResult Edit(EntryViewModel model,HttpPostedFileBase[] uploadedFile) You can also make the uploaded files part of the ViewModel itself - just make sure you use the appropriate naming for the variable name in the HTML document (since there's Html.FileFor() extension). Browser Support You knew this was coming, right? The feature is really nice, but unfortunately not supported universally yet. Once again Internet Explorer is the problem: No shipping version of Internet Explorer supports multiple file uploads. IE10 supposedly will, but even IE9 does not. All other major browsers - Chrome, Firefox, Safari and Opera - support multi-file uploads in their latest versions. So how can you handle this? If you need to provide multiple file uploads you can simply add multiple file selection boxes and let people either select multiple files with a single upload file box or use multiples. Alternately you can do some browser detection and if IE is used simply show the extra file upload boxes. It's not ideal, but either one of these approaches makes life easier for folks that use a decent browser and leaves you with a functional interface for those that don't. Here's a UI I recently built as an alternate uploader with multiple file upload buttons: I say this is my 'alternate' uploader - for my primary uploader I continue to use an add-in solution. Specifically I use plUpload and I'll discuss how that's implemented in my next post. 
    Although I think that plUpload (and many of the other packaged JavaScript upload solutions) are a better choice especially for large uploads, for simple one file uploads input boxes work well enough. The advantage of this solution is that it's very easy to handle on the server side. Any of the JavaScript controls require special handling for uploads which I'll also discuss in my next post. © Rick Strahl, West Wind Technologies, 2005-2012. Posted in HTML5, ASP.NET, MVC

    Read the article

  • Making file in user's homedir accessible from web/webserver

    - by evident
    Hi everybody, I have a txt file in one of my users' home directories, which is regularly updated there by a script. I now want to be able to access (read) this file from the web: /home/user/folder/file.txt So what I tried is to log in as root, go into my web server's httpdocs folder /var/www/path/to/domain/httpdocs and there create a symbolic link with ln -s /home/user/folder/file.txt /var/www/path/to/domain/httpdocs/file.txt But this didn't work... I already tried changing the chmod of the symlink (which changes the one on the original file, of course) and also a chown to the web server's user, but no matter what I tried I cannot open the file from the web or from a PHP script (which is what I want to do). Can anybody help me and tell me what I need to do? What rights do I need to give? Or is there another way of achieving this?

    Read the article

  • Uploading a file automatically for speed test?

    - by Abhi
    I am building a Web UI for a device for internet connection and one of the requirements in it is a speed test. I know the basic concept of how speed test works. A file is downloaded for a limited time then the same file is uploaded again and the speed is tracked at regular intervals. Downloading the file is not an issue, but how am I supposed to upload the file without the client knowing that the file is getting uploaded? I've read through a lot of documentation, but I'm still not able to get the answer to how I will upload the file from clients machine without asking him to select the file.
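    One common approach (an assumption here, since the question doesn't settle on one) is that the client never selects a file at all: the page generates a payload in memory, or reuses the bytes it just downloaded, and posts it to a measurement endpoint, typically with XMLHttpRequest/fetch in the browser. A rough illustration of the idea in Python, with a placeholder endpoint URL:

        import os
        import time
        import urllib.request

        def upload_speed_mbps(url="http://192.168.1.1/speedtest/upload", size_mb=5):
            payload = os.urandom(size_mb * 1024 * 1024)      # generated data, no user file involved
            req = urllib.request.Request(url, data=payload, method="POST")
            start = time.monotonic()
            urllib.request.urlopen(req).read()
            elapsed = time.monotonic() - start
            return (size_mb * 8) / elapsed                   # megabits per second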

    Read the article

  • Edit 100MB+ file

    - by Majid Fouladpour
    I have captured some traffic with Wireshark and saved the result as a file. The file has 3 sections now: the request headers, the response headers, and the response body. The response body is meant to become an FLV file, but right now everything is saved as a single file. So I need a way to delete the first two sections from the file, but the problem is that the file is very big (over a thousand megabytes). I have tried to open it with gedit, but no matter how long I wait, gedit hangs and remains unresponsive until I kill it. What tool can I use to edit this big file easily?
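    One scriptable option (an assumption, since the post doesn't name a tool): FLV data starts with the three signature bytes 'FLV', so a short script can find that signature and stream everything from it into a new file without loading the whole capture into an editor. A rough Python sketch with placeholder file names:

        # look for the FLV signature near the start of the capture, then stream-copy the rest
        CHUNK = 1024 * 1024

        with open("capture.raw", "rb") as src:
            head = src.read(8 * CHUNK)               # the HTTP headers are small, so 8 MB is plenty
            offset = head.find(b"FLV")               # FLV files begin with the bytes 'FLV'
            if offset == -1:
                raise SystemExit("FLV signature not found in the first 8 MB")
            with open("video.flv", "wb") as dst:
                dst.write(head[offset:])
                while True:
                    chunk = src.read(CHUNK)
                    if not chunk:
                        break
                    dst.write(chunk)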

    Read the article

  • File Upload with HttpWebRequest doesn't post the file

    - by Sri Kumar
    Hello All, Here is my code to post the file. I use asp fileupload control to get the file stream. HttpWebRequest requestToSender = (HttpWebRequest)WebRequest.Create("http://localhost:2518/Web/CrossPage.aspx"); requestToSender.Method = "POST"; requestToSender.ContentType = "multipart/form-data"; requestToSender.KeepAlive = true; requestToSender.Credentials = System.Net.CredentialCache.DefaultCredentials; requestToSender.ContentLength = BtnUpload.PostedFile.ContentLength; BinaryReader binaryReader = new BinaryReader(BtnUpload.PostedFile.InputStream); byte[] binData = binaryReader.ReadBytes(BtnUpload.PostedFile.ContentLength); Stream requestStream = requestToSender.GetRequestStream(); requestStream.Write(binData, 0, binData.Length); requestStream.Close(); HttpWebResponse responseFromSender = (HttpWebResponse)requestToSender.GetResponse(); string fromSender = string.Empty; using (StreamReader responseReader = new StreamReader(responseFromSender.GetResponseStream())) { fromSender = responseReader.ReadToEnd(); } XMLString.Text = fromSender; In the page load of CrossPage.aspx i have the following code NameValueCollection postPageCollection = Request.Form; foreach (string name in postPageCollection.AllKeys) { Response.Write(name + " " + postPageCollection[name]); } HttpFileCollection postCollection = Request.Files; foreach (string name in postCollection.AllKeys) { HttpPostedFile aFile = postCollection[name]; aFile.SaveAs(Server.MapPath(".") + "/" + Path.GetFileName(aFile.FileName)); } string strxml = "sample"; Response.Clear(); Response.Write(strxml); I don't get the file in Request.Files. The byte array is created. What was wrong with my HttpWebRequest?
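    A likely culprit (an assumption, not confirmed in the post) is that Request.Files is only populated from a well-formed multipart/form-data body, i.e. one with a boundary in the Content-Type header and per-part headers around the file bytes, which writing the raw file stream alone does not produce. For comparison, a minimal multipart POST in Python (requests is a third-party library; the field and file names are placeholders):

        import requests  # third-party library, used here just to show a well-formed request

        url = "http://localhost:2518/Web/CrossPage.aspx"
        with open("upload.bin", "rb") as f:
            # requests generates the boundary and the per-part Content-Disposition headers
            response = requests.post(url, files={"file1": ("upload.bin", f, "application/octet-stream")})
        print(response.text)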

    Read the article

  • Trouble converting an MP3 file to a WAV file using NAudio

    - by WebDevHobo
    Naudio Library: http://naudio.codeplex.com/ I'm trying to convert an MP3 file to a WAV file, but I've run in to a small error. I know what's going wrong, but I don't really know how to go about fixing it. Here's the piece of code I'm running: private void button1_Click(object sender, EventArgs e) { using(Mp3FileReader reader = new Mp3FileReader(@"path\to\MP3")) { using(WaveFileWriter writer = new WaveFileWriter(@"C:\test.wav", new WaveFormat())) { int counter = 0; while(reader.Read(test, counter, test.Length + counter) != 0) { writer.WriteData(test, counter, test.Length + counter); counter += 512; } } } } reader.Read() goes into the Mp3FileReader class, and the method looks like this: public override int Read(byte[] sampleBuffer, int offset, int numBytes) { if (numBytes % waveFormat.BlockAlign != 0) //throw new ApplicationException("Must read complete blocks"); numBytes -= (numBytes % waveFormat.BlockAlign); return mp3Stream.Read(sampleBuffer, offset, numBytes); } mp3Stream is an object of the Stream class. The problem is: I'm getting an ArgumentException. MSDN says that this is because the sum of offset and numBytes is greater than the length of sampleBuffer. Documentation: http://msdn.microsoft.com/en-us/library/system.io.stream.read.aspx This happens because I increase the counter every time, but the size of the byte array test remains the same. What I've been wondering is: do I need to increase the size of the array dynamically, or do I need to find out the needed size at the beginning and set it right away? And also, instead of 512, the method in Mp3FileReader returns 365 the first time. Which is the size of a whole block. But I'm writing the full 512. I'm basically just using the read to check if I'm not at the end of the file yet. Do I need to catch the return value and do something with that, or am I good here?

    Read the article

  • IE7 issue - cannot download streamed file when Automatic prompting for file downloads is disabled

    - by Jai ganesh K
    Hi, my application is J2EE (JSP/Servlet) based. I encounter an issue when I try to open a new window (pop-up) from a JSP and call a servlet action (e.g. Streamer.do) which streams a PDF file inside that pop-up. Problem: while IE 7 - Tools - Internet Options - Security - Custom Level - Downloads - "Automatic prompting for file downloads" is Disabled and the pop-up window opens, I am unable to download the file (the Save/Open prompt does not come up). In contrast, when I enable this option, I am able to download. But this option is sometimes disabled in some environments. When testing this in Mozilla Firefox 3.0/3.5/IE6 it works fine without any settings change. When I set the option to enabled, I then get the Save/Open prompt to work correctly. This seems to be a problem with IE7. Can anybody help us with JavaScript or any working settings that don't depend on whether the "Automatic prompting for file downloads" option in IE7 is enabled? Any help in this would be much appreciated. Regards! Jai

    Read the article

  • Pentaho Reporting Tool - does the .prpt file (report template file) contain datasource information?

    - by Yatendra Goel
    I am new to the Pentaho Reporting Tool. I have the following question: when I created a report using Pentaho Report Designer, it output a report file having the .prpt extension. After that I found an example on the internet where the following code was used to display the report in HTML format:

        protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
            ResourceManager manager = new ResourceManager();
            manager.registerDefaults();
            String reportPath = "file:" + this.getServletContext().getRealPath("sampleReport.prpt");
            try {
                Resource res = manager.createDirectly(new URL(reportPath), MasterReport.class);
                MasterReport report = (MasterReport) res.getResource();
                HtmlReportUtil.createStreamHTML(report, response.getOutputStream());
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

    And the report got printed successfully. Since we haven't specified any datasource information here, I think the .prpt file contains that information in it. If that's true, then isn't Jasper a better reporting tool than Pentaho? Because when we display Jasper reports, we also have to provide datasource details, so in that way our report is flexible and is not bound to any particular database.

    Read the article

  • Django/jQuery - read file and pass to browser as file download prompt

    - by danspants
    I've previously asked a question regarding passing files to the browser so a user receives a download prompt. However, those files were really just strings created at the end of a function, and it was simple to pass them to an iframe's src attribute for the desired effect. Now I have a more ambitious requirement: I need to pass pre-existing files of any format to the browser. I have attempted this using the following code: def return_file(request): try: bob=open(urllib.unquote(request.POST["file"]),"rb") response=HttpResponse(content=bob,mimetype="application/x-unknown") response["Content-Disposition"] = "attachment; filename=nothing.xls" return HttpResponse(response) except: return HttpResponse(sys.exc_info()) With my original setup the following jQuery was sufficient to give the desired download prompt: jQuery('#download').attr("src","/return_file/"); However, this won't work anymore as I need to pass POST values to the function. My attempt to rectify that is below, but instead of a download prompt I get the file displayed as text. jQuery.get("/return_file/",{"file":"c:/filename.xls"},function(data) { jQuery(thisButton).children("iframe").attr("src",data); }); Any ideas as to where I'm going wrong? Thanks!
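    On the Django side, the usual way to force a download prompt for an existing file is to stream it with a Content-Disposition: attachment header. A minimal sketch, assuming Django 2.1+ for FileResponse's as_attachment flag, and assuming the file path arrives as a GET parameter so the view's URL can be used directly as an iframe src:

        import os
        from django.http import FileResponse, HttpResponseBadRequest

        def return_file(request):
            # assumes the path arrives as ?file=...; real code must validate the path
            # against an allowed directory before opening it
            path = request.GET.get("file", "")
            if not os.path.isfile(path):
                return HttpResponseBadRequest("no such file")
            # FileResponse sets Content-Disposition: attachment, which triggers the download prompt
            return FileResponse(open(path, "rb"), as_attachment=True,
                                filename=os.path.basename(path))

    Setting the iframe's src to /return_file/?file=... then shows the prompt, whereas passing the response body back into the page (as in the jQuery.get callback above) just renders the bytes as text.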

    Read the article

  • Need a way to determine if a file is done being written to.

    - by Khorkrak
    The situation I'm in is this: there's a process that's writing to a file, and sometimes the file is rather large, say 400-500 MB. I need to know when it's done writing. How can I determine this? If I look in the directory I'll see the file there, but it might not be done being written. Plus, this needs to be done remotely - as in on the same internal LAN but not on the same computer - and typically the process that wants to know when the file writing is done is running on a Linux box, while the process that's writing the file, and the file itself, are on a Windows box. No, Samba isn't an option. XML-RPC communication to a service on that Windows box is an option, as is using SNMP to check, if that's viable. Ideally: works on either Linux or Windows - meaning the solution is OS independent - and works for any type of file. Good enough: works just on Windows but can be done through some library or whatever that can be accessed with Python, or works only for PDF files. The current best idea is to periodically open the file in question from some process on the Windows box and look at the last bytes, checking for the PDF end tag and accounting for the EOL differences, because the file may have been created on Linux or Windows.
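    The "look at the last bytes for the PDF end tag" idea described above is small to implement; a rough sketch in Python:

        def pdf_looks_complete(path, tail_bytes=1024):
            # a finished PDF ends with an %%EOF marker, usually within the last few bytes
            with open(path, "rb") as f:
                f.seek(0, 2)                      # jump to the end of the file
                size = f.tell()
                f.seek(max(0, size - tail_bytes))
                return b"%%EOF" in f.read()

    A cruder but format-independent alternative is to poll the file size and treat the file as finished once the size stops changing for some interval.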

    Read the article

  • Deploy an hta file in an exe or msi file

    - by rjmunro
    I've written an HTA file frontend for our web app that allows the web app to run without web browser status bars etc, and allows it to access the local system for certain tasks. I need a way to deploy this to customers. I need an installer to supply an hta file, an ico file, and add a link to them in the start menu and on the users desktop. I looked at building an installer with NSIS, but I couldn't figure out how to assign the icon to the shortcuts - The icon had to be a standard HTA one. Can this be done with NSIS, or should I be using another installer? P.s. I've got no particular preference for NSIS, it's just something I once used a very long time ago. When I download stuff, I think I prefer them to be msi files that launch with windows installer (it feels more like downloading a .rpm or .deb on linux which I am used to) but I know nothing about how those are created. I'm a web/linux guy who knows very little about windows programming.

    Read the article

  • Reading a binary file in Perl: Bad File Descriptor

    - by Magicked
    I'm trying to read a binary file 40 bytes at a time, then check to see if all those bytes are 0x00, and if so ignore them. If not, it will write them back out to another file (basically just cutting out large blocks of null bytes). This may not be the most efficient way to do this, but I'm not worried about that. However, right now I'm getting a "Bad File Descriptor" error and I cannot figure out why. my $comp = "\x00" * 40; my $byte_count = 0; my $infile = "/home/magicked/image1"; my $outfile = "/home/magicked/image1_short"; open IN, "<$infile"; open OUT, ">$outfile"; binmode IN; binmode OUT; my ($buf, $data, $n); while (read (IN, $buf, 40)) { ### Problem is here ### $boo = 1; for ($i = 0; $i < 40; $i++) { if ($comp[$i] != $buf[$i]) { $i = 40; print OUT $buf; $byte_count += 40; } } } die "Problems! $!\n" if $!; close OUT; close IN; I marked with a comment where it is breaking. Thanks for any help!

    Read the article

  • Saving the temporary file name after uploading a file with Uploadify

    - by mIRU
    I have this script:

        if (!empty($_FILES)) {
            $tempFile = $_FILES['Filedata']['tmp_name'];
            $targetPath = dirname(__FILE__).$_POST['folder'] . '/';
            $pathinfoFile = pathinfo($_FILES['Filedata']['name']);
            $targetFile = str_replace('//','/',$targetPath) .uniqid().'.'.$pathinfoFile['extension'];
            move_uploaded_file($tempFile,$targetFile);
        }

    This script is from Uploadify, modified to save the file with a unique name. After the user uploads the file, I need to store this unique name and the original name temporarily; when the user submits the form, I will insert these temporary variables into the database. The problem is that in the Firebug console I cannot see all the actions of this script, and I cannot understand how to fix it. I tried to save the values in $_SESSION, but they are not being saved. I found out why it is not working with $_SESSION (http://uploadify.com/forum/viewtopic.php?f=5&t=43) and tried the solution from that forum, but without result. Is there an easier way to solve this problem, or what is a better way to do it? Sorry for such a silly question, I ran out of ideas. Thanks a lot for helping me, and sorry again for my English.

    Read the article

  • Java - Reading a csv file line by line - stuck with weird non-existent characters being read!

    - by rockit
    Hello fellow Java developers. I'm having a very strange issue. I'm trying to read a CSV file line by line. I'm at the point where I'm just testing out the reading of the lines. Only each time that I read a line, the line contains square characters between each character of text. I even saved the file as a txt file in WordPad and Notepad with no change. Thus I must be doing something stupid... I have a CSV file, a standard CSV file, yes, a text file with commas in it. I try to read a line of text, but the text is all messed up and I cannot find the phrase within the text. Any advice? Code below.

        //open csv
        File filReadMe = new File(strRoot + "data2.csv");
        BufferedReader brReadMe = new BufferedReader(new InputStreamReader(new FileInputStream(filReadMe)));
        String strLine = brReadMe.readLine();
        //for all lines
        while (strLine != null){
            //if line contains "(see also"
            if (strLine.toLowerCase().contains("(see also")){
                //write line from "(see also" to ")"
                int iBegin = strLine.toLowerCase().indexOf("(see also");
                String strTemp = strLine.substring(iBegin);
                int iLittleEnd = strTemp.indexOf(")");
                System.out.println(strLine.substring(iBegin, iBegin + iLittleEnd));
            }
            //update line
            strLine = brReadMe.readLine();
        }
        //end for
        brReadMe.close();
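    One thing worth checking (an assumption, not something established in the thread) is whether the "square" characters are interleaved NUL bytes, i.e. whether the CSV was saved as UTF-16 and is being read with the platform default charset; dumping the first raw bytes makes that obvious. A quick diagnostic sketch in Python:

        with open("data2.csv", "rb") as f:
            head = f.read(32)

        print(head)   # lots of b'\x00' between letters (or a BOM) points to UTF-16
        if head[:2] in (b"\xff\xfe", b"\xfe\xff"):
            print("file starts with a UTF-16 byte-order mark")

    If that turns out to be the case, the fix on the Java side is to pass the correct charset to the InputStreamReader constructor.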

    Read the article
