Search Results

Search found 5304 results on 213 pages for 'audio streaming'.

  • MPMoviePlayerController - streaming works on 3GS, not on anything pre-3GS

    - by Canada Dev
    I am having some serious issues and annoyances with MPMoviePlayerController. In my app you can watch trailers for some movies in .mov format. I have tested with a friend and had users report that it does not work on their devices, which are all 3G. I have tested on my own 3GS and playback works fine. I have also tried on a 1st-gen iPhone and it doesn't work. So I am led to believe it's a memory issue, and that it's simply stopping the playback and returning to the previous screen. Below is the code I use to launch the player, which is straight out of the MoviePlayer example from Apple:

        MPMoviePlayerController *mp = [[MPMoviePlayerController alloc]
            initWithContentURL:[NSURL URLWithString:trailerURL]];
        if (mp) {
            self.moviePlayer = mp;
            [mp release];
            [self.moviePlayer play];
        }

    I have tried to check the NSError from the notifications, but the only thing I get is "An unknown playback error occurred" for both the localizedDescription and localizedRecoverySuggestion, making it impossible to figure out exactly why it's not working. I have seen many examples of people who have issues with the movie player, but it's starting to annoy me that it sometimes seems to work fine and other times it just doesn't (again, appearing like a memory issue). Thanks for any help/feedback provided.
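    One thing that may help narrow it down: the finish notification carries a reason code that at least separates a playback error from a user exit. A minimal sketch, assuming iOS 3.2+ where the reason key is available (playbackDidFinish: is a hypothetical handler name):

        // Register somewhere after creating the player.
        - (void)observePlayer {
            [[NSNotificationCenter defaultCenter] addObserver:self
                selector:@selector(playbackDidFinish:)
                name:MPMoviePlayerPlaybackDidFinishNotification
                object:self.moviePlayer];
        }

        - (void)playbackDidFinish:(NSNotification *)note {
            NSNumber *reason = [[note userInfo]
                objectForKey:MPMoviePlayerPlaybackDidFinishReasonUserInfoKey];
            if (reason != nil && [reason intValue] == MPMovieFinishReasonPlaybackError) {
                // A genuine decode/streaming failure, not the user tapping Done.
                NSLog(@"Playback ended with an error: %@", [note userInfo]);
            }
        }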

  • Is it possible to have the AVFoundation and AudioToolbox frameworks in one app?

    - by Satyam
    I'm trying to develop an audio-related application. In it, I'm using the AudioToolbox framework for recording sound, and AVFoundation to play sounds. When the app is started, it plays an mp3 file using AVFoundation and also initializes AudioToolbox. In the simulator I'm able to record and play, but when testing on an iPhone I get the following error while initializing AudioToolbox:

        2009-12-11 22:25:51.599 StoryBook[807:207] AudioRecorder init AudioSessionInitialize failed with error: 1768843636

    Can someone tell me whether we can use both the AVFoundation and AudioToolbox frameworks in one application? Why am I getting that error?
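    Core Audio status codes are often four-character codes in disguise, so decoding the number is usually the first step. A quick sketch; 1768843636 unpacks to 'init', which is kAudioSessionAlreadyInitialized - suggesting AudioSessionInitialize is being called a second time, rather than a conflict between the two frameworks:

        #include <stdio.h>

        int main(void) {
            unsigned int err = 1768843636;
            // Unpack the OSStatus into its four ASCII bytes, high byte first.
            char code[5] = {
                (char)(err >> 24), (char)(err >> 16),
                (char)(err >> 8),  (char)err, '\0'
            };
            printf("%s\n", code); /* prints "init" */
            return 0;
        }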

  • Changing volume in Java when using JLayer.

    - by Penchant
    I'm using JLayer to play an InputStream of mp3 data from the internet. How do I change the volume of the output? I'm using this code to play it (the original snippet was cut off mid-loop; the closing brace is restored):

        URL u = new URL(s);
        URLConnection conn = u.openConnection();
        conn.setConnectTimeout(Searcher.timeoutms);
        conn.setReadTimeout(Searcher.timeoutms);
        bitstream = new Bitstream(conn.getInputStream() /*new FileInputStream(quick_file)*/);
        System.out.println(bitstream);
        decoder = new Decoder();
        decoder.setEqualizer(equalizer);
        audio = FactoryRegistry.systemRegistry().createAudioDevice();
        audio.open(decoder);
        for (int i = quick_positions[0]; i > 0; i--) {
            Header h = bitstream.readFrame();
            if (h == null) {
                return;
            }
            bitstream.closeFrame();
        }
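    JLayer itself exposes no volume API, but its JavaSound-backed output ultimately writes to a SourceDataLine, and that line usually supports a MASTER_GAIN control. A sketch of the control logic, assuming you can reach the line (for example by subclassing javazoom.jl.player.JavaSoundAudioDevice); `line` here is a placeholder for whatever line the device writes to:

        import javax.sound.sampled.FloatControl;
        import javax.sound.sampled.SourceDataLine;

        static void setVolume(SourceDataLine line, float gainDb) {
            if (line != null && line.isControlSupported(FloatControl.Type.MASTER_GAIN)) {
                FloatControl gain =
                    (FloatControl) line.getControl(FloatControl.Type.MASTER_GAIN);
                // Clamp to the control's legal range before applying.
                float clamped = Math.max(gain.getMinimum(),
                                         Math.min(gain.getMaximum(), gainDb));
                gain.setValue(clamped);
            }
        }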

  • Web P2P video conference solution

    - by dtroy
    I'm looking for the best possible solution to incorporate live video/audio conferencing between two users (only two at this point) into a Flash gaming platform. The video chat is not just an extra feature; it's the main one. I'm mainly looking at open-source implementations or something I'll be able to implement myself, but I will consider commercial products if they are exactly what I need. Here are a few things I've looked at, but so far none of them is good enough:

    - Flash Player 10's P2P capabilities sound promising, but I am aware that Adobe has not released any information on the RTMFP protocol and that there is no commercial server which supports it at this point.
    - Stream all the video/audio live through a Flash server (not P2P), but from my personal experience you don't get a smooth conversation. I think TokBox uses this method.
    - Java applets are a possible solution too (to perform the P2P), but I don't think combining them with the game will be nice and elegant at this point (and they require the user to authorize them). BTW, I couldn't find any useful implementations; if you know of any, I'll look into them.
    - Google Gmail Video Chat uses a custom (and proprietary) browser plug-in which does the P2P and streams the video/audio into the Flash player. This is a possible solution, but I'd rather not implement an entire P2P protocol stack plus browser plug-in at this stage, and instead concentrate on other aspects of the game itself. I think they are using an XMPP-based protocol similar to Jingle, and they've released a Jingle library, but without the video conferencing implementation.

    EDIT: In response to Branden: I am aware of Adobe Stratus. Stratus is a beta, hosted rendezvous service that aids establishing communications between Flash Player endpoints (an RTMFP server). The current release of Stratus is a prerelease designed for evaluation purposes only. The service is not final; there is no guarantee that it will continue to exist in the future, nor any information about future cost. That's why I don't think it can be used as a commercial solution - at least not yet.

    I'd appreciate your suggestions and advice. Thanks!

  • PHP Streaming CSV always adds UTF-8 BOM

    - by Mustafa Ashurex
    The following code gets a 'report line' as an array and uses fputcsv to transform it into CSV. Everything is working great except for the fact that regardless of the charset I use, it puts a UTF-8 BOM at the beginning of the file. This is exceptionally annoying because (a) I am specifying ISO, and (b) we have lots of users using tools that show the UTF-8 BOM as garbage characters. I have even tried writing the results to a string, stripping the UTF-8 BOM and then echoing it out, and I still get it. Is it possible that the issue resides with Apache? If I change the fopen to a local file it writes just fine, without the UTF-8 BOM.

        header("Content-type: text/csv; charset=iso-8859-1");
        header("Cache-Control: no-store, no-cache");
        header("Content-Disposition: attachment; filename=\"report.csv\"");
        $outstream = fopen("php://output", 'w');
        for ($i = 0; $i < $report->rowCount; $i++) {
            fputcsv($outstream, $report->getTaxMatrixLineValues($i), ',', '"');
        }
        fclose($outstream);
        exit;
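    One way to isolate where the BOM comes from is to build the CSV in memory and strip any leading BOM just before output; if the client still receives one, it is being prepended outside this code path (a classic culprit is a BOM saved at the top of the PHP source file itself, or of an included file, which PHP emits verbatim). A sketch:

        <?php
        // Build the CSV in an in-memory stream instead of php://output.
        $buffer = fopen('php://temp', 'r+');
        for ($i = 0; $i < $report->rowCount; $i++) {
            fputcsv($buffer, $report->getTaxMatrixLineValues($i), ',', '"');
        }
        rewind($buffer);
        $csv = stream_get_contents($buffer);
        fclose($buffer);

        // Strip a leading UTF-8 BOM (0xEF 0xBB 0xBF) if one is present.
        if (substr($csv, 0, 3) === "\xEF\xBB\xBF") {
            $csv = substr($csv, 3);
        }
        echo $csv;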

  • Servlet IOException when streaming a Byte Array

    - by MontyBongo
    I occasionally get an IOException from a servlet I have that writes a byte array to the output stream in order to provide a file download capability. This download servlet has a reasonable amount of traffic (100k hits a month) and this exception occurs rarely, around 1-2 times a month. I have tried recreating the exception by using the exact same Base64 string, and no exception is raised; the servlet behaves as designed. Is this IOException being caused by something outside my application's control, for example a network issue or the user resetting the connection? I have tried to google the cause of an IOException at this point in the stack, but to no avail. The environment is Tomcat 5.5 on CentOS 5.3, with an Apache HTTP Server acting as a proxy using proxy_ajp.

        ClientAbortException: java.io.IOException
            at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:366)
            at org.apache.tomcat.util.buf.ByteChunk.append(ByteChunk.java:352)
            at org.apache.catalina.connector.OutputBuffer.writeBytes(OutputBuffer.java:392)
            at org.apache.catalina.connector.OutputBuffer.write(OutputBuffer.java:381)
            at org.apache.catalina.connector.CoyoteOutputStream.write(CoyoteOutputStream.java:89)
            at org.apache.catalina.connector.CoyoteOutputStream.write(CoyoteOutputStream.java:83)
            at com.myApp.Download.doPost(Download.java:34)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:710)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
            at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:269)
            at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
            at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:691)
            at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:469)
            at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:403)
            at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:301)
            at com.myApp.EntryServlet.service(EntryServlet.java:278)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
            at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:269)
            at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
            at com.myApp.filters.RequestFilter.doFilter(RequestFilter.java:16)
            at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215)
            at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
            at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:210)
            at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172)
            at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
            at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117)
            at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108)
            at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:151)
            at org.apache.coyote.ajp.AjpAprProcessor.process(AjpAprProcessor.java:444)
            at org.apache.coyote.ajp.AjpAprProtocol$AjpConnectionHandler.process(AjpAprProtocol.java:472)
            at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1286)
            at java.lang.Thread.run(Thread.java:636)
        Caused by: java.io.IOException
            at org.apache.coyote.ajp.AjpAprProcessor.flush(AjpAprProcessor.java:1200)
            at org.apache.coyote.ajp.AjpAprProcessor$SocketOutputBuffer.doWrite(AjpAprProcessor.java:1285)
            at org.apache.coyote.Response.doWrite(Response.java:560)
            at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:361)

    And the code in the download servlet:

        @Override
        protected void doPost(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, java.io.IOException {
            try {
                response.setContentType("application/pdf");
                response.setHeader("Pragma", "");
                response.setHeader("Cache-Control", "");
                response.setHeader("Content-Disposition", "Inline; Filename=myPDFFile..pdf");
                ServletOutputStream out = response.getOutputStream();
                byte[] downloadBytes = Base64.decode((String) request.getAttribute("fileToDownloadBase64"));
                out.write(downloadBytes);
            } catch (Base64DecodingException e) {
                e.printStackTrace();
                response.getOutputStream().print("An error occurred");
            }
        }
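    For what it's worth, ClientAbortException is Tomcat's wrapper for "the client closed the connection while we were writing" - typically a cancelled download, navigation away, or a dropped network link - so it is indeed outside the application's control. A common sketch is to catch it around the write and log quietly rather than let it bubble up as an error (log() is the standard GenericServlet method):

        try {
            out.write(downloadBytes);
            out.flush();
        } catch (IOException e) {
            // The user cancelled the download or lost connectivity;
            // nothing server-side can be done, so log at a low level.
            log("Client aborted download: " + e.getMessage());
        }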

  • Play a beep that loops and change the frequency/speed

    - by Bono
    Hi all, I am creating an iPhone application that uses audio. I want to play a beep sound that loops indefinitely. I found an easy way to do that using the upper-layer AVAudioPlayer with numberOfLoops set to -1, and it works fine. But now I want to play this audio and be able to change the rate/speed. It should work like the sound played by a car when approaching an obstacle: at the beginning the beep has a low frequency, and this frequency accelerates until reaching a continuous sound, biiiiiiiiiiiip... It seems this is not feasible using the high-layer AVAudioPlayer, and even looking at AudioToolbox I found no solution. Does anybody have information about how to do that? Thanks a lot for helping me!
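    One workaround that stays within AVAudioPlayer: instead of looping one file, re-trigger a short beep from a timer whose interval shrinks over time, which gives the accelerating-then-continuous effect. A sketch, not from the original post; beepPlayer and currentInterval are assumed properties (a prepared AVAudioPlayer and the current gap in seconds):

        - (void)scheduleBeep {
            [NSTimer scheduledTimerWithTimeInterval:self.currentInterval
                                             target:self
                                           selector:@selector(beepFired:)
                                           userInfo:nil
                                            repeats:NO];
        }

        - (void)beepFired:(NSTimer *)timer {
            [self.beepPlayer play];
            // Shrink the gap toward a near-continuous tone, clamped at 50 ms.
            self.currentInterval = MAX(0.05, self.currentInterval * 0.9);
            [self scheduleBeep];
        }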

  • Novocaine - How to loop file playback? (iOS)

    - by lppier
    I'm using Novocaine by alexbw for my audio project, and I'm playing around with the example code for file reading. The file plays back with no problem. I would like to loop this recording with a gap between the loops - any suggestion as to how I can do so? Thanks. Pier.

        // AUDIO FILE READING OHHH YEAHHHH
        // ========================================
        NSArray *pathComponents = [NSArray arrayWithObjects:
            [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
            @"testrecording.wav", nil];
        NSURL *inputFileURL = [NSURL fileURLWithPathComponents:pathComponents];
        NSLog(@"URL: %@", inputFileURL);

        fileReader = [[AudioFileReader alloc]
            initWithAudioFileURL:inputFileURL
                    samplingRate:audioManager.samplingRate
                     numChannels:audioManager.numOutputChannels];
        [fileReader play];
        [fileReader setCurrentTime:0.0];
        //float duration = fileReader.getDuration;

        [audioManager setOutputBlock:^(float *data, UInt32 numFrames, UInt32 numChannels) {
            [fileReader retrieveFreshAudio:data numFrames:numFrames numChannels:numChannels];
            NSLog(@"Time: %f", [fileReader getCurrentTime]);
        }];
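    A sketch of one approach, using only the calls already present in the post: when the reader reaches the end of the file, emit silence for a fixed number of frames, then rewind with setCurrentTime:. The gap length and the interleaved-float frame math are assumptions based on Novocaine's example output format:

        __block UInt32 silentFramesRemaining = 0;
        const float kGapSeconds = 0.5f; // assumed gap length
        float duration = [fileReader getDuration];

        [audioManager setOutputBlock:^(float *data, UInt32 numFrames, UInt32 numChannels) {
            if (silentFramesRemaining > 0) {
                // Inside the gap: output silence and count down.
                memset(data, 0, numFrames * numChannels * sizeof(float));
                silentFramesRemaining = (silentFramesRemaining > numFrames)
                                      ? silentFramesRemaining - numFrames : 0;
                if (silentFramesRemaining == 0) {
                    [fileReader setCurrentTime:0.0]; // rewind for the next pass
                }
                return;
            }
            [fileReader retrieveFreshAudio:data numFrames:numFrames numChannels:numChannels];
            if ([fileReader getCurrentTime] >= duration) {
                // End of file: start the gap.
                silentFramesRemaining = (UInt32)(kGapSeconds * audioManager.samplingRate);
            }
        }];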

  • Asynchronous Silverlight 4 call to the World of Warcraft armoury streaming XML in C#

    - by user348446
    Hello - I have been stuck on this all weekend and failed miserably! Please help me claw back my sanity!!

    Your challenge: for my first Silverlight application, I thought it would be fun to use the World of Warcraft armoury to list the characters in my guild. This involves making an asynchronous call from Silverlight (duh!) to the WoW armoury, which is XML based. SIMPLE, EH? Take a look at this link and open the source; you'll see what I mean: http://eu.wowarmory.com/guild-info.xml?r=Eonar&n=Gifted and Talented

    Below is code for getting the XML (the call to ShowGuildies will cope with the returned XML - I have tested this locally and I know it works). I have not managed to get the expected returned XML at all. Notes:

    - If the browser is capable of transforming the XML it will do so; otherwise HTML will be provided. I think it examines the UserAgent.
    - I am a seasoned ASP.NET web developer (C#), so go easy if you start talking about native Windows Forms / WPF.
    - I can't seem to set the UserAgent in .NET 4.0 - there doesn't seem to be a property for it on the HttpWebRequest object for some reason; I think it used to be available.
    - Silverlight 4.0 (created as 3.0 originally, before I updated my installation of Silverlight to 4.0).
    - Created using C# 4.0.
    - Please explain as if you were talking to a web developer and not a proper programmer, lol!

    Below is the code - it should return the XML from the WoW armoury.

        private void button7_Click(object sender, RoutedEventArgs e)
        {
            // URL for armoury lookup
            string url = @"http://eu.wowarmory.com/guild-info.xml?r=Eonar&n=Gifted and Talented";

            // Create the web request
            HttpWebRequest httpWebRequest = (HttpWebRequest)WebRequest.Create(url);

            // Set the user agent so we are returned XML and not HTML
            //httpWebRequest.Headers["User-Agent"] = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0)";

            // Not sure about this dispatcher thing - it's late so I have started to guess.
            Dispatcher.BeginInvoke(delegate()
            {
                // Call asynchronously
                IAsyncResult asyncResult = httpWebRequest.BeginGetResponse(ReqCallback, httpWebRequest);

                // End the response and use the result
                using (HttpWebResponse httpWebResponse = (HttpWebResponse)httpWebRequest.EndGetResponse(asyncResult))
                {
                    // Load an XML document from a stream
                    XDocument x = XDocument.Load(httpWebResponse.GetResponseStream());

                    // Basic function that will use LINQ to XML to get the list of characters.
                    ShowGuildies(x);
                }
            });
        }

        private void ReqCallback(IAsyncResult asynchronousResult)
        {
            // Not sure what to do here - maybe update the interface?
        }

    Really hope someone out there can help me! Thanks mucho! Dan.

    PS Yes, I have noticed the irony in the name of the guild :)
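    Two things stand out. First, EndGetResponse is called immediately after BeginGetResponse instead of inside the callback, so the response is consumed before it exists; the usual shape is sketched below. Second, Silverlight only permits cross-domain HTTP when the target site publishes a clientaccesspolicy.xml (or suitable crossdomain.xml) - if eu.wowarmory.com doesn't, the request will fail regardless of the code. ShowGuildies is the poster's own method; the rest is standard Silverlight API:

        private void StartRequest()
        {
            string url = @"http://eu.wowarmory.com/guild-info.xml?r=Eonar&n=Gifted and Talented";
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri(url));
            request.BeginGetResponse(OnResponse, request);
        }

        private void OnResponse(IAsyncResult result)
        {
            HttpWebRequest request = (HttpWebRequest)result.AsyncState;
            using (HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(result))
            {
                XDocument x = XDocument.Load(response.GetResponseStream());
                // The callback runs on a worker thread; hop back to the UI thread.
                Dispatcher.BeginInvoke(() => ShowGuildies(x));
            }
        }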

  • Downloading a file in ASP.NET (through the server) while streaming it to the user

    - by James Teare
    My ASP.NET website currently downloads a file from a remote server to a local drive, but when users access the site they have to wait for the server to finish downloading the file before they can download it from my ASP.NET website. Is it possible to almost stream the download from the remote website, through my ASP.NET website, directly to the user (a bit like a proxy)? My current code is as follows:

        using (var client = new WebClientEx())
        {
            client.DownloadFile(downloadURL, "outputfile.zip");
        }

    WebClient class:

        public class WebClientEx : WebClient
        {
            public CookieContainer CookieContainer { get; private set; }

            public WebClientEx()
            {
                CookieContainer = new CookieContainer();
            }

            protected override WebRequest GetWebRequest(Uri address)
            {
                var request = base.GetWebRequest(address);
                if (request is HttpWebRequest)
                {
                    (request as HttpWebRequest).CookieContainer = CookieContainer;
                }
                return request;
            }
        }
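    A sketch of the proxy-style approach: open the remote response and copy it to the client in chunks, so bytes flow through without ever landing on disk. The method name and buffer size are illustrative, not from the original code:

        using System.IO;
        using System.Net;
        using System.Web;

        static void StreamFile(HttpResponse response, string downloadURL)
        {
            var request = (HttpWebRequest)WebRequest.Create(downloadURL);
            using (WebResponse remote = request.GetResponse())
            using (Stream source = remote.GetResponseStream())
            {
                response.BufferOutput = false; // send bytes as they arrive
                response.ContentType = "application/zip";
                var buffer = new byte[64 * 1024];
                int read;
                while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                {
                    response.OutputStream.Write(buffer, 0, read);
                }
            }
        }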

  • How do I convert text to a WAV file with an inaudible waveform?

    - by Scott
    I am trying to create an audio watermarking system. I figure the best solution is to create an audio file (WAV) based on a unique string of text and then combine this with the original WAV. The part that makes this tricky (for me anyway) is:

    - How do I convert the text string to a WAV?
    - How do I ensure that the resulting waveform is inaudible (or at least barely noticeable to the listener)?

    I would prefer this be done server-side (via PHP, etc.), but if the processing load isn't too much then I would be OK with something in Flash or JavaScript. I'd be willing to pay someone to create a workable solution for me (complete source code that functions as described). Thanks, Scott!

  • Streaming support for Flex with Ruby on Rails (working with live data)

    - by Ashine
    Hi Friends, I am working on Flex dashboards and charting stuff. Until now I have built them for static data only; now I want to upgrade them to work with real-time data, where new data is continuously sent from the server to the client (SWF file), which updates the display accordingly. I am using Ruby on Rails (RoR) on the server side. Is there something similar to Adobe LiveCycle (Java-Flex) that can be used in a RoR-Flex architecture? Please share links to any similar implementation in a RoR-Flex architecture, or if you have suggestions to share, I will really appreciate it. Thanks, friends.

  • HTML: embed streaming video (cross-browser)

    - by Fuxi
    Hi all, I'm trying to embed a .wmv video into my website, but it doesn't work :( I've googled and tried <embed> and <video>. The video I'm using was created by the SUPER converter and I've used WMV7 as the video codec. Is there a cross-browser solution for it - or should I use Flash video instead? Thx in advance.
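    For reference, the classic markup for WMV is the nested object/embed pair sketched below, but it only works where a Windows Media Player plug-in is installed, which rules out most non-Windows browsers - that is why Flash video is usually the safer cross-browser choice. "movie.wmv" and the dimensions are placeholders:

        <object classid="clsid:6BF52A52-394A-11d3-B153-00C04F79FAA6"
                width="480" height="360">
          <param name="URL" value="movie.wmv" />
          <param name="autoStart" value="true" />
          <!-- Fallback for non-ActiveX browsers with the WMP plug-in -->
          <embed type="application/x-mplayer2" src="movie.wmv"
                 width="480" height="360" autostart="true" />
        </object>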

  • How to configure the framesize using AudioUnit.framework on iOS

    - by Piperoman
    I have an audio app in which I need to capture mic samples to encode into mp3 with ffmpeg. First I configure the audio:

        /**
         * We need to specify the format we want to work with.
         * We use Linear PCM because it's uncompressed and we work on raw data.
         *
         * We want 16 bits, 2 bytes (short) per packet/frame, at 8 kHz.
         */
        AudioStreamBasicDescription audioFormat;
        audioFormat.mSampleRate       = SAMPLE_RATE;
        audioFormat.mFormatID         = kAudioFormatLinearPCM;
        audioFormat.mFormatFlags      = kAudioFormatFlagIsPacked | kAudioFormatFlagIsSignedInteger;
        audioFormat.mFramesPerPacket  = 1;
        audioFormat.mChannelsPerFrame = 1;
        audioFormat.mBitsPerChannel   = audioFormat.mChannelsPerFrame * sizeof(SInt16) * 8;
        audioFormat.mBytesPerPacket   = audioFormat.mChannelsPerFrame * sizeof(SInt16);
        audioFormat.mBytesPerFrame    = audioFormat.mChannelsPerFrame * sizeof(SInt16);

    The recording callback is:

        static OSStatus recordingCallback(void *inRefCon,
                                          AudioUnitRenderActionFlags *ioActionFlags,
                                          const AudioTimeStamp *inTimeStamp,
                                          UInt32 inBusNumber,
                                          UInt32 inNumberFrames,
                                          AudioBufferList *ioData) {
            NSLog(@"Log record: %lu", inBusNumber);
            NSLog(@"Log record: %lu", inNumberFrames);
            NSLog(@"Log record: %lu", (UInt32)inTimeStamp);

            // the data gets rendered here
            AudioBuffer buffer;

            // a variable where we check the status
            OSStatus status;

            // This is the reference to the object who owns the callback.
            AudioProcessor *audioProcessor = (__bridge AudioProcessor *)inRefCon;

            // At this point we define the number of channels, which is mono for the iPhone.
            // The number of frames is usually 512 or 1024.
            buffer.mDataByteSize = inNumberFrames * sizeof(SInt16); // sample size
            buffer.mNumberChannels = 1;                             // one channel
            buffer.mData = malloc(inNumberFrames * sizeof(SInt16)); // buffer size

            // we put our buffer into a bufferlist array for rendering
            AudioBufferList bufferList;
            bufferList.mNumberBuffers = 1;
            bufferList.mBuffers[0] = buffer;

            // render input and check for error
            status = AudioUnitRender([audioProcessor audioUnit], ioActionFlags,
                                     inTimeStamp, inBusNumber, inNumberFrames, &bufferList);
            [audioProcessor hasError:status:__FILE__:__LINE__];

            // process the bufferlist in the audio processor
            [audioProcessor processBuffer:&bufferList];

            // clean up the buffer
            free(bufferList.mBuffers[0].mData);
            //NSLog(@"RECORD");
            return noErr;
        }

    With data:

        inBusNumber = 1
        inNumberFrames = 1024
        inTimeStamp = 80444304 // always the same inTimeStamp - this is strange

    However, the frame size I need to encode mp3 is 1152. How can I configure it? If I do buffering, that implies a delay, but I would like to avoid this because it is a real-time app. If I use this configuration, each buffer gets trailing garbage samples: 1152 - 1024 = 128 bad samples. All samples are SInt16.
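    The hardware quantum can't be pinned to an arbitrary 1152 frames - the session's preferred IO buffer duration only nudges it between a few fixed sizes - so the usual answer is a small FIFO that repackages whatever the render callback delivers into exact 1152-sample chunks. The added latency is at most one MP3 frame, not a perceptible buffer. A sketch in C; encodeMp3Frame is an assumed hook into the ffmpeg encoder:

        #include <string.h>

        #define MP3_FRAME 1152

        static SInt16 fifo[MP3_FRAME * 4]; // comfortably holds leftovers plus one render buffer
        static int fifoCount = 0;

        // Called (e.g. from processBuffer:) with each incoming 1024-frame chunk.
        void pushSamples(const SInt16 *samples, int count) {
            memcpy(fifo + fifoCount, samples, count * sizeof(SInt16));
            fifoCount += count;
            while (fifoCount >= MP3_FRAME) {
                encodeMp3Frame(fifo, MP3_FRAME); // assumed encoder entry point
                fifoCount -= MP3_FRAME;
                memmove(fifo, fifo + MP3_FRAME, fifoCount * sizeof(SInt16));
            }
        }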

  • Preferred method for looping sound in Flash AS3

    - by Brian Heylin
    Hi there, I'm having some issues with looping a sound in Flash AS3, in that when I tell the sound to loop I get a slight delay at the end/beginning of the audio. The audio is clipped correctly and will play without a gap in GarageBand. I know that there are issues with sound in general in Flash - bugs with encodings and inaccuracies in the SOUND_COMPLETE event (and Adobe should be embarrassed by their handling of these issues). I have tried the built-in loop argument of the Sound class's play method, and also reacting to the SOUND_COMPLETE event, but both cause a delay. Has anyone come up with a technique for looping a sound without any noticeable gap?
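    The workaround generally used for gapless looping is to bypass play()'s scheduling entirely: create an output Sound, feed it raw samples via SAMPLE_DATA, and wrap the read position around manually. A sketch, assuming `source` is the fully loaded Sound; note that MP3 files also carry encoder padding at the start, so the wrap position may need to skip past it:

        var position:Number = 0;
        var output:Sound = new Sound();
        output.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
        output.play();

        function onSampleData(e:SampleDataEvent):void {
            // Pull up to 8192 sample frames from the source into the output buffer.
            var read:Number = source.extract(e.data, 8192, position);
            position += read;
            if (read < 8192) {
                // Hit the end: wrap around and top up from the start.
                position = source.extract(e.data, 8192 - read, 0);
            }
        }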

  • Signal amplitude against time in Java

    - by wsr74ws84
    I'm racking my brain trying to solve a knotty problem (at least for me). While playing an audio file (using Java), I want the signal amplitude to be displayed against time. I mean I'd like to implement a small panel showing a sort of oscilloscope (spectrum analyzer): the audio signal viewed in the time domain, where the vertical axis is amplitude and the horizontal axis is time. Does anyone know how to do it? Is there a good tutorial I can rely on? Since I know very little about Java, I hope someone can help me.
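    The core of it is decoding each block of raw bytes into sample values and repainting a panel. A sketch assuming 16-bit signed little-endian mono PCM (the byte layout depends on your AudioFormat); you would call setFrame() with each chunk read from the same stream you are playing:

        import javax.swing.JPanel;
        import java.awt.Graphics;

        public class ScopePanel extends JPanel {
            private float[] samples = new float[0];

            public void setFrame(byte[] frame) {
                float[] out = new float[frame.length / 2];
                for (int i = 0; i < out.length; i++) {
                    int lo = frame[2 * i] & 0xFF;
                    int hi = frame[2 * i + 1];          // keep the sign bit
                    out[i] = ((hi << 8) | lo) / 32768f; // scale to [-1, 1]
                }
                samples = out;
                repaint();
            }

            @Override
            protected void paintComponent(Graphics g) {
                super.paintComponent(g);
                int w = getWidth(), h = getHeight();
                for (int x = 1; x < Math.min(w, samples.length); x++) {
                    int y0 = (int) (h / 2.0 - samples[x - 1] * h / 2.0);
                    int y1 = (int) (h / 2.0 - samples[x] * h / 2.0);
                    g.drawLine(x - 1, y0, x, y1);
                }
            }
        }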

  • HTTP streaming using a Java servlet

    - by Shamik
    I have a servlet-based web application which produces two sets of data: one set which is essential, and another which is optional. I would like to render the essential data as fast as possible and then stream the optional data. I was thinking of writing the essential data to the output stream of the HttpServletResponse and then calling flushBuffer() to commit the response to the client - but not returning from the servlet code; instead, creating the optional data, writing that to the output stream as well, and then returning. What are the things that could go wrong with this scheme? Is this a standard practice for achieving this goal?
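    This is a legitimate pattern; a sketch of the shape is below (renderEssential and renderOptional are assumed helpers). The main caveats: once flushBuffer() runs the response is committed, so the status code and headers can no longer change, a Content-Length cannot be set (the container falls back to chunked transfer), and an exception while producing the optional data can only truncate the page rather than show an error page:

        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            resp.setContentType("text/html");
            PrintWriter out = resp.getWriter();
            out.write(renderEssential());  // the part the user must see quickly
            resp.flushBuffer();            // commit: the client can start rendering
            out.write(renderOptional());   // possibly slow; streamed afterwards
        }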

  • HTML5 - Callback when media is ready on iPad won't work

    - by Kap
    I'm trying to add a callback to an HTML5 audio element on an iPad. I added an event listener to the element; myOtherThing() starts but there is no sound. If I pause and then play the sound again, the audio starts. This works in Chrome. Does anyone have an idea how I can do this?

        myAudioElement.src = "path_to_file";
        myAudioElement.addEventListener("canplay", function () {
            myAudioElement.play();
            myOtherThing.start();
        });

    SOLVED - Just wanted to share my solution here, in case someone else needs it. As far as I understand, the iPad does not trigger any events without user interaction. So to be able to use "canplay", "playing" and all the other events, you need to use the built-in media controller. Once you press play in that controller, the events get triggered. After that you can use your custom interface.
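    A related pattern that often works without showing the built-in controller: start the load from inside a real user gesture (tap/click), which is what unlocks media playback on iOS, and only then listen for "canplay". A sketch; `button` stands in for whatever tappable element starts playback:

        button.addEventListener("click", function () {
            myAudioElement.load(); // the user gesture unlocks the element on iOS
            myAudioElement.addEventListener("canplay", function () {
                myAudioElement.play();
                myOtherThing.start();
            });
        });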

  • Streaming PDF via invocation of HTTPHandler using $.get into object element

    - by mythicdawn
    What I am trying to do is invoke an HTTPHandler via the $.get method of jQuery, which will stream back a PDF and display it in a web page using an object element. My previous method of setting the src attribute of an IFrame to the result of a handler invocation works, but I would like cross-browser completion notification, so I have moved to using $.get(). Sample code (the original snippet was missing the function's closing brace):

        function buttonClick() {
            $.get("/PDFHandler.ashx", {}, function (data, textStatus, XMLHttpRequest) {
                var pdfObjectString = "<object data='' type='application/pdf' width='600' height='600'></object>";
                var pdfObject = $(pdfObjectString);
                pdfObject.attr("data", data);
                $("#container").append(pdfObject);
            });
        }

    As you can see, I am attempting to stick the 'data' variable into an object element. This is not working (no error, the PDF just doesn't display), presumably because the data that comes back is binary, yet the attr() method expects a string (I think). My question is thus: how can I invoke an HTTPHandler via $.get and somehow assign the data from the callback to the data attribute of an object?
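    The data attribute of an <object> takes a URL, not the document bytes, so handing it the raw response can't work. A sketch of the usual fix: point the object directly at the handler and let the PDF plug-in fetch it itself (though this does give up the completion callback that motivated $.get in the first place):

        function buttonClick() {
            var pdfObject = $("<object type='application/pdf' width='600' height='600'></object>");
            pdfObject.attr("data", "/PDFHandler.ashx");
            $("#container").append(pdfObject);
        }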

  • Python: HTTP Post a large file with streaming

    - by Daniel Von Fange
    I'm uploading potentially large files to a web server. Currently I'm doing this:

        import urllib2

        f = open('somelargefile.zip', 'rb')
        request = urllib2.Request(url, f.read())
        request.add_header("Content-Type", "application/zip")
        response = urllib2.urlopen(request)

    However, this reads the entire file's contents into memory before posting it. How can I have it stream the file to the server?
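    A sketch of one way around it in Python 2, dropping down to httplib so the body can be sent in chunks with an explicit Content-Length; host, path, and the chunk size are placeholders:

        import httplib
        import os

        def upload(host, path, filename):
            size = os.path.getsize(filename)
            conn = httplib.HTTPConnection(host)
            conn.putrequest('POST', path)
            conn.putheader('Content-Type', 'application/zip')
            conn.putheader('Content-Length', str(size))
            conn.endheaders()
            with open(filename, 'rb') as f:
                while True:
                    chunk = f.read(64 * 1024)  # 64 KB at a time
                    if not chunk:
                        break
                    conn.send(chunk)
            return conn.getresponse()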

  • Streaming content to JSF UI

    - by Mark Lewis
    Hello, I was quite happy with my JSF app, which read the contents of received MQ messages and supplied them to the UI like this:

        <rich:panel>
            <snip>
            <rich:panelMenuItem label="mylabel" action="#{MyBacking.updateCurrent}">
                <f:param name="current" value="mylog.log" />
            </rich:panelMenuItem>
            </snip>
        </rich:panel>
        <rich:panel>
            <a4j:outputPanel ajaxRendered="true">
                <rich:insert content="#{MyBacking.log}" highlight="groovy" />
            </a4j:outputPanel>
        </rich:panel>

    and in MyBacking.java:

        private String logFile = null;
        ...
        public String updateCurrent() {
            FacesContext context = FacesContext.getCurrentInstance();
            setCurrent((String) context.getExternalContext().getRequestParameterMap().get("current"));
            setLog(getCurrent());
            return null;
        }

        public void setLog(String log) {
            sendMsg(log);
            msgBody = receiveMsg(moreargs);
            logFile = msgBody;
        }

        public String getLog() {
            return logFile;
        }

    until the contents of one of the messages was too big and Tomcat fell over. Obviously, I thought, I need to change the way it works so that I return some form of stream, so that no one object grows so big that the container dies, and the content returned by successive messages is streamed to the UI as it comes in. Am I right in thinking that I can replace the work I'm doing now on a String object with a BufferedOutputStream object, i.e. no change to the JSF code and something like this changing at the back end?

        private BufferedOutputStream logFile = null;

        public void setLog(String log) {
            sendMsg(args);
            logFile = (BufferedOutputStream) receiveMsg(moreargs);
        }

        public String getLog() {
            return logFile;
        }
