Search Results

Search found 9134 results on 366 pages for 'live streaming'.


  • Cloud e-mail and portal integration: experiences?

    - by Mark McLaren
    I am evaluating cloud e-mail solutions based upon Google Apps for Education and Microsoft Live@edu. I work for a University and we currently have an institutional portal (based on uPortal). We currently have our local IMAP server and webmail client fully integrated with the portal, and we would like to replicate the current portal e-mail experience with the new e-mail services. At present users can see a snapshot of their inbox in the portal and click through into the appropriate place in the webmail client. We expect to face similar problems when integrating with the cloud-based e-mail solutions: we need to solve the single sign-on (SSO) problem, and we need to be able to access the inbox messages on the user's behalf (e.g. proxy authentication). Does anybody have experience or advice on this? Many thanks, Mark

    Read the article

  • Stream (.NET) handling best-practices

    - by Jader Dias
    The question title uses the word "Stream" because the question below is a concrete example of a more general question I have about Streams: I have a problem with two possible solutions and I want to know which is best. (1) I download a file, save it to disk (2 min), then read it and write the contents to the DB (+2 min). (2) I download a file and write the contents directly to the DB (3 min). If the write to the DB fails I'll have to download again in the second case, but not in the first. Which is best? Which would you use?
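
    The original question is about .NET, but the trade-off is language-agnostic. Below is a minimal Java sketch of the first approach; the URL, JDBC connection string and table/column names are placeholders, not part of the question. The point is that the download lands in a temp file first, so a failed database write can be retried without downloading again.

      import java.io.InputStream;
      import java.net.URI;
      import java.net.http.HttpClient;
      import java.net.http.HttpRequest;
      import java.net.http.HttpResponse;
      import java.nio.file.Files;
      import java.nio.file.Path;
      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.PreparedStatement;

      public class DownloadThenStore {
          public static void main(String[] args) throws Exception {
              // 1. Download to a temp file first; if the DB step fails, the bytes are still local.
              Path tmp = Files.createTempFile("download-", ".bin");
              HttpClient client = HttpClient.newHttpClient();
              HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com/big-file")).build();
              client.send(request, HttpResponse.BodyHandlers.ofFile(tmp));

              // 2. Stream the file into the database (table/column names are made up for the example).
              try (Connection con = DriverManager.getConnection("jdbc:postgresql://localhost/files", "user", "pass");
                   InputStream in = Files.newInputStream(tmp);
                   PreparedStatement ps = con.prepareStatement("INSERT INTO downloads (name, content) VALUES (?, ?)")) {
                  ps.setString(1, tmp.getFileName().toString());
                  ps.setBinaryStream(2, in, Files.size(tmp));
                  ps.executeUpdate();
                  Files.delete(tmp);   // only discard the local copy once the DB write succeeded
              }
              // If executeUpdate() throws, tmp is still on disk and the insert can be retried
              // without a second download.
          }
      }

    The second approach saves a minute and a disk round-trip, but it couples the success of the download to the success of the DB write; if re-downloading is expensive or rate-limited, the temp-file version is the safer default.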

    Read the article

  • capturing video from ip camera

    - by Ruby
    I am trying to capture video from an IP camera into my application, and it is throwing this exception:

      com.sun.image.codec.jpeg.ImageFormatException: Not a JPEG file: starts with 0x0d 0x0a
          at sun.awt.image.codec.JPEGImageDecoderImpl.readJPEGStream(Native Method)
          at sun.awt.image.codec.JPEGImageDecoderImpl.decodeAsBufferedImage(Unknown Source)
          at test.AxisCamera1.readJPG(AxisCamera1.java:130)
          at test.AxisCamera1.readMJPGStream(AxisCamera1.java:121)
          at test.AxisCamera1.readStream(AxisCamera1.java:100)
          at test.AxisCamera1.run(AxisCamera1.java:171)
          at java.lang.Thread.run(Unknown Source)

    The exception is thrown at image = decoder.decodeAsBufferedImage();. Here is the code I am trying:

      private static final long serialVersionUID = 1L;
      public boolean useMJPGStream = true;
      public String jpgURL = "http://ip here/video.cgi/jpg/image.cgi?resolution=640×480";
      public String mjpgURL = "http://ip here /video.cgi/mjpg/video.cgi?resolution=640×480";
      DataInputStream dis;
      private BufferedImage image = null;
      public Dimension imageSize = null;
      public boolean connected = false;
      private boolean initCompleted = false;
      HttpURLConnection huc = null;
      Component parent;

      /** Creates a new instance of AxisCamera */
      public AxisCamera1(Component parent_) {
          parent = parent_;
      }

      public void connect() {
          try {
              URL u = new URL(useMJPGStream ? mjpgURL : jpgURL);
              huc = (HttpURLConnection) u.openConnection();
              // System.out.println(huc.getContentType());
              InputStream is = huc.getInputStream();
              connected = true;
              BufferedInputStream bis = new BufferedInputStream(is);
              dis = new DataInputStream(bis);
              if (!initCompleted)
                  initDisplay();
          } catch (IOException e) {
              // in case no connection exists, wait and try again instead of printing the error
              try {
                  huc.disconnect();
                  Thread.sleep(60);
              } catch (InterruptedException ie) {
                  huc.disconnect();
                  connect();
              }
              connect();
          } catch (Exception e) {
              ;
          }
      }

      public void initDisplay() { // set up the display
          if (useMJPGStream)
              readMJPGStream();
          else {
              readJPG();
              disconnect();
          }
          imageSize = new Dimension(image.getWidth(this), image.getHeight(this));
          setPreferredSize(imageSize);
          parent.setSize(imageSize);
          parent.validate();
          initCompleted = true;
      }

      public void disconnect() {
          try {
              if (connected) {
                  dis.close();
                  connected = false;
              }
          } catch (Exception e) {
              ;
          }
      }

      public void paint(Graphics g) { // used to set the image on the panel
          if (image != null)
              g.drawImage(image, 0, 0, this);
      }

      public void readStream() { // the basic method to continuously read the stream
          try {
              if (useMJPGStream) {
                  while (true) {
                      readMJPGStream();
                      parent.repaint();
                  }
              } else {
                  while (true) {
                      connect();
                      readJPG();
                      parent.repaint();
                      disconnect();
                  }
              }
          } catch (Exception e) {
              ;
          }
      }

      public void readMJPGStream() { // preprocess the mjpg stream to remove the mjpg encapsulation
          readLine(3, dis); // discard the first 3 lines
          readJPG();
          readLine(2, dis); // discard the last two lines
      }

      public void readJPG() { // read the embedded jpeg image
          try {
              JPEGImageDecoder decoder = JPEGCodec.createJPEGDecoder(dis);
              image = decoder.decodeAsBufferedImage();
          } catch (Exception e) {
              e.printStackTrace();
              disconnect();
          }
      }

      public void readLine(int n, DataInputStream dis) { // used to strip out the header lines
          for (int i = 0; i < n; i++) {
              readLine(dis);
          }
      }

      public void readLine(DataInputStream dis) {
          try {
              boolean end = false;
              String lineEnd = "\n"; // assumes that the end of the line is marked with this
              byte[] lineEndBytes = lineEnd.getBytes();
              byte[] byteBuf = new byte[lineEndBytes.length];
              while (!end) {
                  dis.read(byteBuf, 0, lineEndBytes.length);
                  String t = new String(byteBuf);
                  System.out.print(t); // uncomment if you want to see what the lines actually look like
                  if (t.equals(lineEnd))
                      end = true;
              }
          } catch (Exception e) {
              e.printStackTrace();
          }
      }

      public void run() {
          System.out.println("in Run...................");
          connect();
          readStream();
      }

      @SuppressWarnings("deprecation")
      public static void main(String[] args) {
          JFrame jframe = new JFrame();
          jframe.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
          AxisCamera1 axPanel = new AxisCamera1(jframe);
          new Thread(axPanel).start();
          jframe.getContentPane().add(axPanel);
          jframe.pack();
          jframe.show();
      }
      }

    Any suggestions what I am doing wrong here?
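
    The "Not a JPEG file: starts with 0x0d 0x0a" message usually means a stray CR/LF (or an extra boundary/header line) is still sitting in the stream when the decoder starts, so the hard-coded readLine(3, dis) / readLine(2, dis) counts do not match what this particular camera sends. Below is a sketch of a more tolerant frame reader: it skips blank lines, reads the part headers up to the empty line, uses Content-Length to read exactly one frame, and decodes with the public javax.imageio API instead of the non-public com.sun JPEG codec. The header names and layout are assumptions about the camera's MJPEG output, so check them against the raw stream first.

      import java.awt.image.BufferedImage;
      import java.io.ByteArrayInputStream;
      import java.io.DataInputStream;
      import java.io.IOException;
      import javax.imageio.ImageIO;

      public class MjpegFrameReader {

          /**
           * Reads one JPEG frame from an MJPEG multipart stream: skips the part headers
           * (boundary line, Content-Type, Content-Length, blank line), then reads exactly
           * Content-Length bytes and decodes them with ImageIO.
           */
          static BufferedImage readFrame(DataInputStream in) throws IOException {
              int contentLength = -1;
              boolean seenHeader = false;
              while (true) {
                  String line = readAsciiLine(in);
                  if (line.isEmpty()) {
                      if (seenHeader) break;   // blank line after the headers: JPEG data follows
                      continue;                // stray CR/LF before the boundary: skip it
                  }
                  seenHeader = true;
                  if (line.toLowerCase().startsWith("content-length:")) {
                      contentLength = Integer.parseInt(line.substring(line.indexOf(':') + 1).trim());
                  }
              }
              if (contentLength <= 0) {
                  throw new IOException("No Content-Length header before JPEG data");
              }
              byte[] frame = new byte[contentLength];
              in.readFully(frame);             // read exactly one JPEG image
              return ImageIO.read(new ByteArrayInputStream(frame));
          }

          /** Reads a single CR/LF-terminated ASCII line, returned without the terminator. */
          static String readAsciiLine(DataInputStream in) throws IOException {
              StringBuilder sb = new StringBuilder();
              int b;
              while ((b = in.read()) != -1 && b != '\n') {
                  if (b != '\r') {
                      sb.append((char) b);
                  }
              }
              return sb.toString();
          }
      }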

    Read the article

  • Trying to build automatic audio-conferencing capability into a WebApp

    - by Keller
    Hey all, I'm working with a team of relatively novice programmers, and we are trying to create a site that will have audio-conferencing capabilities such that whenever someone visits the page, they will immediately have audio-conferencing capabilities with everyone else on the page (5 people max). Can anyone point us in a general direction? Should we be looking into building a custom app, leveraging audio conferencing software, or trying to mimic a webex program? Would Adobe Stratus be useful in getting this kind of functionality? Does anyone have any ideas about how we would design something like this on a macro level? Sorry for the noobish question, but any guidance would be deeply appreciated. Thanks, Keller

    Read the article

  • Recording Audio through RTMP/Rails

    - by Lowgain
    I am in the process of building a rails/flex application which requires audio to be recorded and then stored in our amazon s3 account. I have found no alternative to using some form of RTMP server for recording audio through flash, but our hosting environment will not allow us to install anything like FMS, Red5, etc. Is there any existing Ruby/Rails RTMP solution that will allow audio recording? If not, is it possible for Rails to at least intercept the RTMP stream and then I can hope to reference red5 or something for parsing the data (long shot, I know)? The other alternative I can think of is hosting a red5 server on another host and communicating with our rails app once the saving/uploading is done, which is not preferred. Am I going to have any luck here?

    Read the article

  • help me pick the right iPhone audio class - MPMoviePlayer vs AVAudioPlayer vs MPMusicPlayer

    - by huevos de oro
    Does anyone know of a good tutorial on the distinction between MPMoviePlayer, AVAudioPlayer and MPMusicPlayer? I want to play audio from an mp3 file available at an external URL, ideally in an iPod-like audio view. I toyed with MPMoviePlayer but it appears to be more suitable for video: when audio starts, a "movie playing" message appears, the controls disappear, and a white QuickTime splash page is shown. I would like the standard iPod audio controls to be displayed all the time, and to customize the image behind them.

    Read the article

  • XMLStreamReader and a real stream

    - by Yuri Ushakov
    Update: there is no ready-made XML parser in the Java community that can do both NIO and XML parsing. This is the closest I found, and it's incomplete: http://wiki.fasterxml.com/AaltoHome

    I have the following code:

      InputStream input = ...;
      XMLInputFactory xmlInputFactory = XMLInputFactory.newInstance();
      XMLStreamReader streamReader = xmlInputFactory.createXMLStreamReader(input, "UTF-8");

    The question is, why does the method #createXMLStreamReader() expect to have an entire XML document in the input stream? Why is it called a "stream reader" if it can't process a portion of XML data? For example, if I feed <root> <child> to it, it tells me I'm missing the closing tags, even before I begin iterating the stream reader itself. I suspect that I just don't know how to use an XMLStreamReader properly. I should be able to supply it with data in pieces, right? I need this because I'm processing an XML stream coming in from a network socket, and I don't want to load the whole source text into memory. Thank you for help, Yuri.
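
    For what it's worth, the standard StAX reader does pull bytes lazily: next() only blocks until enough input for the next event has arrived, so parsing an endless stream from a socket works as long as blocking reads are acceptable. The "missing closing tags" error appears when the underlying InputStream reports end-of-stream (for example, a ByteArrayInputStream over a fragment), not merely because the document is unfinished. A minimal sketch, with a placeholder host and port:

      import java.io.InputStream;
      import java.net.Socket;
      import javax.xml.stream.XMLInputFactory;
      import javax.xml.stream.XMLStreamConstants;
      import javax.xml.stream.XMLStreamReader;

      public class SocketXmlConsumer {
          public static void main(String[] args) throws Exception {
              try (Socket socket = new Socket("example.org", 5222)) {   // host/port are placeholders
                  InputStream in = socket.getInputStream();
                  XMLStreamReader reader = XMLInputFactory.newInstance().createXMLStreamReader(in, "UTF-8");

                  // next() blocks until enough bytes for the next event have arrived;
                  // nothing requires the whole document to be present up front.
                  while (reader.hasNext()) {
                      int event = reader.next();
                      if (event == XMLStreamConstants.START_ELEMENT) {
                          System.out.println("start: " + reader.getLocalName());
                      } else if (event == XMLStreamConstants.END_ELEMENT) {
                          System.out.println("end:   " + reader.getLocalName());
                      }
                  }
              }
          }
      }

    Truly non-blocking, feed-the-parser-yourself operation is exactly what the Aalto project linked above is for; plain javax.xml.stream has no API for handing it partial buffers.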

    Read the article

  • Pros and cons of MPMoviePlayerController versus launching UIWebView to stream movie

    - by Nosredna
    I have a client who has video content for the web in Flash format. My task is to help them show the videos in an iPhone app. I realize that step one is to get these videos into the appropriate Quicktime format for the iPhone. Then I'm going to have to help the client figure out how or where to host these files. If that's tricky I assume they can be hosted at YouTube. My chief concern, though, is which approach to take to stream the video. What are the pros and cons of MPMoviePlayerController versus launching UIWebView with the URL of the stream? Is there any difference? Is one of them more or less forgiving? Is one of them a better user experience? Any gotchas I might expect to run into? I'm assuming playing video is pretty easy on the iPhone. Is it reasonable to try both and have one available as a fallback, or would that be a waste of time? I'm trying to schedule this out a bit, so I'd love to hear real-world experiences from anyone who's done this.

    Read the article

  • Collecting high-volume video viewing data

    - by DanK
    I want to add tracking to our Flash-based media player so that we can provide analytics that show what sections of videos are being watched (at the moment, we just register a view when a video starts playing) For example, if a viewer watches the first 30 seconds of a video and then clicks away to something else, we want the data to reflect that. Likewise, if someone watches the first 10 seconds, then scrubs the timeline to the last minute of the video and watches that, we want to register viewing on the parts watched and not the middle section. My first thought was to collect up the viewing data in the player and send it all to the server at the end of a viewing session. Unfortunately, Flash does not seem to have an event that you can hook into when a viewer clicks away from the page the movie is on (probably a good thing - it would be open to abuse) So, it looks like we're going to have to make regular requests to the server as the video is playing. This is obviously going to lead to a high volume of requests when there are large numbers of simultaneous viewers. The simple approach of dumping all these 'heartbeat' events from clients to a database feels like it will quickly become unmanageable so I'm wondering whether I should be taking an approach where viewing sessions are cached in memory and flushed to database when they become inactive (based on a timeout). That way, the data could be stored as time spans rather than individual heartbeats. So, to the question - what is the best way to approach dealing with this kind of high-volume viewing data? Are there any good existing architectures/patterns? Thanks, Dan.
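
    A minimal in-process sketch of the "cache sessions in memory, flush inactive ones as time spans" idea follows. All class names, thresholds and the println standing in for the database write are assumptions, and a real deployment would need to shard this state (or move it to a shared cache) once it outgrows one server.

      import java.util.ArrayList;
      import java.util.List;
      import java.util.Map;
      import java.util.concurrent.ConcurrentHashMap;
      import java.util.concurrent.Executors;
      import java.util.concurrent.ScheduledExecutorService;
      import java.util.concurrent.TimeUnit;

      /** Collapses per-client heartbeats into watched time spans and flushes idle sessions. */
      public class ViewingAggregator {

          static final long IDLE_TIMEOUT_MS = 30_000;   // flush a session after 30s without a heartbeat
          static final long MAX_GAP_SEC = 15;           // heartbeats farther apart than this start a new span

          static class Span { long start, end; Span(long s, long e) { start = s; end = e; } }

          static class Session {
              final List<Span> spans = new ArrayList<>();
              long lastSeenMs = System.currentTimeMillis();
          }

          final Map<String, Session> sessions = new ConcurrentHashMap<>();
          final ScheduledExecutorService sweeper = Executors.newSingleThreadScheduledExecutor();

          ViewingAggregator() {
              sweeper.scheduleAtFixedRate(this::flushIdleSessions, 10, 10, TimeUnit.SECONDS);
          }

          /** Called for every heartbeat: (sessionId, current playhead position in seconds). */
          void heartbeat(String sessionId, long positionSec) {
              Session s = sessions.computeIfAbsent(sessionId, id -> new Session());
              synchronized (s) {
                  s.lastSeenMs = System.currentTimeMillis();
                  Span last = s.spans.isEmpty() ? null : s.spans.get(s.spans.size() - 1);
                  if (last != null && positionSec >= last.end && positionSec - last.end <= MAX_GAP_SEC) {
                      last.end = positionSec;                          // contiguous playback: extend the span
                  } else {
                      s.spans.add(new Span(positionSec, positionSec)); // seek or first beat: open a new span
                  }
              }
          }

          /** Writes finished sessions out (a real implementation would insert rows into the database). */
          void flushIdleSessions() {
              long now = System.currentTimeMillis();
              sessions.forEach((id, s) -> {
                  synchronized (s) {
                      if (now - s.lastSeenMs > IDLE_TIMEOUT_MS) {
                          sessions.remove(id);
                          for (Span span : s.spans) {
                              System.out.printf("session %s watched %d-%d sec%n", id, span.start, span.end);
                          }
                      }
                  }
              });
          }

          public static void main(String[] args) throws InterruptedException {
              ViewingAggregator agg = new ViewingAggregator();
              agg.heartbeat("abc", 0);
              agg.heartbeat("abc", 5);
              agg.heartbeat("abc", 10);
              agg.heartbeat("abc", 70);   // viewer scrubbed forward: starts a second span
              Thread.sleep(45_000);       // after the idle timeout the sweep prints both spans
              agg.sweeper.shutdown();
          }
      }

    Heartbeats that arrive close to the previous playhead position extend the current span, while a seek opens a new one, so the stored rows directly describe which parts of the video were watched rather than a pile of individual ticks.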

    Read the article

  • JWPlayer plays videos at 3x speed on the first run, then works fine in subsequent runs on the same page

    - by Josiah Kiehl
    Go here: http://nano.materials.drexel.edu/research/videolibrary For some bizarre reason, the videos will play at 3x speed on the first run through, but then will play at normal speed each subsequent play. This doesn't happen all the time, and it's not always the same video(s) that do it. I'm utterly baffled. I've reconverted the videos from m4p to flv (using BitComet's converter) several times, double checking the settings each time through with no change to the behavior. Anyone have a clue what's going on?

    Read the article

  • Android Stream Data Over Wifi?

    - by Neb
    I'm trying to make an Android app that will stream the accelerometer data, to be used as a game controller on my PC, over a local Wi-Fi connection. Is it possible to make some kind of Wi-Fi stream of the accelerometer values in the Android app and then make the PC somehow 'read' this stream? Or would it just be better for the PC to make endless calls to the phone, getting the newest accelerometer values from a local Android server? It would also have to send commands from the phone such as 'button1 pressed', 'button1 released'.
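
    Yes, this is doable; a common shape is for the phone to push readings as UDP datagrams and for the PC to sit in a receive loop, since a dropped sample is harmless for a game controller. Below is a hedged Android-side sketch; the PC address, port and the plain-text message format are assumptions, not anything prescribed by the platform.

      // Phone side (Android): stream accelerometer readings to the PC as UDP datagrams.
      import android.hardware.Sensor;
      import android.hardware.SensorEvent;
      import android.hardware.SensorEventListener;
      import android.hardware.SensorManager;

      import java.net.DatagramPacket;
      import java.net.DatagramSocket;
      import java.net.InetAddress;
      import java.nio.charset.StandardCharsets;
      import java.util.concurrent.BlockingQueue;
      import java.util.concurrent.LinkedBlockingQueue;

      public class AccelerometerStreamer implements SensorEventListener {

          private final BlockingQueue<String> outbox = new LinkedBlockingQueue<>();

          public AccelerometerStreamer(SensorManager sensorManager) {
              Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
              sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);

              // Network I/O must stay off the UI thread, so a single sender thread drains the queue.
              new Thread(() -> {
                  try (DatagramSocket socket = new DatagramSocket()) {
                      InetAddress pc = InetAddress.getByName("192.168.1.10");  // the PC's LAN address
                      while (true) {
                          byte[] payload = outbox.take().getBytes(StandardCharsets.UTF_8);
                          socket.send(new DatagramPacket(payload, payload.length, pc, 9876));
                      }
                  } catch (Exception e) {
                      e.printStackTrace();
                  }
              }, "udp-sender").start();
          }

          @Override
          public void onSensorChanged(SensorEvent event) {
              // x,y,z in m/s^2; occasional dropped packets are acceptable for a game controller.
              outbox.offer(event.values[0] + "," + event.values[1] + "," + event.values[2]);
          }

          /** Button presses ride on the same channel as plain-text commands. */
          public void sendCommand(String command) {
              outbox.offer(command);   // e.g. "button1 pressed"
          }

          @Override
          public void onAccuracyChanged(Sensor sensor, int accuracy) { }
      }

    On the PC side, a DatagramSocket bound to the same port and a blocking receive() loop is enough to read the values; polling the phone from the PC instead would add a round trip per sample and usually isn't worth it.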

    Read the article

  • Stream writing lags my GUI

    - by blez
    I have a thread that dequeues data from a queue and writes it to another application's STDIN. I'm using Stream, but with .Write and even .BeginWrite, when I send 1 MB chunks to the second app, my GUI gets laggy for about a second. Why?
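
    The question is about .NET, but the usual cure is the same in any language: keep every blocking pipe write on a dedicated worker thread and let the GUI thread do nothing more than hand over a reference. A Java sketch of that shape, with the chunk size and queue bound picked arbitrarily:

      import java.io.OutputStream;
      import java.util.concurrent.ArrayBlockingQueue;
      import java.util.concurrent.BlockingQueue;

      /** Keeps blocking pipe writes off the GUI thread by funnelling them through one writer thread. */
      public class PipeWriter {

          private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(64);

          public PipeWriter(OutputStream childStdin) {
              Thread writer = new Thread(() -> {
                  try {
                      while (true) {
                          byte[] buf = queue.take();   // blocks here, not on the GUI thread
                          // Write in smaller slices so the pipe buffer never absorbs one huge burst.
                          for (int off = 0; off < buf.length; off += 64 * 1024) {
                              int len = Math.min(64 * 1024, buf.length - off);
                              childStdin.write(buf, off, len);
                          }
                          childStdin.flush();
                      }
                  } catch (Exception e) {
                      e.printStackTrace();
                  }
              }, "stdin-writer");
              writer.setDaemon(true);
              writer.start();
          }

          /** Called from the GUI thread; only blocks if the queue of pending chunks is full. */
          public void enqueue(byte[] chunk) throws InterruptedException {
              queue.put(chunk);
          }
      }

    If the GUI still stutters with this arrangement, the enqueue path itself is suspect, e.g. copying the 1 MB buffer on the UI thread or running write-completed callbacks there.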

    Read the article

  • Loading a page into memory in Rails

    - by titaniumdecoy
    My rails app produces XML when I load /reports/generate_report.xml. On a separate page, I want to read this XML into a variable and save it to the database. How can I do this? Can I somehow stream the response from the /reports/generate_report.xml URI into a variable? Or is there a better way to do it since the XML is produced by the same web app?

    Read the article

  • is it possible to display video information from an rtsp stream in an android app UI

    - by Joseph Cheung
    I have managed to get a working video player that can stream RTSP links; however, I'm not sure how to display the video's current time position in the UI. I have used the getDuration and getCurrentPosition calls, stored this information in a string and tried to display it in the UI, but it doesn't seem to work.

    In main.xml:

      <TextView android:id="@+id/player"
          android:layout_width="wrap_content"
          android:layout_height="wrap_content"
          android:layout_margin="1px"
          android:text="@string/cpos" />

    In strings.xml:

      <string name="cpos">""</string>

    In Player.java:

      private void playVideo(String url) {
          try {
              media.setEnabled(false);
              if (player == null) {
                  player = new MediaPlayer();
                  player.setScreenOnWhilePlaying(true);
              } else {
                  player.stop();
                  player.reset();
              }
              player.setDataSource(url);
              player.getCurrentPosition();
              player.setDisplay(holder);
              player.setAudioStreamType(AudioManager.STREAM_MUSIC);
              player.setOnPreparedListener(this);
              player.prepareAsync();
              player.setOnBufferingUpdateListener(this);
              player.setOnCompletionListener(this);
          } catch (Throwable t) {
              Log.e(TAG, "Exception in media prep", t);
              goBlooey(t);
              try {
                  try {
                      player.prepare();
                  } catch (IOException e) {
                      // TODO Auto-generated catch block
                      e.printStackTrace();
                  }
                  Log.v(TAG, "Duration: === " + player.getDuration());
              } catch (IllegalStateException e) {
                  // TODO Auto-generated catch block
                  e.printStackTrace();
              }
          }
      }

      private Runnable onEverySecond = new Runnable() {
          public void run() {
              if (lastActionTime > 0 && SystemClock.elapsedRealtime() - lastActionTime > 3000) {
                  clearPanels(false);
              }
              if (player != null) {
                  timeline.setProgress(player.getCurrentPosition());
                  // stores getCurrentPosition as a string
                  cpos = String.valueOf(player.getCurrentPosition());
                  System.out.print(cpos);
              }
              if (player != null) {
                  timeline.setProgress(player.getDuration());
                  // stores getDuration as a string
                  cdur = String.valueOf(player.getDuration());
                  System.out.print(cdur);
              }
              if (!isPaused) {
                  surface.postDelayed(onEverySecond, 1000);
              }
          }
      };
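
    One likely issue: @string/cpos is a compile-time resource, so assigning new text to the cpos variable never reaches the screen. The usual fix is to grab the TextView once (findViewById(R.id.player)) and call setText() from a recurring Runnable on the UI thread. Below is a small sketch of that; the id mirrors the question but the helper class itself is hypothetical.

      import android.media.MediaPlayer;
      import android.view.View;
      import android.widget.TextView;

      /** Posts a once-a-second UI update showing the player's position and duration. */
      public class PositionTicker {

          private final MediaPlayer player;
          private final TextView positionText;   // e.g. (TextView) findViewById(R.id.player)
          private final View anchor;             // any attached view, used for postDelayed()

          public PositionTicker(MediaPlayer player, TextView positionText, View anchor) {
              this.player = player;
              this.positionText = positionText;
              this.anchor = anchor;
          }

          private final Runnable tick = new Runnable() {
              public void run() {
                  int posSec = player.getCurrentPosition() / 1000;
                  int durSec = player.getDuration() / 1000;
                  // setText must run on the UI thread; postDelayed() on an attached view guarantees that.
                  positionText.setText(String.format("%d:%02d / %d:%02d",
                          posSec / 60, posSec % 60, durSec / 60, durSec % 60));
                  anchor.postDelayed(this, 1000);
              }
          };

          /** Call once the MediaPlayer is prepared (e.g. from onPrepared()). */
          public void start() {
              anchor.post(tick);
          }
      }

    getCurrentPosition() and getDuration() return milliseconds, hence the division by 1000 before formatting.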

    Read the article

  • getting started with libmms

    - by Vnuce
    Actually, the title explains it all... I want to read a stream with libmms, but have no idea where to start. I've searched the web for documentation, tutorials, whatever, with no luck. Any help using this lib would be much appreciated.

    Read the article

  • How do you handle large repeated UI elements with JQuery

    - by jpoz
    Howdy, here's the situation: you have a very complex UI element that is repeated in a list. Each has a menu on it, buttons, it hides and shows subelements, buttons for switching its state, etc. The elements are populated via JSON, so you have to construct the elements and their functionality on the fly. What's the best way to accomplish this with jQuery? Where would you save the reusable template for the DOM structure? How would you add the behavior: $().live? .livequery? onclick? manually after every JSON get? I guess I just see a lot of people doing different things. What's your experience with performance? Any insight would be much appreciated. Thanks, JPoz

    Read the article
