Search Results

Search found 12015 results on 481 pages for 'video capture'.

Page 32/481 | < Previous Page | 28 29 30 31 32 33 34 35 36 37 38 39  | Next Page >

  • Cutting up videos (excerpting) on Mac OS X -- iMovie produces super-large files

    - by markvgti
    I need to cut out parts of a video (plus the associated audio, of course) to make a short clip. For example: take 2 minutes from one location, 3 minutes from another part of the video, 30 seconds from another location, and join it all together to form one single clip. The format of the input video is mp4 (H.264 encoding, AFAICR). I don't need very sophisticated merges or transitions from one part to the next, or sophisticated on-screen banners (text), but some ability to do so would be a plus. I've done this with iMovie in the past, but where the original file was under 5MB/min of play time, the chopped-up version was over 11MB/min of play time, which to me seems really bad. Is there a better/different way of doing this on OS X? Looking for free (gratis) solutions. OS: OS X 10.9.3
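
    A lossless route worth sketching here: ffmpeg (gratis, installable via Homebrew or MacPorts) can cut and join H.264 mp4 files with stream copy, i.e. without re-encoding, so the excerpts keep the original ~5MB/min rate. A minimal sketch; timestamps and file names are placeholders, and stream-copy cuts snap to the nearest keyframe, so they can be off by a second or so:

    # extract the pieces without re-encoding
    ffmpeg -ss 00:02:00 -t 00:02:00 -i input.mp4 -c copy part1.mp4
    ffmpeg -ss 00:10:30 -t 00:00:30 -i input.mp4 -c copy part2.mp4
    # list the pieces, then join them, again without re-encoding
    printf "file 'part1.mp4'\nfile 'part2.mp4'\n" > list.txt
    ffmpeg -f concat -i list.txt -c copy joined.mp4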

    Read the article

  • Resizing video best practices (frame size)

    - by undefined
    I have read the following, which is from Best Practices for Encoding Video with the VP6 Codec on the Adobe website (http://www.adobe.com/devnet/flash/articles/encoding_video_print.html), talking about common video ratios (320x240, 640x480): "Although these ratios are standard, and should be used to avoid distorting the video, the size of the encoded video is not set in stone. The original web video sizes used heights and widths that were evenly divisible by 16. This was mandatory for many early codecs. Although this is not necessary for modern codecs, you should stick to even heights and widths." What do they mean by 'even heights and widths'? I am thinking about encoding my video at 400x300 to make it slightly bigger; this is still 4:3 format, but should I just stick at 320x240 and resize it on the screen? Clearly there are benefits to the smaller size in terms of storage and delivery costs. In some places on my site I want to show the video at 400x300, but in others I want it to play full screen, which is why I am wondering if a larger original size (400x300) will give better results when blown up. Any thoughts?

    Read the article

  • DirectShow Filter I wrote dies after 10-24 seconds in Skype video call

    - by Robert Oschler
    I've written a DirectShow push filter for use with Skype, using Delphi Pro 6 and the DSPACK DirectShow library. In preview mode, when you test a video input device in the Skype client's Video Settings window, my filter works flawlessly: I can leave it up and running for many minutes without an error. However, when I start a video call, after 10 to 24 seconds (never longer) the video feed freezes. The call continues fine, with the call duration counter ticking away the seconds, but the video feed is dead, stuck on whatever frame was showing when the freeze happened (although after a long while it turns black, which I believe means Skype has given up on the filter). I tried attaching to the process from my debugger with a breakpoint literally set on every method call, and none of them are hit once the freeze takes place. It's as if the thread that makes the DirectShow FillBuffer() call to my filter on behalf of Skype is dead or has been shut down. I can't trace my filter in the debugger because I get weird int 1 and int 3 debugger hard interrupts while a Skype video call is in progress. This behavior happens even with my standard web cam input device selected and my DirectShow filter completely unregistered as an ActiveX server. I suspect it might be some "anti-debugging" code, since it doesn't happen in video input preview mode. Either way, that is why I had to attach to the process after the fact, to see if my FillBuffer() call was still being made, and instead discovered that it appears to be dead. Note: my plain vanilla USB web cam's DirectShow filter does not exhibit the freezing behavior and works fine for many minutes. There's something about my filter that Skype doesn't like. I've tried Sleep() statements of varying intervals, no Sleep() statements, and doing virtually nothing in the FillBuffer() call. Nothing helps. If anyone has any ideas on what might be the culprit here, I'd like to know. Thanks, Robert

    Read the article

  • HTML5 Video Element on iPad doesn't fire onclick?

    - by bhups
    I am using the video element in my HTML as follows: <div id="container" style="background:black; overflow:hidden; width:320px; height:240px"> <video style="background:black; display:block" id="vdo" height="240px" width="320px" src="http://mydomain/vid.mp4"></video> </div> And in JavaScript I am doing this: var video=document.getElementById('vdo'); var container=document.getElementById('container'); video.addEventListener('click', function(e) { e.preventDefault(); console.log("clicked"); }, false); container.addEventListener('click', function(e) { e.preventDefault(); console.log("clicked"); }, false); On desktop Safari/Chrome everything is working fine: I can see two "clicked" messages in the console. But on iPad there is nothing. First I tried with iOS version 3.2, then I updated it to the latest one, 4.2.1, without any success. I found a similar question, "HTML5 Video Element on iPad doesn't fire onclick or touchstart events?", which suggests not using controls in the video tag, and I am not using them.
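
    A workaround that is often suggested for this, sketched below on the assumption that iOS routes touches on the native video element to its own playback controls: position a transparent div over the video and listen on that instead.

    var container = document.getElementById('container');
    container.style.position = 'relative'; // so the overlay is positioned against the container

    var overlay = document.createElement('div');
    // transparent layer covering the 320x240 video area
    overlay.style.cssText = 'position:absolute; top:0; left:0; width:320px; height:240px; z-index:10;';
    container.appendChild(overlay);

    overlay.addEventListener('touchstart', function (e) {
      e.preventDefault();
      console.log('clicked');
    }, false);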

    Read the article

  • SharpPcap - Problem getting a captured packet's message content

    - by Eyla
    I'm trying to capture packets using the SharpPcap library. I'm able to return the packet details, but I'm having a problem getting the message content inside the packet. The packet uses .Data to return the message, and when I use it, it returns "System.Byte[]". Here is the library website: http://www.codeproject.com/KB/IP/sharppcap.aspx Here is my code: string packetData; private void packetCapturingThreadMethod() { Packet packet = null; int countOfPacketCaptures = 0; while ((packet = device.GetNextPacket()) != null) { packet = device.GetNextPacket(); if (packet is TCPPacket) { TCPPacket tcp = (TCPPacket)packet; myPacket tempPacket = new myPacket(); tempPacket.packetType = "TCP"; tempPacket.sourceAddress = Convert.ToString(tcp.SourceAddress); tempPacket.destinationAddress = Convert.ToString(tcp.DestinationAddress); tempPacket.sourcePort = Convert.ToString(tcp.SourcePort); tempPacket.destinationPort = Convert.ToString(tcp.DestinationPort); tempPacket.packetMessage = Convert.ToString(tcp.Data); packetsList.Add(tempPacket); packetData = "Type= TCP" + " Source Address = "+ Convert.ToString(tcp.SourceAddress)+ " Destination Address =" +Convert.ToString(tcp.DestinationAddress)+ " SourcePort =" + Convert.ToString(tcp.SourcePort)+ " SourcePort =" +Convert.ToString(tcp.DestinationPort)+ " Messeage =" + Convert.ToString(tcp.Data); txtpackets.Invoke(new UpdatetxtpacketsCallback(this.Updatetxtpackets), new object[] { packetData }); string[] row = { packetsList[countOfPacketCaptures].packetType, packetsList[countOfPacketCaptures].sourceAddress, packetsList[countOfPacketCaptures].destinationAddress, packetsList[countOfPacketCaptures].sourcePort, packetsList[countOfPacketCaptures].destinationPort, packetsList[countOfPacketCaptures].packetMessage }; try { //dgwPacketInfo.Rows.Add(row); countOfPacketCaptures++; //lblCapturesLabels.Text = Convert.ToString(countOfPacketCaptures); } catch (Exception e) { } } else if (packet is UDPPacket) { UDPPacket udp = (UDPPacket)packet; myPacket tempPacket = new myPacket(); tempPacket.packetType = "UDP"; tempPacket.sourceAddress = Convert.ToString(udp.SourceAddress); tempPacket.destinationAddress = Convert.ToString(udp.DestinationAddress); tempPacket.sourcePort = Convert.ToString(udp.SourcePort); tempPacket.destinationPort = Convert.ToString(udp.DestinationPort); tempPacket.packetMessage = udp.Data.ToArray() + "\n"; packetsList.Add(tempPacket); packetData = "Type= UDP" + " Source Address = "+ Convert.ToString(udp.SourceAddress)+ " Destination Address =" +Convert.ToString(udp.DestinationAddress)+ " SourcePort =" + Convert.ToString(udp.SourcePort)+ " SourcePort =" +Convert.ToString(udp.DestinationPort)+ " Messeage =" + udp.Data.ToArray() + "\n"; string[] row = { packetsList[countOfPacketCaptures].packetType, packetsList[countOfPacketCaptures].sourceAddress, packetsList[countOfPacketCaptures].destinationAddress, packetsList[countOfPacketCaptures].sourcePort, packetsList[countOfPacketCaptures].destinationPort, packetsList[countOfPacketCaptures].packetMessage }; try { //dgwPacketInfo.Rows.Add(row); //countOfPacketCaptures++; //lblCapturesLabels.Text = Convert.ToString(countOfPacketCaptures); txtpackets.Invoke(new UpdatetxtpacketsCallback(this.Updatetxtpackets), new object[] { packetData }); } catch (Exception e) { } } } }

    Read the article

  • What is the best integrated video conferencing solution for us? [closed]

    - by Andrei B
    We are trying to integrate a simple (open source) video conferencing solution into our existing application, which is written in C++ and runs on Linux. I am currently looking at using Ekiga (formerly known as GnomeMeeting) or Homer Conferencing (short: Homer). My plan is to "integrate" an existing video conferencing client into our existing software. Please give me recommendations on which 3rd-party application or library to use to add a video conferencing feature to our application. PS: Please don't close this question. I asked it on StackOverflow and it got closed, so where am I supposed to ask this question? If not here, then what's the point of asking lol.

    Read the article

  • What is the best video editor on Ubuntu for editing multi-format/multi-encoding clips?

    - by amjad
    I have many video clips in different formats, and I want to create a short movie by joining them. Most of the clips were taken with a Nokia N8 or a Lumia 800. I am using Ubuntu; I have tried many editors, but I cannot edit the clips because of the differing video formats. Which open source editor should I use to achieve the following tasks: join clips in different formats/encodings to produce one single movie; export to different formats (.avi, YouTube, etc.); add text to the clips and insert images. I don't want to install and try many of them.

    Read the article

  • How to make audio and video streaming servers work?

    - by Santosh Linkha
    I am a PHP/MySQL developer and I am interested in the way television and radio are broadcast live over the Internet. I want to know how it works and what its requirements are (which package in which programming language offers the best support). And please clarify this for me: websites are stored on servers. From my desktop, if I want to broadcast some video, then I need to connect to the web server (to upstream the video). Is there an application to do that (or do I have to code it, or embed it in my web application, and which programming language would be suitable; does Python support that)? And I also need a script to handle the upstreamed video or audio (can I do that with PHP)?

    Read the article

  • Ubuntu raid 1 write errors

    - by Micah
    I have an Ubuntu server set up with two SATA drives in a RAID 1 configuration with MDADM. The machine is used to record raw video, which involves a lot of writing to the disk. Sometimes during video recording the computer will crash, with the following errors in kern.log: Mar 15 10:39:41 video kernel: [414501.629864] ata2.00: exception Emask 0x10 SAct 0x0 SErr 0x400100 action 0x6 Mar 15 10:39:41 video kernel: [414501.629870] ata2.00: BMDMA stat 0x26 Mar 15 10:39:41 video kernel: [414501.629875] ata2.00: SError: { UnrecovData Handshk } Mar 15 10:39:41 video kernel: [414501.629880] ata2.00: failed command: WRITE DMA EXT Mar 15 10:39:41 video kernel: [414501.629889] ata2.00: cmd 35/00:00:28:6d:f6/00:04:06:00:00/e0 tag 0 dma 524288 out Mar 15 10:39:41 video kernel: [414501.629891] res 51/84:b1:77:6e:f6/84:02:06:00:00/e0 Emask 0x30 (host bus error) Mar 15 10:39:41 video kernel: [414501.629896] ata2.00: status: { DRDY ERR } Mar 15 10:39:41 video kernel: [414501.629899] ata2.00: error: { ICRC ABRT } Mar 15 10:39:41 video kernel: [414501.629910] ata2.00: hard resetting link Mar 15 10:39:41 video kernel: [414501.973009] ata2.01: hard resetting link Mar 15 10:39:41 video kernel: [414502.482642] ata2.00: SATA link up 3.0 Gbps (SStatus 123 SControl 300) Mar 15 10:39:41 video kernel: [414502.482658] ata2.01: SATA link down (SStatus 0 SControl 300) Mar 15 10:39:41 video kernel: [414502.546160] ata2.00: configured for UDMA/133 Mar 15 10:39:41 video kernel: [414502.546203] ata2: EH complete Is this the result of faulty drives? Is software RAID just not performant enough for data rates ~15 MB/s, even with a quad-core i7? Thanks for your help. Edit: cat /proc/mdstat returns this: Personalities : [linear] [multipath] [raid0] [raid1] [raid6] [raid5] [raid4] [raid10] md0 : active raid1 sdb1[1] sda1[0] 976760768 blocks [2/2] [UU] unused devices: <none>

    Read the article

  • jQuery plugin for HTML5 video instream ads?

    - by dar
    Are there any jQuery plugins for HTML5 video to insert instream ads into the video? Most of the Flash ones work by pausing the video at a particular timestamp, playing an ad stream, then resuming the original video stream. The jQuery captions plugin for HTML5 seems very similar, but different.
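
    Whether or not a ready-made plugin exists, the pause-at-timestamp pattern the Flash players use carries over to HTML5 directly. A rough sketch, where the two video elements and the cue points are stand-ins:

    var content = document.getElementById('content'); // main <video>
    var ad = document.getElementById('ad');           // hidden <video> holding the ad stream
    var cues = [30, 120];                             // assumed ad slots, in seconds
    var next = 0;

    content.addEventListener('timeupdate', function () {
      if (next < cues.length && content.currentTime >= cues[next]) {
        next++;               // arm the next cue point
        content.pause();      // suspend the main stream
        ad.style.display = 'block';
        ad.play();
      }
    }, false);

    ad.addEventListener('ended', function () {
      ad.style.display = 'none';
      content.play();         // resume the original stream where it left off
    }, false);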

    Read the article

  • Streaming and conversion of video from a 3rd party

    - by Ashish
    Hi, I am working on an app where video has to be displayed. All these videos are in .flv format. Is there any mechanism by which I can convert these videos to .mov or .m4v (supported by iPhone) on the fly, so that users can view them on their iPhone or iPod? Thanks, Ashish
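
    Transcoding on the device itself isn't realistic, so "on the fly" usually means converting server-side before (or while) serving the file. A hedged one-liner with ffmpeg, assuming a reasonably recent build with libx264 and AAC support; file names are placeholders:

    # -movflags +faststart moves the index to the front so iPhones can start playing before the download completes
    ffmpeg -i input.flv -c:v libx264 -c:a aac -movflags +faststart output.m4v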

    Read the article

  • jQuery, checking to see if video has height/width

    - by Mark
    I have a <video> element that is generated by JS, and I need to get its height and width. var v = $('video'); v.height() returns null, because when it runs the video hasn't been loaded yet, so there are no dimensions in the DOM. How do I check whether the video has received its dimensions, and if it hasn't, wait until it has before getting the height and width? Thanks.
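
    One way to handle this is to wait for the loadedmetadata event, which fires as soon as the browser knows the intrinsic size. A sketch against the DOM element behind the jQuery wrapper:

    var el = $('video').get(0);

    function useDimensions() {
      var w = el.videoWidth, h = el.videoHeight; // intrinsic size, available once metadata is parsed
      // ... lay out the player with w and h ...
    }

    if (el.readyState >= 1) {   // HAVE_METADATA: dimensions are already known
      useDimensions();
    } else {
      el.addEventListener('loadedmetadata', useDimensions, false);
    }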

    Read the article

  • Streaming video using non standard protocols

    - by Dan
    Hi - My company makes DVRs that specialize in streaming live and recorded video. The video is encoded using standard MPEG-4 codecs, so the codecs in Android should have no trouble with it. However, the video is recorded using non-standard file formats and is streamed using our proprietary protocol (among other things, we embed additional data such as watermarks with each frame of video). Is there any way I can take my stream, pull out the frames, and have them play on an Android device? Thanks!

    Read the article

  • How to play video with external audio in Silverlight?

    - by Fury
    Hi all, is there any proper method to play video and external audio synchronously, other than simply having two MediaElements (one for the video source and one for the audio) started simultaneously? I need to play video with different soundtracks, but I believe two separate MediaElements will drift out of sync at some point. Maybe there is some way to add an audio source to the existing MediaElement with video? Platform: SL3, but SL4 will be good as well. Thanks in advance.

    Read the article

  • Webcam capture with C# and conversion to AVI

    - by Spidfire
    I'm trying to make a program that captures video from the webcam and sound from the microphone, but I'm getting stuck at the part where I try to make a movie out of still images. I've heard you need to use DirectShow, but it doesn't work for me yet. Does someone know a good piece of example code that captures video and sound and can encode it to a file (DivX or something like that)? Or some suggestions on where to look so I can build it myself? (If another programming language is better for this, I'm happy to know it early.)

    Read the article

  • Video player for HTML5 page not loading

    - by philippe
    I'm using VideoJS as my video player for a project I've been working on. Basically I have a div, and I wanted to have the video player within that div; however, when I load the page nothing happens, and the video is never played. In fact, the video is never loaded nor shown in the page. I basically copied the example from VideoJS's page. Any thoughts? <div class="video-js-box"> <!-- Using the Video for Everybody Embed Code http://camendesign.com/code/video_for_everybody --> <div style="position: absolute; top: 50px; left: 600px; display:none"> <video id="example_video_1" class="video-js vjs-default-skin" controls preload="auto" width="640" height="264" poster="http://video-js.zencoder.com/oceans-clip.png" data-setup='{"example_option":true}'> <source src="http://video-js.zencoder.com/oceans-clip.mp4" type='video/mp4'></source> <source src="http://video-js.zencoder.com/oceans-clip.webm" type='video/webm'></source> <source src="http://video-js.zencoder.com/oceans-clip.ogv" type='video/ogg'></source> </video> <!-- Download links provided for devices that can't play video in the browser. --> <p class="vjs-no-video"><strong>Download Video:</strong> <a href="http://video-js.zencoder.com/oceans-clip.mp4">MP4</a>, <a href="http://video-js.zencoder.com/oceans-clip.webm">WebM</a>, <a href="http://video-js.zencoder.com/oceans-clip.ogv">Ogg</a><br> <!-- Support VideoJS by keeping this link. --> <a href="http://videojs.com">HTML5 Video Player</a> by VideoJS </p> </div> <div style="clear:both;"></div> </div><!--main-->
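
    One likely culprit in the markup above, offered as a guess from the pasted code alone: the inner div wrapping the video carries display:none inline, so the player can never become visible no matter what VideoJS does. Revealing it first is a cheap thing to try:

    // show the inline-styled wrapper that currently has display:none
    document.querySelector('.video-js-box > div').style.display = 'block';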

    Read the article

  • Detect End of Video Playback in Web Page

    - by Eric J.
    Is there a widely supported video playback technology for web pages that provides an event/hook that can be captured from Javascript when playback reaches the end of the stream? My goal is to provide a web page that plays a video and then asks the user a question about the video once playback is complete. The question would be hidden or disabled until they have actually viewed the video.
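
    The HTML5 video element supports exactly this hook: it fires an ended event when playback reaches the end of the stream. A minimal sketch (element ids are placeholders):

    var video = document.getElementById('lesson-video');
    var question = document.getElementById('question');

    video.addEventListener('ended', function () {
      question.style.display = 'block'; // reveal the question only after full playback
    }, false);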

    Read the article

  • How to capture live camera frames in RGB with DirectShow

    - by Jonny Boy
    I'm implementing live video capture through DirectShow for live processing and display (an Augmented Reality app). I can access the pixels easily enough, but it seems I can't get the SampleGrabber to provide RGB data. The device (an iSight -- running VC++ Express in VMWare) only reports MEDIASUBTYPE_YUY2. After extensive Googling, I still can't figure out whether DirectShow is supposed to provide built-in color space conversion for this sort of thing. Some sites report that there is no YUV->RGB conversion built in, others report that you just have to call SetMediaType on your ISampleGrabber with an RGB subtype. Any advice is greatly appreciated; I'm going nuts on this one. Code provided below. Please note that the code works, except that it doesn't provide RGB data. I'm aware that I can implement my own conversion filter, but this is not feasible because I'd have to anticipate every possible device format, and this is a relatively small project. // Playback IGraphBuilder *pGraphBuilder = NULL; ICaptureGraphBuilder2 *pCaptureGraphBuilder2 = NULL; IMediaControl *pMediaControl = NULL; IBaseFilter *pDeviceFilter = NULL; IAMStreamConfig *pStreamConfig = NULL; BYTE *videoCaps = NULL; AM_MEDIA_TYPE **mediaTypeArray = NULL; // Device selection ICreateDevEnum *pCreateDevEnum = NULL; IEnumMoniker *pEnumMoniker = NULL; IMoniker *pMoniker = NULL; ULONG nFetched = 0; HRESULT hr = CoInitializeEx(NULL, COINIT_MULTITHREADED); // Create CreateDevEnum to list device hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER, IID_ICreateDevEnum, (PVOID *)&pCreateDevEnum); if (FAILED(hr)) goto ReleaseDataAndFail; // Create EnumMoniker to list devices hr = pCreateDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory, &pEnumMoniker, 0); if (FAILED(hr)) goto ReleaseDataAndFail; pEnumMoniker->Reset(); // Find desired device while (pEnumMoniker->Next(1, &pMoniker, &nFetched) == S_OK) { IPropertyBag *pPropertyBag; TCHAR devname[256]; // bind to IPropertyBag hr = pMoniker->BindToStorage(0, 0, IID_IPropertyBag, (void **)&pPropertyBag); if (FAILED(hr)) { pMoniker->Release(); continue; } VARIANT varName; VariantInit(&varName); HRESULT hr = pPropertyBag->Read(L"DevicePath", &varName, 0); if (FAILED(hr)) { pMoniker->Release(); pPropertyBag->Release(); continue; } char devicePath[DeviceInfo::STRING_LENGTH_MAX] = ""; wcstombs(devicePath, varName.bstrVal, DeviceInfo::STRING_LENGTH_MAX); if (strcmp(devicePath, deviceId) == 0) { // Bind Moniker to Filter pMoniker->BindToObject(0, 0, IID_IBaseFilter, (void**)&pDeviceFilter); break; } pMoniker->Release(); pPropertyBag->Release(); } if (pDeviceFilter == NULL) goto ReleaseDataAndFail; // Create sample grabber IBaseFilter *pGrabberF = NULL; hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC_SERVER, IID_IBaseFilter, (void**)&pGrabberF); if (FAILED(hr)) goto ReleaseDataAndFail; hr = pGrabberF->QueryInterface(IID_ISampleGrabber, (void**)&pGrabber); if (FAILED(hr)) goto ReleaseDataAndFail; // Create FilterGraph hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC, IID_IGraphBuilder, (LPVOID *)&pGraphBuilder); if (FAILED(hr)) goto ReleaseDataAndFail; // create CaptureGraphBuilder2 hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL, CLSCTX_INPROC, IID_ICaptureGraphBuilder2, (LPVOID *)&pCaptureGraphBuilder2); if (FAILED(hr)) goto ReleaseDataAndFail; // set FilterGraph hr = pCaptureGraphBuilder2->SetFiltergraph(pGraphBuilder); if (FAILED(hr)) goto ReleaseDataAndFail; // get MediaControl interface hr = pGraphBuilder->QueryInterface(IID_IMediaControl, (LPVOID *)&pMediaControl); if (FAILED(hr)) goto ReleaseDataAndFail; // Add filters hr = pGraphBuilder->AddFilter(pDeviceFilter, L"Device Filter"); if (FAILED(hr)) goto ReleaseDataAndFail; hr = pGraphBuilder->AddFilter(pGrabberF, L"Sample Grabber"); if (FAILED(hr)) goto ReleaseDataAndFail; // Set sample grabber options AM_MEDIA_TYPE mt; ZeroMemory(&mt, sizeof(AM_MEDIA_TYPE)); mt.majortype = MEDIATYPE_Video; mt.subtype = MEDIASUBTYPE_RGB32; hr = pGrabber->SetMediaType(&mt); if (FAILED(hr)) goto ReleaseDataAndFail; hr = pGrabber->SetOneShot(FALSE); if (FAILED(hr)) goto ReleaseDataAndFail; hr = pGrabber->SetBufferSamples(TRUE); if (FAILED(hr)) goto ReleaseDataAndFail; // Get stream config interface hr = pCaptureGraphBuilder2->FindInterface(NULL, &MEDIATYPE_Video, pDeviceFilter, IID_IAMStreamConfig, (void **)&pStreamConfig); if (FAILED(hr)) goto ReleaseDataAndFail; int streamCapsCount = 0, capsSize, bestFit = -1, bestFitPixelDiff = 1000000000, desiredPixelCount = _width * _height, bestFitWidth = 0, bestFitHeight = 0; float desiredAspectRatio = (float)_width / (float)_height; hr = pStreamConfig->GetNumberOfCapabilities(&streamCapsCount, &capsSize); if (FAILED(hr)) goto ReleaseDataAndFail; videoCaps = (BYTE *)malloc(capsSize * streamCapsCount); mediaTypeArray = (AM_MEDIA_TYPE **)malloc(sizeof(AM_MEDIA_TYPE *) * streamCapsCount); for (int i = 0; i < streamCapsCount; i++) { hr = pStreamConfig->GetStreamCaps(i, &mediaTypeArray[i], videoCaps + capsSize * i); if (FAILED(hr)) continue; VIDEO_STREAM_CONFIG_CAPS *currentVideoCaps = (VIDEO_STREAM_CONFIG_CAPS *)(videoCaps + capsSize * i); int closestWidth = MAX(currentVideoCaps->MinOutputSize.cx, MIN(currentVideoCaps->MaxOutputSize.cx, width)); int closestHeight = MAX(currentVideoCaps->MinOutputSize.cy, MIN(currentVideoCaps->MaxOutputSize.cy, height)); int pixelDiff = ABS(desiredPixelCount - closestWidth * closestHeight); if (pixelDiff < bestFitPixelDiff && ABS(desiredAspectRatio - (float)closestWidth / (float)closestHeight) < 0.1f) { bestFit = i; bestFitPixelDiff = pixelDiff; bestFitWidth = closestWidth; bestFitHeight = closestHeight; } } if (bestFit == -1) goto ReleaseDataAndFail; AM_MEDIA_TYPE *mediaType; hr = pStreamConfig->GetFormat(&mediaType); if (FAILED(hr)) goto ReleaseDataAndFail; VIDEOINFOHEADER *videoInfoHeader = (VIDEOINFOHEADER *)mediaType->pbFormat; videoInfoHeader->bmiHeader.biWidth = bestFitWidth; videoInfoHeader->bmiHeader.biHeight = bestFitHeight; //mediaType->subtype = MEDIASUBTYPE_RGB32; hr = pStreamConfig->SetFormat(mediaType); if (FAILED(hr)) goto ReleaseDataAndFail; pStreamConfig->Release(); pStreamConfig = NULL; free(videoCaps); videoCaps = NULL; free(mediaTypeArray); mediaTypeArray = NULL; // Connect pins IPin *pDeviceOut = NULL, *pGrabberIn = NULL; if (FindPin(pDeviceFilter, PINDIR_OUTPUT, 0, &pDeviceOut) && FindPin(pGrabberF, PINDIR_INPUT, 0, &pGrabberIn)) { hr = pGraphBuilder->Connect(pDeviceOut, pGrabberIn); if (FAILED(hr)) goto ReleaseDataAndFail; } else { goto ReleaseDataAndFail; } // start playing hr = pMediaControl->Run(); if (FAILED(hr)) goto ReleaseDataAndFail; hr = pGrabber->GetConnectedMediaType(&mt); // Set dimensions width = bestFitWidth; height = bestFitHeight; _width = bestFitWidth; _height = bestFitHeight; // Allocate pixel buffer pPixelBuffer = (unsigned *)malloc(width * height * 4); // Release objects pGraphBuilder->Release(); pGraphBuilder = NULL; pEnumMoniker->Release(); pEnumMoniker = NULL; pCreateDevEnum->Release(); pCreateDevEnum = NULL; return true;

    Read the article

  • Macbook Pro suddenly lagging video playback + Flash sites

    - by Mathias
    I have a MacBook Pro: OS X Lion, Intel Core 2 Duo, 4GB RAM, NVIDIA GeForce 8600M GT with 128 MB RAM, Intel X25-M SSD; approximately 4 years old. I've been running Flash sites and playing videos without any problems for years. Then, suddenly, 3 months ago, Flash sites like http://thefwa.com started lagging in all browsers. Even mouseover animations, anything. Video playback in e.g. VLC and QuickTime is now lagging as well, with the same videos I used before. I tried installing an older version of VLC without any luck. Playing back video in VLC utilizes the CPU at almost 100%, and Flash sites like thefwa.com easily take up 50-60%. It's as if the hardware acceleration stopped working, or the GPU lost its magic. UPDATE: the same issues also occurred on Snow Leopard. Has anyone experienced something similar, or do you know what might be wrong?

    Read the article

  • Out of sync audio / video on Hackintosh

    - by user22902
    I have a PC with OSx86 (10.6.2) on it. Under Leopard my videos worked great, but now in VLC the audio is all garbled and the video plays way too fast. In QuickTime X the video is too fast as well. MPlayer OSX Extended plays videos fine, but doesn't support many codecs. I have a GeForce 9800 GTX with QE/CI/OpenCL working. If there's no solution, are there any other players for OS X that support a lot of codecs? I don't like Windows, so that's not an option. Thanks

    Read the article

  • How to get better video quality in Lync?

    - by sinned
    I want to use a ConferenceCam from Logitech to stream talks live via Lync. When I view the RAW webcam image via VLC, the quality is very good (but the latency is high because of buffering). However, when I stream it using Lync, the video gets blurry. Is there a way to ensure QoS in Lync or otherwise improve the video quality to (near-)native? I would rather have some dropped frames than a lower resolution where I can't read the slides. In my setup, I use Lync with an Office365-E3 contract, so I have no Lync-Server in my network. I thought about replacing Lync completely with VLC, but I first want to try Lync because VLC will probably cause firewall issues. Also, I haven't looked up the VLC parameters for less buffering, faster encoding, a bit lower resolution (natively it's more than HD) and streaming.

    Read the article

  • samsung HMX-H100P camcorder and video encoding with mencoder

    - by jskg
    Hi everyone, my background is totally unrelated to video, so pardon my newbie style. I own a Samsung HMX-H100P camcorder and I'm trying to encode videos to be uploaded to YouTube and Vimeo. First problem: videos generated by the camera with no processing appear like this when I play them with Totem (Linux) or VLC: http://www.youtube.com/watch?v=AANbl_DTuzE. Second problem: when I try to encode the videos produced by the camera using mencoder, I get the video at the resolution I chose, but those ugly lines and the lagging are still present. Here's the command I use: mencoder $inputFile -aspect 16:9 -of lavf -lavfopts format=psp -oac lavc -ovc lavc -lavcopts aglobal=1:vglobal=1:coder=0:vcodec=libx264:acodec=libfaac:vbitrate=4500:abitrate=128 -vf scale=1280:720 -ofps 25000/1001 -o $outputFile Any ideas? Thanks in advance
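
    The lines described sound like interlacing combing; if the camera records interlaced footage (the HMX-H100 family records 1080i), a deinterlacer in front of the scaler may address both symptoms. A hedged variant of the command above, assuming the mencoder build includes the yadif filter:

    mencoder $inputFile -aspect 16:9 -of lavf -lavfopts format=psp -oac lavc -ovc lavc -lavcopts aglobal=1:vglobal=1:coder=0:vcodec=libx264:acodec=libfaac:vbitrate=4500:abitrate=128 -vf yadif=0,scale=1280:720 -ofps 25000/1001 -o $outputFile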

    Read the article

  • Streaming flash video does not work on my Mac OS X

    - by dehmann
    Flash videos do not work properly on my Mac. On this Vimeo video, for example, it shows only the first frame, and the audio stutters like crazy: it plays for a quarter second or so, then goes silent, then plays again, and so on. I have Flash version 10,0,42,34 on Mac OS 10.5.8. It's a PowerBook G4 (PPC). I tried it in Firefox 3.5.5 and Safari 4.0.3. I tried reinstalling Flash, restarting the computer, using a fresh user profile in Firefox (so that no extensions are interfering with the site), and loading the video fully before playing, but nothing helps. I noticed that YouTube videos work better once loaded enough, although the picture does halt briefly every 10 or so seconds, even when the video is fully loaded.

    Read the article
