Search Results

Search found 13589 results on 544 pages for 'video player'.


  • Keeping player aligned to grid in Pacman

    - by user17577
    I am making a Pacman game using XNA. The game is tile-based, with each tile being 32 pixels. As the player moves, I need to know whenever it is perfectly on a tile (i.e. at a position of 32, 64, etc.) so that I can check whether the next tile is free. I am using the following logic to test this: if (position.X % 32 == 0 && position.Y % 32 == 0) { onTile = true; } I figure that I need to make the player's speed evenly divide 32. Everything works fine if I make the player's speed an integer such as 4 or 8. But if I make the speed something like 6.4, I end up with positions such as 64.00001, and my if statement no longer works correctly. How can I keep the player aligned with the grid while allowing a wider range of player speeds than 1, 2, 4, 8, 16, and 32? Or is there some better way to go about this? Thanks
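    One way around the floating-point drift is to stop testing the position with a modulo at all: move the player toward the centre of a target tile and snap onto it when the remaining distance is smaller than this frame's step. A minimal sketch of that idea in XNA-style C# (position, speed, targetTile and onTile are assumed fields, not code from the question):

    ```csharp
    using Microsoft.Xna.Framework;

    // Assumed fields on the player: Vector2 position; float speed; Point targetTile; bool onTile.
    // Instead of testing position % 32, walk toward the centre of the target tile and
    // snap onto it when the remaining distance is smaller than this frame's step.
    void MoveTowardTargetTile()
    {
        Vector2 target = new Vector2(targetTile.X * 32, targetTile.Y * 32);
        Vector2 delta = target - position;
        float step = speed;                 // pixels moved this update, e.g. 6.4

        if (delta.Length() <= step)
        {
            position = target;              // land exactly on the tile, no drift
            onTile = true;                  // safe to test whether the next tile is free
        }
        else
        {
            delta.Normalize();
            position += delta * step;
            onTile = false;
        }
    }
    ```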

    Read the article

  • How to Install Windows Media Player 11 on Wine

    - by namkid
    I am battling to install Windows Media Player 11 on Wine. I have tried the following: Open a Terminal (Applications, Accessories, Terminal) and type "sudo apt-get install wine." This installs Wine Windows Emulator, a free application that allows you to run many Windows programs within Linux. Download Windows Media Player 11 for Windows XP (link in Resources) and save it to Ubuntu's desktop. Once downloaded, right-click and select "Open with Wine Windows Emulator." Follow the on-screen prompts for installing it to your system. Go to "Applications" then "Wine," select "Programs" and open "Windows Media Player." Click "File" then "Open" and locate a DRM file you want to play. Select "OK" to load it into Media Player. I installed Wine (which is step 1), but I am having problems with step 2 (Download Windows Media Player 11 for Windows XP (link in Resources) and save it to Ubuntu's desktop). I'm just not finding a way to do it. I may be overlooking the "link in Resources", but I can't find it. I am stuck!

    Read the article

  • Is having a [high-end] video card important on a server?

    - by Patrick
    My application is quite an interactive application, with lots of colors and drag-and-drop functionality but no fancy 3D stuff, animations, or video, so I only used plain GDI (no GDI+, no DirectX). In the past my applications ran on desktops or laptops, and I advised my customers to invest in a decent video card with: a minimum resolution of 1280x1024, a minimum color depth of 24 bits, and X megabytes of memory on the video card. Now my users are switching more and more to terminal servers, hence my questions: What is the importance of a video card on a terminal server? Is a video card needed on the terminal server at all? If it is, is the resolution of the remote desktop client limited to the resolutions supported by the video card on the server? Can the choice of video card in the server influence the performance of applications running on the terminal server (but shown on a desktop PC)? If I start to make use of graphical libraries (like Qt) or things like DirectX, will that influence the choice of video card on the terminal server? Are calculations in that case 'offloaded' to the video card, even on the terminal server? Thanks.

    Read the article

  • Unity Player Controls Streaming Music Services From Chrome Toolbar

    - by Jason Fitzpatrick
    Chrome: If you’re a frequent Pandora, Grooveshark, or other popular streaming music station listener, Unity Player puts play control and song info on the Chrome Toolbar. Rather than sending you digging through your tabs to find the window with Pandora (or Google Music, Grooveshark, 8Tracks, Hypemachine, or any of the other dozen supported services), Unity Player pulls up a one-click control panel for easy pause/play, skip, and access to other service features like thumbs up/down flagging. Unity Player is free and works wherever Chrome does. Unity Player [via Addictive Tips]

    Read the article

  • Drawing bug with a 2D player camera

    - by RedShft
    I have just implemented a 2D player camera for my game, everything works properly except the player on the screen jitters when it moves between tiles. What I mean by jitter, is that if the player is moving the camera updates the tileset to be drawn and if the player steps to the right, the camera snaps that way. The movement is not smooth. I'm guessing this is occurring because of how I implemented the function to calculate the current viewable area or how my draw function works. I'm not entirely sure how to fix this. This camera system was entirely of my own creation and a first attempt at that, so it's very possible this is not a great way of doing things. My camera class, pulls information from the current tileset and calculates the viewable area. Right now I am targettng a resolution of 800 by 600. So I try to fit the appropriate amount of tiles for that resolution. My camera class, after calculating the current viewable tileset relative to the players location, returns a slice of the original tileset to be drawn. This tileset slice is updated every frame according to the players position. This slice is then passed to the map class, which draws the tile on screen. //Map Draw Function //This draw function currently matches the GID of the tile to it's location on the //PNG file of the tileset and then draws this portion on the screen void Draw(SDL_Surface* background, int[] _tileSet) { enforce( tilesetImage != null, "Tileset is null!"); enforce( background != null, "BackGround is null!"); int i = 0; int j = 0; SDL_Rect DestR, SrcR; SrcR.x = 0; SrcR.y = 0; SrcR.h = 32; SrcR.w = 32; foreach(tile; _tileSet) { //This code is matching the current tiles ID to the tileset image SrcR.x = cast(short)(tileWidth * (tile >= 11 ? (tile - ((tile / 10) * 10) - 1) : tile - 1)); SrcR.y = cast(short)(tileHeight * (tile > 10 ? 
(tile / 10) : 0)); //Applying the tile to the surface SDL_BlitSurface( tilesetImage, &SrcR, background, &DestR ); //this keeps track of what column/row we are on i++; if ( i == mapWidth ) { i = 0; j++; } DestR.x = cast(short)(i * tileWidth); DestR.y = cast(short)(j * tileHeight); } } //Camera Class class Camera { private: //A rectangle representing the view area SDL_Rect viewArea; //In number of tiles int viewAreaWidth; int viewAreaHeight; //This is the x and y coordinate of the camera in MAP SPACE IN PIXELS vect2 cameraCoordinates; //The player location in map space IN PIXELS vect2 playerLocation; //This is the players location in screen space; vect2 playerScreenLoc; int playerTileCol; int playerTileRow; int cameraTileCol; int cameraTileRow; //The map is stored in a single array with the tile ids //this corresponds to the index of the starting and ending tile int cameraStartTile, cameraEndTile; //This is a slice of the current tile set int[] tileSetCopy; int mapWidth; int mapHeight; int tileWidth; int tileHeight; public: this() { this.viewAreaWidth = 25; this.viewAreaHeight = 19; this.cameraCoordinates = vect2(0, 0); this.playerLocation = vect2(0, 0); this.viewArea = SDL_Rect (0, 0, 0, 0); this.tileWidth = 32; this.tileHeight = 32; } void Init(vect2 playerPosition, ref int[] tileSet, int mapWidth, int mapHeight ) { playerLocation = playerPosition; this.mapWidth = mapWidth; this.mapHeight = mapHeight; CalculateCurrentCameraPosition( tileSet, playerPosition ); //writeln( "Tile Set Copy: ", tileSetCopy ); //writeln( "Orginal Tile Set: ", tileSet ); } void CalculateCurrentCameraPosition( ref int[] tileSet, vect2 playerPosition ) { playerLocation = playerPosition; playerTileCol = cast(int)((playerLocation.x / tileWidth) + 1); playerTileRow = cast(int)((playerLocation.y / tileHeight) + 1); //writeln( "Player Tile (Column, Row): ","(", playerTileCol, ", ", playerTileRow, ")"); cameraTileCol = playerTileCol - (viewAreaWidth / 2); cameraTileRow = playerTileRow - (viewAreaHeight / 2); CameraMapBoundsCheck(); //writeln( "Camera Tile Start (Column, Row): ","(", cameraTileCol, ", ", cameraTileRow, ")"); cameraStartTile = ( (cameraTileRow - 1) * mapWidth ) + cameraTileCol - 1; //writeln( "Camera Start Tile: ", cameraStartTile ); cameraEndTile = cameraStartTile + ( viewAreaWidth * viewAreaHeight ) * 2; //writeln( "Camera End Tile: ", cameraEndTile ); tileSetCopy = tileSet[cameraStartTile..cameraEndTile]; } vect2 CalculatePlayerScreenLocation() { cameraCoordinates.x = cast(float)(cameraTileCol * tileWidth); cameraCoordinates.y = cast(float)(cameraTileRow * tileHeight); playerScreenLoc = playerLocation - cameraCoordinates + vect2(32, 32);; //writeln( "Camera Coordinates: ", cameraCoordinates ); //writeln( "Player Location (Map Space): ", playerLocation ); //writeln( "Player Location (Screen Space): ", playerScreenLoc ); return playerScreenLoc; } void CameraMapBoundsCheck() { if( cameraTileCol < 1 ) cameraTileCol = 1; if( cameraTileRow < 1 ) cameraTileRow = 1; if( cameraTileCol + 24 > mapWidth ) cameraTileCol = mapWidth - 24; if( cameraTileRow + 19 > mapHeight ) cameraTileRow = mapHeight - 19; } ref int[] GetTileSet() { return tileSetCopy; } int GetViewWidth() { return viewAreaWidth; } }
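    The jitter described above usually comes from rebuilding the visible tile slice in whole-tile steps, so the view can only ever move 32 pixels at a time. A common fix is to keep the camera position in pixels and draw the visible tiles shifted by the sub-tile remainder. A language-agnostic sketch of that idea, written in C# here rather than D; DrawTile, map and the view dimensions are assumed names:

    ```csharp
    // Draw the visible tile window shifted by the sub-tile pixel offset, so the view
    // scrolls smoothly instead of jumping a full tile whenever the player crosses one.
    // DrawTile(id, x, y) is assumed to blit one tile at the given screen position.
    void DrawVisibleTiles(int[,] map, int cameraX, int cameraY,
                          int tileW, int tileH, int viewCols, int viewRows)
    {
        int firstCol = cameraX / tileW;        // first tile column in view
        int firstRow = cameraY / tileH;
        int offsetX  = -(cameraX % tileW);     // sub-tile remainder of the camera position
        int offsetY  = -(cameraY % tileH);

        for (int row = 0; row <= viewRows; row++)
            for (int col = 0; col <= viewCols; col++)
                DrawTile(map[firstRow + row, firstCol + col],   // tile id
                         offsetX + col * tileW,                 // screen x
                         offsetY + row * tileH);                // screen y
    }
    ```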

    Read the article

  • Moving the jBullet collision body with the player object

    - by Kenneth Bray
    I am trying to update the location of the rigid body for a player class; as my player moves around, I would like the collision body to also move with the player object (currently represented as a cube). Below is my current update method for updating the xyz coords, but I am pretty sure I am not able to update the origin coords this way: public void Update(float pX, float pY, float pZ) { posX = pX; posY = pY; posZ = pZ; //update the playerCube transform for the rigid body cubeTransform.origin.x = posX; cubeTransform.origin.y = posY; cubeTransform.origin.z = posZ; cubeRigidBody.getMotionState().setWorldTransform(cubeTransform); processTransformMatrix(cubeTransform); } I do not update rotation, as I do not actually want/need the player body to rotate at all currently. However, in the final game this will be put in place.

    Read the article

  • AndEngine player, background and camera

    - by valdemar593
    I'm developing a 2D shooter using AndEngine. At the moment I'm trying to make the camera follow the player. As I understand it, the common approach is to use the SmoothCamera, zooming it and setting the chased entity. The problem is that the camera follows the player but the background (a RepeatingSpriteBackground) moves along with it, so it looks like the player doesn't move at all even though the actual position changes. So I don't really get how to make the camera follow the player while keeping the background fixed. Thanks in advance.

    Read the article

  • Windows Media Player Vulnerability, PCAnywhere Warning

    Windows Media Player Vulnerability Targeted by Drive-by-download Attack Security firm Trend Micro recently released details on malware that has been targeting the MIDI Remote Code Execution Vulnerability found in Microsoft's Windows Media Player. A post on Trend Micro's Malware Blog offered further insight into the malware that has been exploiting the CVE-2012-0003 vulnerability. The malware's authors have been successful in exploiting the vulnerability by tricking unsuspecting victims into opening a specially engineered MIDI file in Windows Media Player. This Web-based drive-by-download ...

    Read the article

  • Make the player run onto stairs smoothly

    - by misiMe
    I have a 2D platform game where the player always runs to the right, but the terrain isn't always horizontal. Example: I implemented a bounding-box collision system that just checks for intersections between the player's box and the other blocks, to stop the player from running when he encounters a big block, so that you have to jump. But when I put in stairs, I want him to run up them smoothly, just as if he were on horizontal ground. With the collision system, you have to jump the stairs in order to pass them! I thought about generating a line between the edges of the stairs and constraining the player's movement to that line... What do you think? Is there something more clever to do?
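    One alternative to the line-between-edges idea is to treat a run of stairs as a ramp: after the horizontal movement, sample the ground height under the player and snap him onto it whenever the step is small enough to walk up. A rough sketch in C#; GroundHeightAt, MaxStepHeight and the Player fields are assumed names, and screen coordinates with Y growing downward are assumed:

    ```csharp
    // After horizontal movement, snap the player onto the ground if the height
    // difference is small enough to be a stair; bigger ledges still require a jump.
    const float MaxStepHeight = 16f;   // assumed: half a tile

    void FollowGround(Player player)
    {
        float groundY = GroundHeightAt(player.X);   // y of the terrain surface under the player
        float step = player.Y - groundY;            // positive when the ground is above the feet

        if (step >= 0 && step <= MaxStepHeight)
            player.Y = groundY;                     // walk up (or down) the stair smoothly
        // otherwise leave it to the normal bounding-box collision response
    }
    ```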

    Read the article

  • Music Players in Ubuntu/Linux [closed]

    - by v2r
    Music players, just like web browsers, are an important part of today's application repertoire, and not only for entertainment reasons. Having tried a few Linux players over the past years, I have come to wonder which players you prefer, what other players are out there, and which ones you suggest are worth looking into and why. I used Rhythmbox for a long time, but the Jamendo and Magnatune plugins are both no longer available in 11.10, and my covers are not shown since I stream my music folder from a second partition. aTunes is another great player, but it also has some flaws, which I contacted the developers about. It would be nice if you could post some alternatives! --Thank you. Example: Name of player: aTunes | Homepage. Additional info: aTunes is a Java-based music player for Linux/Unix/Windows and ... Only one player example per answer, please!

    Read the article

  • Managing shots of the player

    - by Bitbridge
    I'm currently developing a 2D Jump'n'Run, and the situation is the following: the player can collect different weapons and is then able to shoot the weapon's projectiles (lasers, rockets, whatever). In my previous game (a space shooter) I just had a manager class for all the weapon shots; it stored them in a container and then updated and drew every single one. When the "shoot event" occurred, the "ProjectileManager" was notified and it added the wanted projectile. Player input is handled in the player class, so the player would have to know about the manager in order to call its function. I also have a collisionManager that checks for collisions between, for example, enemies and the projectiles and then notifies these objects. However, I somehow have the feeling that I shouldn't use this approach and that there might be a better way to handle this. I know the question is a bit vague; I'm pretty much just looking for input and ideas to improve my design.
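    One common way to avoid the player holding a reference to the ProjectileManager is to invert the dependency with an event: the player only announces that a shot was fired, and the manager (and anything else that cares) subscribes. A small C# sketch of that idea; all type and member names here are assumptions, not code from the question:

    ```csharp
    using System;
    using System.Collections.Generic;

    // Data describing one shot; the manager decides what projectile to spawn from it.
    class ShotEventArgs : EventArgs
    {
        public string WeaponType;
        public float X, Y;
    }

    class Player
    {
        public event EventHandler<ShotEventArgs> Shot;

        // Called from the player's own input handling; no manager reference needed.
        public void Fire(string weaponType, float x, float y) =>
            Shot?.Invoke(this, new ShotEventArgs { WeaponType = weaponType, X = x, Y = y });
    }

    class ProjectileManager
    {
        readonly List<ShotEventArgs> pending = new List<ShotEventArgs>();

        // Subscribe once at setup time; Update/Draw then work through the container as before.
        public void Listen(Player player) => player.Shot += (sender, shot) => pending.Add(shot);
    }
    ```

    The collision manager can keep checking the manager's container exactly as before; only the direction of the coupling changes.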

    Read the article

  • How can I scale an OSMF player in ActionScript 3/Flex

    - by Greg Hinch
    I am trying to create a simple video player SWF using the open source media framework in Flex 4. I want to make it dynamically scale based on the dimensions of the video, input by the user. I am following the directions on the Adobe help site, but the video does not seem to scale properly. Depending on the size, sometimes videos play larger than the space allotted on the webpage, and sometimes smaller. The only way I have been able to get it to work properly is by including a SWF metadata tag hardcoding the width and height, but I can't use that if I want to make the player dynamically sized. My code is : package { import flash.display.Sprite; import flash.events.Event; import org.osmf.media.MediaElement; import org.osmf.media.MediaPlayer; import org.osmf.media.URLResource; import org.osmf.containers.MediaContainer; import org.osmf.elements.VideoElement; import org.osmf.layout.LayoutMetadata; public class GalleryVideoPlayer extends Sprite { private var videoElement:VideoElement; private var mediaPlayer:MediaPlayer; private var mediaContainer:MediaContainer; private var flashVars:Object; public function GalleryVideoPlayer() { if (stage) init(); else addEventListener(Event.ADDED_TO_STAGE, init); } private function init(e:Event = null):void { removeEventListener(Event.ADDED_TO_STAGE, init); flashVars = loaderInfo.parameters; mediaPlayer = new MediaPlayer(); videoElement = new VideoElement(new URLResource(flashVars.file)); mediaContainer = new MediaContainer(); var layoutMetadata:LayoutMetadata = new LayoutMetadata(); layoutMetadata.width = Number(flashVars.width); layoutMetadata.height = Number(flashVars.height); videoElement.addMetadata(LayoutMetadata.LAYOUT_NAMESPACE, layoutMetadata); mediaPlayer.media = videoElement; mediaContainer.addMediaElement(videoElement); addChild(mediaContainer); } }}

    Read the article

  • Drawing a rectangle on a video in C#

    - by Haxed
    Hi, I want to draw a rectangle on a video stream (webcam video or a loaded, saved video) that I have streaming on a picture box. This is a C# application and I am using EmguCV 2.1.0.0. I have been successful in displaying the video stream on the PictureBox in the form. Can I use EmguCV to draw on the video, or should I use something else? Can I use DShowNET or something like that? Thanks for taking the time to read this.
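    EmguCV itself can do the drawing: Image<Bgr, Byte> has a Draw overload that takes a Rectangle, so you can stamp the rectangle onto each captured frame before handing it to the PictureBox. A minimal sketch, assuming Emgu CV 2.x and a webcam Capture; the rectangle coordinates are placeholders:

    ```csharp
    using System.Drawing;
    using System.Windows.Forms;
    using Emgu.CV;
    using Emgu.CV.Structure;

    // Grab a frame, draw the rectangle on it, and show the result on the PictureBox.
    // Call this from the capture loop that already feeds the PictureBox.
    void ProcessFrame(Capture capture, PictureBox pictureBox)
    {
        Image<Bgr, byte> frame = capture.QueryFrame();   // next frame from the webcam or file
        if (frame == null) return;                       // end of a loaded video file

        frame.Draw(new Rectangle(50, 50, 120, 80),       // placeholder region of interest
                   new Bgr(Color.Red), 2);               // colour and line thickness
        pictureBox.Image = frame.ToBitmap();
    }
    ```

    Drawing on the frame itself keeps everything inside EmguCV; DirectShow.NET would only be needed if you wanted the overlay done inside a filter graph instead.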

    Read the article

  • android: How to play YouTube video on emulator

    - by Kaillash
    Could someone help me with the following? 1. Is it possible to play a YouTube video on the emulator using a YouTube video link (like: http://www.youtube.com/watch?v=T1Wgp3mLa_E)? If not, then why not? 2. If yes, then how? I tried to play a YouTube video using VideoView but got a "Command PLAYER_INIT completed with an error or info PVMFErrCorrupt" message in logcat.

    Read the article

  • How to turn off video acceleration programmatically

    - by Stefan
    I'm using the Windows Media Player OCX in a program running on hundreds of (dedicated) computers. I have found out that when video acceleration is set to "Full", on some computers the video fails to play correctly, with green squares between movies and so on. Turn the acceleration to "None" and everything is fine. This program runs on ~800 computers that auto-update it. So I want my program, at startup, to turn off video acceleration. The question is: how do I turn off video acceleration programmatically? All computers are running XP with at least Service Pack 2. It would take me ages to manually log in to all those computers and change that setting, so that's why I want the program to be able to do it automagically for me.
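    One approach is to write the per-user WMP preference in the registry at startup, before the OCX is created. The key path below is the usual WMP preferences location, but the value name "VideoAcceleration" and the meaning 0 = None are assumptions; verify them by toggling the option in the WMP UI on one XP machine and diffing the key before rolling anything out to all 800. A hedged sketch in C#:

    ```csharp
    using Microsoft.Win32;

    // Sketch: flip the per-user WMP preference before the OCX is created.
    // The value name "VideoAcceleration" and the meaning 0 = None are assumptions;
    // verify them on one machine before deploying.
    static void DisableWmpVideoAcceleration()
    {
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(
            @"Software\Microsoft\MediaPlayer\Preferences", true))   // true = writable
        {
            if (key != null)
                key.SetValue("VideoAcceleration", 0, RegistryValueKind.DWord);
        }
    }
    ```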

    Read the article

  • Inserting HTML5 video using JavaScript for iPad

    - by Vishal
    Hello, I am trying to insert a video into HTML using jQuery for iPad but all I see is a black screen. If I add the video tag directly to the HTML page all seems to work fine. Here is what I have in my JavaScript and I call this using a function for onClick event. var html = ""; html += '<video id="someVideo"'; html += ' width="'+settings.width+'" height="'+settings.height+'"' html += ' controls="controls">'; html += '<source src="'+url+'" type="video/mp4" />'; html += '</video>'; $("#videoDiv").html(html); Any help will be greatly appreciated Thanks

    Read the article

  • Play local video with an HTML/Javascript based Adobe Air application

    - by Matt Wood
    I'm trying to add some video playback (that will be used for a tutorial) to my Adobe AIR application. I'm not a Flex or Flash developer and my application is HTML/JavaScript based, so I'm having trouble with some of the video solutions I've been able to find. Here is one of the examples I've found that is Flex based: Playing local files with Air. I've looked for a free Flash video player that I could just embed, but the only one I've found I was unable to coerce into playing files from my AIR application directory. I was excited at the prospect of using the HTML5 video tag, which I thought AIR supported, but that also does not seem to work. Can anyone recommend a free Flash video player that I can embed? Or a solution that doesn't have to be built completely from Flex?

    Read the article

  • MP4 video plays on localhost but not after publish

    - by teahou
    I made a video using Camtasia Studio 8. I added it to my MVC app and ran it on localhost, and the video plays fine. When I publish to my local dev web server (Windows Server 2008), the video will not play and gives no error. I have tried Chrome and IE11. In Chrome I checked the network tab and it says "Status - Cancelled". Do I need to make some changes to the web server settings? <video width="320" height="240" controls="controls"> <source src="@Url.Content("~/Content/videos/3_Comments_Letters.mp4")" /> </video>
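    A frequent cause of exactly this symptom on Windows Server 2008 is that IIS 7 has no MIME mapping for .mp4, so the page loads but the static video request is refused. A hedged fix, assuming the default IIS setup, is to register the mapping in the application's web.config:

    ```xml
    <!-- Register the MP4 MIME type so IIS will serve the video file.
         Merge this into the existing <system.webServer> section if you already have one. -->
    <configuration>
      <system.webServer>
        <staticContent>
          <mimeMap fileExtension=".mp4" mimeType="video/mp4" />
        </staticContent>
      </system.webServer>
    </configuration>
    ```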

    Read the article

  • Sound out of sync after merging multiple mp4 files with Avidemux

    - by Goto10
    I am trying to join (merge) two or more .mp4 files together, without re-encoding. Here is what I did: Started Avidemux 2.5.5. With File-Open, selected Input1.mp4. I received this message - "H.264 detected. If the file is using B-frames as reference it can lead to a crash or stuttering. Avidemux can use another mode which is safe but YOU WILL LOOSE SOME FRAME ACCURACY. Do you want to use that mode?". I chose "No". With File-Append, selected Input2.mp4. I received the same "H.264 detected" message again and chose "No". Selected the Format to MP4 (from AVI). Saved the output file (called Output.mp4) with File-Save-Save Video. Unfortunately, when I play the Output.mp4 video in VLC, the sound is out of sync with the second video. How can I correct this?

    Read the article

  • converting to MXF using ffmpeg

    - by Prakash
    I have been trying to use the FFmpeg utility to convert an AVI file to MXF format using DNxHD. I am using FFmpeg with the following parameters: ffmpeg -i ccvt_box.avi -vcodec dnxhd -video_size 1920x1080 -r 24 -b:v 115m ex.mxf The error it gives is: ffmpeg version N-43737-g76c3fff Copyright (c) 2000-2012 the FFmpeg developers built on Aug 20 2012 18:50:42 with llvm-gcc 4.2.1 (LLVM build 2336.11.00) configuration: libavutil 51. 70.100 / 51. 70.100 libavcodec 54. 53.100 / 54. 53.100 libavformat 54. 25.104 / 54. 25.104 libavdevice 54. 2.100 / 54. 2.100 libavfilter 3. 11.101 / 3. 11.101 libswscale 2. 1.101 / 2. 1.101 libswresample 0. 15.100 / 0. 15.100 Input #0, avi, from 'ccvt_box.avi': Duration: 00:00:10.00, start: 0.000000, bitrate: 691 kb/s Stream #0:0: Video: indeo5 (IV50 / 0x30355649), yuv410p, 340x344, 10 tbr, 10 tbn, 10 tbc Metadata: title : bob.avi [dnxhd @ 0x7fcd60818e00] video parameters incompatible with DNxHD Output #0, mxf, to 'ex.mxf': Stream #0:0: Video: dnxhd, yuv422p, 340x344, q=2-1024, 90k tbn, 24 tbc Metadata: title : bob.avi Stream mapping: Stream #0:0 -> #0:0 (indeo5 -> dnxhd) Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
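    The "video parameters incompatible with DNxHD" message is the encoder rejecting the 340x344 yuv410p input: DNxHD only accepts a fixed set of frame size / frame rate / bitrate / pixel format combinations, and -video_size does not resize the video (it is an input option for demuxers such as rawvideo that need the frame size specified). A hedged variant of the command that scales and converts the input to one valid combination (1920x1080, 24 fps, 115 Mb/s, yuv422p):

    ```sh
    # Scale the input to a DNxHD-supported frame size and pixel format before encoding.
    # Scaling a 340x344 clip to 1080p will stretch it; use a pad filter instead if the
    # aspect ratio matters.
    ffmpeg -i ccvt_box.avi -vf scale=1920:1080 -r 24 -pix_fmt yuv422p \
           -c:v dnxhd -b:v 115M ex.mxf
    ```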

    Read the article

  • DirectShow: Video-Preview and Image (with working code)

    - by xsl
    Questions / Issues If someone can recommend me a good free hosting site I can provide the whole project file. As mentioned in the text below the TakePicture() method is not working properly on the HTC HD 2 device. It would be nice if someone could look at the code below and tell me if it is right or wrong what I'm doing. Introduction I recently asked a question about displaying a video preview, taking camera image and rotating a video stream with DirectShow. The tricky thing about the topic is, that it's very hard to find good examples and the documentation and the framework itself is very hard to understand for someone who is new to windows programming and C++ in general. Nevertheless I managed to create a class that implements most of this features and probably works with most mobile devices. Probably because the DirectShow implementation depends a lot on the device itself. I could only test it with the HTC HD and HTC HD2, which are known as quite incompatible. HTC HD Working: Video preview, writing photo to file Not working: Set video resolution (CRASH), set photo resolution (LOW quality) HTC HD 2 Working: Set video resolution, set photo resolution Problematic: Video Preview rotated Not working: Writing photo to file To make it easier for others by providing a working example, I decided to share everything I have got so far below. I removed all of the error handling for the sake of simplicity. As far as documentation goes, I can recommend you to read the MSDN documentation, after that the code below is pretty straight forward. void Camera::Init() { CreateComObjects(); _captureGraphBuilder->SetFiltergraph(_filterGraph); InitializeVideoFilter(); InitializeStillImageFilter(); } Dipslay a video preview (working with any tested handheld): void Camera::DisplayVideoPreview(HWND windowHandle) { IVideoWindow *_vidWin; _filterGraph->QueryInterface(IID_IMediaControl,(void **) &_mediaControl); _filterGraph->QueryInterface(IID_IVideoWindow, (void **) &_vidWin); _videoCaptureFilter->QueryInterface(IID_IAMVideoControl, (void**) &_videoControl); _captureGraphBuilder->RenderStream(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video, _videoCaptureFilter, NULL, NULL); CRect rect; long width, height; GetClientRect(windowHandle, &rect); _vidWin->put_Owner((OAHWND)windowHandle); _vidWin->put_WindowStyle(WS_CHILD | WS_CLIPSIBLINGS); _vidWin->get_Width(&width); _vidWin->get_Height(&height); height = rect.Height(); _vidWin->put_Height(height); _vidWin->put_Width(rect.Width()); _vidWin->SetWindowPosition(0,0, rect.Width(), height); _mediaControl->Run(); } HTC HD2: If set SetPhotoResolution() is called FindPin will return E_FAIL. If not, it will create a file full of null bytes. HTC HD: Works void Camera::TakePicture(WCHAR *fileName) { CComPtr<IFileSinkFilter> fileSink; CComPtr<IPin> stillPin; CComPtr<IUnknown> unknownCaptureFilter; CComPtr<IAMVideoControl> videoControl; _imageSinkFilter.QueryInterface(&fileSink); fileSink->SetFileName(fileName, NULL); _videoCaptureFilter.QueryInterface(&unknownCaptureFilter); _captureGraphBuilder->FindPin(unknownCaptureFilter, PINDIR_OUTPUT, &PIN_CATEGORY_STILL, &MEDIATYPE_Video, FALSE, 0, &stillPin); _videoCaptureFilter.QueryInterface(&videoControl); videoControl->SetMode(stillPin, VideoControlFlag_Trigger); } Set resolution: Works great on HTC HD2. 
HTC HD won't allow SetVideoResolution() and only offers one low resolution photo resolution: void Camera::SetVideoResolution(int width, int height) { SetResolution(true, width, height); } void Camera::SetPhotoResolution(int width, int height) { SetResolution(false, width, height); } void Camera::SetResolution(bool video, int width, int height) { IAMStreamConfig *config; config = NULL; if (video) { _captureGraphBuilder->FindInterface(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video, _videoCaptureFilter, IID_IAMStreamConfig, (void**) &config); } else { _captureGraphBuilder->FindInterface(&PIN_CATEGORY_STILL, &MEDIATYPE_Video, _videoCaptureFilter, IID_IAMStreamConfig, (void**) &config); } int resolutions, size; VIDEO_STREAM_CONFIG_CAPS caps; config->GetNumberOfCapabilities(&resolutions, &size); for (int i = 0; i < resolutions; i++) { AM_MEDIA_TYPE *mediaType; if (config->GetStreamCaps(i, &mediaType, reinterpret_cast<BYTE*>(&caps)) == S_OK ) { int maxWidth = caps.MaxOutputSize.cx; int maxHeigth = caps.MaxOutputSize.cy; if(maxWidth == width && maxHeigth == height) { VIDEOINFOHEADER *info = reinterpret_cast<VIDEOINFOHEADER*>(mediaType->pbFormat); info->bmiHeader.biWidth = maxWidth; info->bmiHeader.biHeight = maxHeigth; info->bmiHeader.biSizeImage = DIBSIZE(info->bmiHeader); config->SetFormat(mediaType); DeleteMediaType(mediaType); break; } DeleteMediaType(mediaType); } } } Other methods used to build the filter graph and create the COM objects: void Camera::CreateComObjects() { CoInitialize(NULL); CoCreateInstance(CLSID_CaptureGraphBuilder, NULL, CLSCTX_INPROC_SERVER, IID_ICaptureGraphBuilder2, (void **) &_captureGraphBuilder); CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER, IID_IGraphBuilder, (void **) &_filterGraph); CoCreateInstance(CLSID_VideoCapture, NULL, CLSCTX_INPROC, IID_IBaseFilter, (void**) &_videoCaptureFilter); CoCreateInstance(CLSID_IMGSinkFilter, NULL, CLSCTX_INPROC, IID_IBaseFilter, (void**) &_imageSinkFilter); } void Camera::InitializeVideoFilter() { _videoCaptureFilter->QueryInterface(&_propertyBag); wchar_t deviceName[MAX_PATH] = L"\0"; GetDeviceName(deviceName); CComVariant comName = deviceName; CPropertyBag propertyBag; propertyBag.Write(L"VCapName", &comName); _propertyBag->Load(&propertyBag, NULL); _filterGraph->AddFilter(_videoCaptureFilter, L"Video Capture Filter Source"); } void Camera::InitializeStillImageFilter() { _filterGraph->AddFilter(_imageSinkFilter, L"Still image filter"); _captureGraphBuilder->RenderStream(&PIN_CATEGORY_STILL, &MEDIATYPE_Video, _videoCaptureFilter, NULL, _imageSinkFilter); } void Camera::GetDeviceName(WCHAR *deviceName) { HRESULT hr = S_OK; HANDLE handle = NULL; DEVMGR_DEVICE_INFORMATION di; GUID guidCamera = { 0xCB998A05, 0x122C, 0x4166, 0x84, 0x6A, 0x93, 0x3E, 0x4D, 0x7E, 0x3C, 0x86 }; di.dwSize = sizeof(di); handle = FindFirstDevice(DeviceSearchByGuid, &guidCamera, &di); StringCchCopy(deviceName, MAX_PATH, di.szLegacyName); } Full header file: #ifndef __CAMERA_H__ #define __CAMERA_H__ class Camera { public: void Init(); void DisplayVideoPreview(HWND windowHandle); void TakePicture(WCHAR *fileName); void SetVideoResolution(int width, int height); void SetPhotoResolution(int width, int height); private: CComPtr<ICaptureGraphBuilder2> _captureGraphBuilder; CComPtr<IGraphBuilder> _filterGraph; CComPtr<IBaseFilter> _videoCaptureFilter; CComPtr<IPersistPropertyBag> _propertyBag; CComPtr<IMediaControl> _mediaControl; CComPtr<IAMVideoControl> _videoControl; CComPtr<IBaseFilter> _imageSinkFilter; void GetDeviceName(WCHAR *deviceName); void 
InitializeVideoFilter(); void InitializeStillImageFilter(); void CreateComObjects(); void SetResolution(bool video, int width, int height); }; #endif

    Read the article
