Search Results

Search found 13613 results on 545 pages for 'totem video player'.


  • Unable to extend desktop

    - by CSharperWithJava
    I'm trying to hook up my TV to my computer as a gaming/multimedia center, but I'm having trouble setting it up. I have a custom-built machine running Windows 7 RC. It has an ATI Radeon 4800 video card with two DVI outputs and one S-Video output. I have an S-Video-to-composite adapter that connects to my TV. (It's an old TV with only cable, composite, and S-Video connections.) I can switch the desktop to my TV without a problem, but I can't duplicate or extend my desktop onto it. I've installed the latest drivers and Catalyst Control Center, but that won't make it work any more readily than Windows would. Any suggestions? Would using an S-Video cable instead of the adapter change anything? (The only reason I use the adapter is that it came with the graphics card.) (Edit) I installed the latest drivers and I can now duplicate the screen (show on one monitor and on the TV), but I still can't extend the desktop.

    Read the article

  • Bounding Box Collision Glitching Problem (Pygame)

    - by Ericson Willians
    So far the "Bounding Box" method is the only one that I know. It's efficient enough to deal with simple games. Nevertheless, the game I'm developing is not that simple anymore and for that reason, I've made a simplified example of the problem. (It's worth noticing that I don't have rotating sprites on my game or anything like that. After showing the code, I'll explain better). Here's the whole code: from pygame import * DONE = False screen = display.set_mode((1024,768)) class Thing(): def __init__(self,x,y,w,h,s,c): self.x = x self.y = y self.w = w self.h = h self.s = s self.sur = Surface((64,48)) draw.rect(self.sur,c,(self.x,self.y,w,h),1) self.sur.fill(c) def draw(self): screen.blit(self.sur,(self.x,self.y)) def move(self,x): if key.get_pressed()[K_w] or key.get_pressed()[K_UP]: if x == 1: self.y -= self.s else: self.y += self.s if key.get_pressed()[K_s] or key.get_pressed()[K_DOWN]: if x == 1: self.y += self.s else: self.y -= self.s if key.get_pressed()[K_a] or key.get_pressed()[K_LEFT]: if x == 1: self.x -= self.s else: self.x += self.s if key.get_pressed()[K_d] or key.get_pressed()[K_RIGHT]: if x == 1: self.x += self.s else: self.x -= self.s def warp(self): if self.y < -48: self.y = 768 if self.y > 768 + 48: self.y = 0 if self.x < -64: self.x = 1024 + 64 if self.x > 1024 + 64: self.x = -64 r1 = Thing(0,0,64,48,1,(0,255,0)) r2 = Thing(6*64,6*48,64,48,1,(255,0,0)) while not DONE: screen.fill((0,0,0)) r2.draw() r1.draw() # If not intersecting, then moves, else, it moves in the opposite direction. if not ((((r1.x + r1.w) > (r2.x - r1.s)) and (r1.x < ((r2.x + r2.w) + r1.s))) and (((r1.y + r1.h) > (r2.y - r1.s)) and (r1.y < ((r2.y + r2.h) + r1.s)))): r1.move(1) else: r1.move(0) r1.warp() if key.get_pressed()[K_ESCAPE]: DONE = True for ev in event.get(): if ev.type == QUIT: DONE = True display.update() quit() The problem: In my actual game, the grid is fixed and each tile has 64 by 48 pixels. I know how to deal with collision perfectly if I moved by that size. Nevertheless, obviously, the player moves really fast. In the example, the collision is detected pretty well (Just as I see in many examples throughout the internet). The problem is that if I put the player to move WHEN IS NOT intersecting, then, when it touches the obstacle, it does not move anymore. Giving that problem, I began switching the directions, but then, when it touches and I press the opposite key, it "glitches through". My actual game has many walls, and the player will touch them many times, and I can't afford letting the player go through them. The code-problem illustrated: When the player goes towards the wall (Fine). When the player goes towards the wall and press the opposite direction. (It glitches through). Here is the logic I've designed before implementing it: I don't know any other method, and I really just want to have walls fixed in a grid, but move by 1 or 2 or 3 pixels (Slowly) and have perfect collision without glitching-possibilities. What do you suggest?

    Read the article

  • Microsoft Lifecam VX-2000 doesn't work anymore in Cheese

    - by paed808
    I have two Lifecam VX-2000s and they don't work in Cheese anymore. I don't know if it's a problem with a missing package, or a package I installed. Here is the output:

        (cheese:11122): Clutter-WARNING **: No listener with the specified listener id 29
        (cheese:11122): Clutter-WARNING **: No listener with the specified listener id 30
        (cheese:11122): Clutter-WARNING **: No listener with the specified listener id 31
        (cheese:11122): Clutter-WARNING **: No listener with the specified listener id 32
        (cheese:11122): GLib-CRITICAL **: g_hash_table_remove_internal: assertion `hash_table != NULL' failed
        (cheese:11122): Clutter-WARNING **: Not able to remove listener with id 1
        (cheese:11122): GLib-CRITICAL **: g_hash_table_size: assertion `hash_table != NULL' failed
        totem-video-thumbnailer: 'file:///home/myusername/Videos/Webcam/2012-09-20-191530.webm' isn't thumbnailable
        Reason: Media contains no supported video streams.
        ** (cheese:11122): WARNING **: could not generate thumbnail for /home/myusername/Videos/Webcam/2012-09-20-191530.webm (video/webm)

    Notice the line "Reason: Media contains no supported video streams." When I try to record a video, it just produces a 13.2 KB WebM file with nothing in it. When I take a picture, it works. Edit: I've been thinking that the problem started after installing the Medibuntu repository on my system.

    Read the article

  • New: ZFS Storage Appliance Videos

    - by Roxana Babiciu
    Check out the first installments of a new video series for the ZFS Storage Appliance. In video #1, you’ll learn about the advantages built into Oracle’s ZS3 Storage Appliance that come from the unique position Oracle holds in the market. In video #2, you’ll learn how best to monitor large ZS3 installations, as well as how Enterprise Manager complements DTrace analytics at the ZFS Storage Appliance device level.

    Read the article

  • JavaScript tip a day: More Debugging Tricks

    - by Sahil Malik
    SharePoint, WCF and Azure Trainings: more information. Debugging is a pain, and debugging events on a web page is an even bigger pain. This video will make that pain go away! It covers the console $ keywords, the debugger statement, conditional breakpoints, monitoring events, global error handling, and more. Make sure you check out the debugging video from yesterday too. Read full article ....
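
    For readers who want a feel for two of the techniques the video mentions, here is a small generic sketch (not taken from the article) of the debugger statement and global error handling in plain JavaScript; monitorEvents and $0 are Chrome DevTools console helpers and only exist inside the DevTools console.

        // Global error handler: logs any uncaught error on the page.
        window.onerror = function (message, source, line, column, error) {
          console.log("Uncaught error:", message, "at", source + ":" + line);
          return true; // suppress the default browser error reporting
        };

        function handleClick(count) {
          if (count > 3) {
            debugger; // acts like a conditional breakpoint: pauses only when DevTools is open and count > 3
          }
          console.log("clicked", count, "times");
        }

        // In the DevTools console (not in page code) you could also run:
        //   monitorEvents($0, "click");   // log click events on the currently selected element
        //   unmonitorEvents($0);          // stop logging them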

    Read the article

  • aplay -l says no soundcards found; alsaconf says no supported cards; yet /proc/asound contains cards

    - by nimasmi
    I am trying to get HDMI output using a Gainward Nvidia 210 512 MB on Ubuntu 10.04 Lucid Lynx. I have upgraded alsa-driver, alsa-lib and alsa-utils to 1.0.24 by building from source, thanks to this blog post. Some relevant output...

        user@box:~$ lspci | grep Audio
        00:05.0 Audio device: nVidia Corporation MCP61 High Definition Audio (rev a2)
        01:09.0 Multimedia video controller: Conexant Systems, Inc. CX23880/1/2/3 PCI Video and Audio Decoder (rev 05)
        01:09.2 Multimedia controller: Conexant Systems, Inc. CX23880/1/2/3 PCI Video and Audio Decoder [MPEG Port] (rev 05)
        01:09.4 Multimedia controller: Conexant Systems, Inc. CX23880/1/2/3 PCI Video and Audio Decoder [IR Port] (rev 05)
        02:00.1 Audio device: nVidia Corporation High Definition Audio Controller (rev a1)

        user@box:~$ cat /proc/asound/version
        Advanced Linux Sound Architecture Driver Version 1.0.24.
        Compiled on Sep 15 2012 for kernel 2.6.32-42-generic (SMP).

        user@box:~$ ls /proc/asound
        card0  cards  hwdep  NVidia  oss  seq  version
        card1  devices  modules  NVidia_1  pcm  timers

        user@box:~$ aplay -l
        aplay: device_list:240: no soundcards found...

        user@box:~$ sudo /sbin/alsa-utils start
         * Setting up ALSA...
         * warning: 'alsactl restore' failed with error message 'alsactl: set_control:1403: Cannot write control '2:0:0:IEC958 Playback Default:0' : Operation not permitted'...
        amixer: Invalid command!
        ...done.

    Any help appreciated. PS: my video card is connected only through the PCI-E slot. I assume there is no extra audio connection required.

    Read the article

  • JS / HTML5 compatibility issue on iOS 6

    - by Dhaval
    I'm using HTML5 to play video, and there is some content before the video, so I'm using flexroll to scroll the whole window. I'm checking it on an iPad. The problem is that on iOS 5 it works fine, but after updating to iOS 6 the screen no longer scrolls: only the video scrolls up and down, and the rest of the content stays where it is. I can't work out what the exact problem is. Is it a JS compatibility issue or an HTML5 video compatibility issue? Can anyone please help me figure this out? Your help will really be appreciated.

    Read the article

  • The first in-depth technical analysis of VP8

    <b>Diary Of An x264 Developer:</b> "Back in my original post about Internet video, I made some initial comments on the hope that VP8 would solve the problems of web video by providing a supposed patent-free video format with significantly better compression than the current options of Theora and Dirac."

    Read the article

  • What are general guidelines and advice for estimating how much you should charge for a project as a novice freelancer?

    - by Dokkat
    I am an experienced programmer but completely new to the market. Someone wants me to do a project for them, but I do not know how much it is worth. What are general guidelines/advice for finding out what a project is worth on the market? If I can ask here about this particular one: it is an HTML5 site with a login/register form and a video player that has to play a lecture video and PowerPoint slides in sync. They'll give me the video, the audio and the PowerPoints. I should also do some editing on the video beforehand.

    Read the article

  • Running Crysis 2 on Ubuntu gives me an error message

    - by ShajD
    I recently installed Ubuntu alongside Windows 7 and installed Crysis 2 with Wine. Crysis 2 works fine when I run Windows; however, when I run it using Wine in Ubuntu, CryEngine gives me a message saying, "Unsupported video card detected! Continuing to run might lead to unexpected results or crashes....." I've got two video cards: one is an Intel and the other is an Nvidia. I typed lspci into the terminal and my Nvidia card was listed under video controller as well.

    Read the article

  • SnagIt

    A feature-laden screen capture tool for easily copying and sharing any image, text, or video

    Read the article

  • UVC device Logitech WebCam 9000 pro

    - by Pavel
    There is a good webcam out there that acts as a "USB Video Class" device (UVC, the standard USB video interface): the Logitech Webcam 9000. UVC offers a unified interface that lets any UVC driver control the camera or grab a picture from it. You need one universal driver and you support all UVC devices (webcams, video cameras, video-grabbing cards, etc.). For example, on Linux, if you have the UVC driver you don't need to think about a specific driver for a UVC webcam. UVC also defines a unified way for the webcam to report its available resolutions and other capabilities, so I see the 1600x1200 resolution without any problem. I wonder if Windows 7 has UVC. I mean "universal UVC" (-; It says "USB Video Class", but it doesn't offer resolutions larger than 640x480 or the webcam's controls, like 'sharpness', 'focus' and others, the way Linux's driver does...

    Read the article

  • Rendering trillions of "atoms" instead of polygons?

    - by Baring
    I just saw a video about what the publishers call the "next major step after the invention of 3D". According to the person speaking in it, they use a huge number of atoms grouped into clouds instead of polygons to reach a level of unlimited detail. They tried their best to make the video understandable for people with no knowledge of rendering techniques, and therefore (or for other reasons) left out all details of how their engine works. The level of detail in their video does look quite impressive to me. How is it possible to render scenes using custom atoms instead of polygons on current hardware (speed- and memory-wise)? If this is real, why has nobody else even thought about it so far? As an OpenGL developer, I'm really baffled by this and would really like to hear what experts have to say. I also don't want this to look like a cheap advert, so I will include the link to the video only if requested, in the comments section.

    Read the article

  • Where should I put zoomIn in my MapActivity?

    - by Johny
    I'm writing an Android app, and I'd like to zoomIn as soon as the map has been loaded. I get the following error: java.lang.IllegalArgumentException: width and height must be > 0 This MapActivity - width and height must be > 0 question suggests the problem is the zoomIn() method is in the onCreate() method. But I get same error when I put it in the onResume() method. I've been searching for hours and I can't find anything about it at http://developer.android.com or anywhere else... Also I can't find a way to get the time point the map has been loaded. A "MapLoadedListener" or something like that... EDIT Here is my code: public class AMap extends MapActivity{ private final String LOG_TAG = this.getClass().getSimpleName(); private Context mContext; private Chronometer timer; private TextView tvCountdown; private RelativeLayout rl; private MapView mapView; private MapController mapController; private List<Overlay> mapOverlays; private PlayersOverlay playersOverlay; private Drawable drawable; private Builder endDialog; private ContextThemeWrapper ctw; private Handler mHandler = new Handler(); private Player player = new Player(); private StartTask startTask; private EndTask endTask; private MyDBAdapter mdba; private Cursor playersCursor; private UpdateBroadcastReceiver r; @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.map_view); mContext = AMap.this; // set map mapView = (MapView) findViewById(R.id.mapview); mapView.setBuiltInZoomControls(true); mapView.setFocusable(true); // find the relative layout rl = (RelativeLayout) findViewById(R.id.rl); // set the chronometer timer = (Chronometer) findViewById(R.id.tv_timer); timer.setBackgroundColor(Color.DKGRAY); // set the countdown textview tvCountdown = (TextView) findViewById(R.id.tv_countdown); // Open DB connection and get players Cursor mdba = new MyDBAdapter(mContext); mdba.open(); playersCursor = mdba.getGame(); // Get this player's id and location Intent starter = this.getIntent(); player.setId(starter.getIntExtra("id", 0)); player.setLatitude(starter.getDoubleExtra("lat", 0)); player.setLongitude(starter.getDoubleExtra("lon", 0)); // Set this player's location as map's center GeoPoint geoPoint = new GeoPoint((int) (player.getLatitude()*1E6), (int) (player.getLongitude()*1E6)); mapController = mapView.getController(); mapController.setCenter(geoPoint); mapController.setZoom(15); Log.d(LOG_TAG, "My playersCursor has "+playersCursor.getCount()+" rows"); // drawable is needed but not used drawable = this.getResources().getDrawable(R.drawable.ic_launcher); // set PlayersOverlay (locations and statuses) playersOverlay = new PlayersOverlay(player.getId(), playersCursor, drawable, this); mapOverlays = mapView.getOverlays(); mapOverlays.add(playersOverlay); mHandler.postDelayed(mUpdateTimeTask, 100); } private Runnable mUpdateTimeTask = new Runnable() { public void run() { int h = mapView.getLayoutParams().height; int w = mapView.getLayoutParams().width; Log.d(LOG_TAG, "w = "+w+" , h = "+h); mHandler.postAtTime(this, System.currentTimeMillis() + 1000); } }; @Override public void onAttachedToWindow(){ Log.d(LOG_TAG, "Attached to Window"); int h = mapView.getLayoutParams().height; int w = mapView.getLayoutParams().width; Log.d(LOG_TAG, " Attached to window: w = "+w+" , h = "+h); //mapController.zoomInFixing(screenPoint.x, screenPoint.y); } public void onWindowFocusChanged(boolean hasFocus){ Log.d(LOG_TAG, "Focus changed to: "+hasFocus); int h = 
mapView.getLayoutParams().height; int w = mapView.getLayoutParams().width; Log.d(LOG_TAG, " Window focus changed: w = "+w+" , h = "+h); //mapController.zoomInFixing(screenPoint.x, screenPoint.y); } @Override protected void onStart(){ super.onStart(); // Create and register the broadcast receiver for messages from service IntentFilter filter = new IntentFilter(AppConstants.iGAME_UPDATE); r = new UpdateBroadcastReceiver(); registerReceiver(r, filter); // Create the dialog for end of game ctw = new ContextThemeWrapper(mContext, android.R.style.Theme_Translucent_NoTitleBar_Fullscreen); endDialog = new AlertDialog.Builder(ctw); endDialog.setMessage("End of Game"); endDialog.setCancelable(false); endDialog.setNeutralButton("OK", new OnClickListener(){ @Override public void onClick(DialogInterface dialog, int which) { Intent highScores = new Intent(AMap.this, HighScores.class); startActivity(highScores); playersCursor.close(); finish(); } }); } @Override protected void onStop() { if(!playersCursor.isClosed()) playersCursor.close(); unregisterReceiver(r); mdba.close(); super.onStop(); } @Override protected boolean isRouteDisplayed() { return false; } // Receives signal from NetworkService that DB has been updated public class UpdateBroadcastReceiver extends BroadcastReceiver { boolean startSignal, update, endSignal; @Override public void onReceive(Context context, Intent intent) { endSignal = intent.getBooleanExtra("endSignal", false); if(endSignal){ Log.d(LOG_TAG, "Game Update BroadcastReceiver received End Signal"); endTask = new EndTask(); endTask.execute(); return; } update = intent.getBooleanExtra("update", false); if(update){ Log.d(LOG_TAG, "Game Update BroadcastReceiver received game update"); playersCursor.requery(); mapView.invalidate(); return; } startSignal = intent.getBooleanExtra("startSignal", false); if(startSignal){ Log.d(LOG_TAG, "Game Update BroadcastReceiver received Start Signal"); startTask = new StartTask(); startTask.execute(); return; } } } class StartTask extends AsyncTask<Void,Integer,Void>{ private final ToneGenerator tg = new ToneGenerator(AudioManager.STREAM_NOTIFICATION, 100); private final long DELAY = 1200; @Override protected Void doInBackground(Void... params) { int i = 3; while(i>=0){ publishProgress(i); try { Thread.sleep(DELAY); } catch (InterruptedException e) { e.printStackTrace(); } i--; } return null; } @Override protected void onProgressUpdate(Integer... progress){ tg.startTone(ToneGenerator.TONE_PROP_PROMPT); tvCountdown.setText(""+progress[0]); } @Override protected void onPostExecute(Void result) { rl.removeView(tvCountdown); timer.setBase(SystemClock.elapsedRealtime()); timer.start(); //enable screen touches playersOverlay.setGameStarted(true); } } class EndTask extends AsyncTask<Void,Void,Void>{ @Override protected void onPreExecute(){ //disable screen touches playersOverlay.setEndOfGame(true); timer.stop(); } @Override protected Void doInBackground(Void... params) { return null; } @Override protected void onPostExecute(Void result) { try{ endDialog.show(); }catch(Exception e){ Toast.makeText(mContext, "End of game", Toast.LENGTH_LONG); Intent highScores = new Intent(AMap.this, HighScores.class); startActivity(highScores); playersCursor.close(); finish(); } mHandler.removeCallbacks(mUpdateTimeTask); } } }
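
    The IllegalArgumentException suggests the zoom call runs before the MapView has been laid out, so its width and height are still 0; onCreate(), onResume() and even onAttachedToWindow() can all fire before the first layout pass. A common workaround, sketched here rather than taken from the question, is to defer the zoom until the view reports a real size via a ViewTreeObserver.OnGlobalLayoutListener (android.view.ViewTreeObserver import and placement inside onCreate() after findViewById() are assumed):

        mapView.getViewTreeObserver().addOnGlobalLayoutListener(
                new ViewTreeObserver.OnGlobalLayoutListener() {
                    @Override
                    public void onGlobalLayout() {
                        if (mapView.getWidth() > 0 && mapView.getHeight() > 0) {
                            // The map now has real dimensions, so zooming is safe.
                            mapController.zoomInFixing(mapView.getWidth() / 2, mapView.getHeight() / 2);
                            // Run only once: stop listening after the first successful zoom.
                            mapView.getViewTreeObserver().removeGlobalOnLayoutListener(this);
                        }
                    }
                });

    Alternatively, posting the same zoom call with mapView.post(...) usually works as well, since posted runnables execute after the view hierarchy has been measured and laid out.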

    Read the article

  • Best practices for creating a logger library using log4net

    - by VolleyBall Player
    My goal is to create a log4net library that can be shared across multiple projects. In my solution, which targets .NET 4.0, I created a class library called Logger and referenced it from the web project. I then created a logger.config file in the class library, put all the log4net configuration in it, and used:

        [assembly: log4net.Config.XmlConfigurator(Watch = true, ConfigFile = "Logger.config")]

    When I run the web app, nothing gets logged. So I added <add key="log4net.Internal.Debug" value="true"/> to web.config, which gave me debugging output and the error "Failed to find configuration section 'log4net' in the application's .config file. Check your .config file for the <log4net> and <configSections> elements." I moved the configuration from logger.config to web.config and everything seems to work fine. However, I don't want the log4net configuration in web.config; having it in logger.config is a cleaner approach. The goal is to let other projects use this library without having to worry about configuration in every project. So the question is: how do I do this? What am I doing wrong? Any suggestion with a code example would be beneficial. FYI, I am using StructureMap IoC to resolve the logger before logging to it.
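
    For reference, a standalone config file loaded through XmlConfiguratorAttribute does not go through the <configSections> mechanism at all; a rough sketch of a Logger.config with <log4net> as the root element, along the lines below, is typically enough (the appender name, file path and layout pattern here are placeholders, not taken from the question):

        <?xml version="1.0" encoding="utf-8" ?>
        <!-- Hypothetical Logger.config: <log4net> is the root element; no <configSections>
             entry is needed because the file is loaded directly by the XmlConfiguratorAttribute. -->
        <log4net>
          <appender name="RollingFile" type="log4net.Appender.RollingFileAppender">
            <file value="logs\app.log" />
            <appendToFile value="true" />
            <layout type="log4net.Layout.PatternLayout">
              <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
            </layout>
          </appender>
          <root>
            <level value="INFO" />
            <appender-ref ref="RollingFile" />
          </root>
        </log4net>

    Note also that Logger.config typically has to be marked "Copy to Output Directory" (or otherwise deployed next to the web application's binaries), because the ConfigFile path is resolved relative to the application's base directory at runtime.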

    Read the article

  • Embed Youtube in UIWebView behind transparent img. Wmode transparent and z-index doesn't work

    - by Allisone
    I'm using this code: - (void)embedYouTube:(NSString *)urlString frame:(CGRect)frame { NSString *embedHTML = @"\ <html><head>\ <style type=\"text/css\">\ body {\ background-color: black;\ }\ #container{\ position: relative;\ z-index:1;\ }\ #video,#videoc{\ position:absolute;\ z-index: 1;\ border: none;\ }\ #tv{\ background: transparent url(tv.png) no-repeat;\ width: 320px;\ height: 205px;\ position: absolute;\ top: 0;\ z-index: 999;\ }\ </style>\ </head><body style=\"margin:0\">\ <div id=\"tv\"></div>\ <object id=\"videoc\" width=\"240\" height=\"160\">\ <param name=\"movie\" value=\"%@\"></param>\ <param name=\"wmode\" value=\"transparent\"></param>\ <embed wmode=\"transparent\" id=\"video\" src=\"%@\" type=\"application/x-shockwave-flash\" \ width=\"240\" height=\"160\"></embed>\ </object>\ </body></html>"; NSString *path = [[NSBundle mainBundle] bundlePath]; NSURL *baseURL = [NSURL fileURLWithPath:path]; NSString *html = [NSString stringWithFormat:embedHTML, urlString,urlString]; UIWebView *videoView = [[UIWebView alloc] initWithFrame:frame]; [videoView loadHTMLString:html baseURL:baseURL]; [self.view addSubview:videoView]; [videoView release]; } Its the first time that I use UIWebView and the first time that I use video in iPhone. The video plays, so that's working BUT: I want to have an old school tv (round corners) in foreground with switches and so on. The tv is an image with transparent pixels in the middle, so that a video lying behind the tv will shine through as if the video would be shown on the tv. But first of all the video has a border that I can't remove and second it's always in the foreground. In Safari and in Firefox and Mac it's working. So is it an iPhone thing, could it be that it simply won't work on iPhone ? Or do I have some css/html typos ?

    Read the article

  • Detecting which MCUs to connect on an incoming conference

    - by Fábio Batista
    Hello, SO. I'm working with the OCS UCCAPI, developing a custom OCS client. I'm currently having a hard time detecting what "kind" of Conference my client is being invited to. Using the Office Communicator client, I can start "IM conferences" (by inviting more than 1 person and selecting "start a IM conversation") or "video conferences" (by selecting more than 1 person and selecting "start a video call"). The Office Communicator client, on the invitees' end, starts correctly the appropriate session (just IM, just Video or IM+Video). However, when receiving the conference invite on my custom client, there's no data about the kind of session I'm being invited. I need this information, in order to make a decision whether or not to connect to the AV MCU and capture/show video. I've tried already: When handling _IUccSessionManagerEvents.OnIncomingSession, parse the RemoteSessionDescription property on the UccIncomingInvitationEvent object: no luck, the only data about the conference modality is an element on the XML about the IM being enabled or not (<im available="true"> or <im available="false">), but nothing about the session having video available or not. When handling _IUccConferenceSessionEvents.OnEnter, check the Media property on the UccConferenceSession. Don't work, all media types are present (MESSAGE, AUDIO, VIDEO, DATA e TELEPHONY), regardless of the type of conference I'm being invited. Also when handling _IUccConferenceSessionEvents.OnEnter, check the Entities collection on the UccConferenceView object, to check which MCUs are enabled for this conference. Don't work either, all MUCs are listed as available (IM, AV, DATA and CONTROL), regardless of the type of conference I'm being invited. I'm running out of ideas. Some references I'm using: http://msdn.microsoft.com/en-us/library/bb664307.aspx http://msdn.microsoft.com/en-us/library/dd170830.aspx Thanks a lot.

    Read the article

  • Hibernate limitations on using variables in queries

    - by sammichy
    I had asked the following question: I have the following table structure for a table Player:

        Table Player {
            Long playerID;
            Long points;
            Long rank;
        }

    Assuming that playerID and points have valid values, can I update the rank for all the players based on the number of points in a single query? If two people have the same number of points, they should tie for the rank. I received this answer from Daniel Vassalo (thank you):

        UPDATE player
        JOIN (SELECT p.playerID,
                     IF(@lastPoint <> p.points, @curRank := @curRank + 1, @curRank) AS rank,
                     IF(@lastPoint = p.points, @curRank := @curRank + 1, @curRank),
                     @lastPoint := p.points
              FROM player p
              JOIN (SELECT @curRank := 0, @lastPoint := 0) r
              ORDER BY p.points DESC
             ) ranks ON (ranks.playerID = player.playerID)
        SET player.rank = ranks.rank;

    When I try to execute this as a native query in Hibernate, the following exception is thrown:

        java.lang.IllegalArgumentException: org.hibernate.QueryException: Space is not allowed after parameter prefix ':'

    Apparently this has been an open issue for the last couple of years. I want to know whether the ranking query can be made to work either without using any variables in the SQL query, or with some workaround for Hibernate.
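
    One way to sidestep the ':=' parsing problem entirely is to compute the rank without session variables, using a correlated subquery. This is a sketch under the MySQL schema above rather than a tested solution from the question: it counts the number of distinct higher point totals, so players with equal points share the same rank.

        UPDATE player AS p
        JOIN (
            SELECT p1.playerID,
                   (SELECT COUNT(DISTINCT p2.points)
                    FROM player p2
                    WHERE p2.points > p1.points) + 1 AS new_rank
            FROM player p1
        ) ranks ON ranks.playerID = p.playerID
        SET p.rank = ranks.new_rank;

    Because the query contains no user variables, there is no ':' for Hibernate's native-query parser to misread; the trade-off is the correlated subquery, which is slower than the variable-based version on large tables.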

    Read the article

  • Error when creating JFrame from JFrame

    - by Aly
    Hi, I have an application that is works fine and the JFrame for it is launched in the constructor of a GameInitializer class which takes in some config parameters. I have tried to create a GUI in which allows the user to specify these config parameters and then click submit. When the user clicks submit a new GameInitializer object is created. The error I am getting is: Exception in thread "AWT-EventQueue-0" java.lang.Error: Cannot call invokeAndWait from the event dispatcher thread at java.awt.EventQueue.invokeAndWait(Unknown Source) at javax.swing.SwingUtilities.invokeAndWait(Unknown Source) at game.player.humanplayer.view.HumanView.update(HumanView.java:43) once submit is called this code is executed: values assigned to parames... new GameInitializer(userName, player, Constants.BLIND_STRUCTURE_FILES.get(blindStructure), handState); Then code in the GameInitializer constructor is: public GameInitializer(String playerName, AbstractPlayer opponent, String blindStructureConfig, AbstractHandState handState){ beginGame(playerName, opponent, blindStructureConfig, handState); } public static void beginGame(String playerName, AbstractPlayer opponent, String blindStructureConfig, AbstractHandState handState){ AbstractDealer dealer; BlindStructure.initialize(blindStructureConfig); AbstractPlayer humanPlayer = new HumanPlayer(playerName, handState); AbstractPlayer[] players = new AbstractPlayer[2]; players[0] = humanPlayer; players[1] = opponent; handState.setTableLayout(players); for(AbstractPlayer player : players){ player.initialize(); } dealer = new Dealer(players, handState); dealer.beginGame(); } It basically cascades down and eventually calls this piece of code in the HumanView class: public void update(final Event event, final ReadableHandState handState, final AbstractPlayer player) { try { SwingUtilities.invokeAndWait(new Runnable() { public void run() { gamePanel.update(event, handState, player); validate(); } }); } catch (InterruptedException e) { e.printStackTrace(); } catch (InvocationTargetException e) { e.printStackTrace(); } if(event.equals(Event.HAND_ENDING)){ try { if(handState.wonByShowdown() || handState.isSplitPot()){ Thread.sleep(3500); } else{ Thread.sleep(1000); } } catch (InterruptedException e) { e.printStackTrace(); } } } Do you have any idea why?
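
    The java.lang.Error itself states the cause: SwingUtilities.invokeAndWait must not be called from the event dispatch thread, and clicking Submit means the whole GameInitializer chain is already running on the EDT. A generic guard (a sketch, not the poster's HumanView code) is to check which thread you are on and run the task directly when already on the EDT:

        import javax.swing.SwingUtilities;
        import java.lang.reflect.InvocationTargetException;

        public final class EdtHelper {
            private EdtHelper() {}

            // Runs the task on the EDT: directly if we are already there,
            // otherwise synchronously via invokeAndWait.
            public static void runOnEdtAndWait(Runnable task)
                    throws InterruptedException, InvocationTargetException {
                if (SwingUtilities.isEventDispatchThread()) {
                    task.run();
                } else {
                    SwingUtilities.invokeAndWait(task);
                }
            }
        }

    In update(), the runnable that calls gamePanel.update(...) could go through such a helper. A cleaner fix, though, is to start the long-running game loop from the Submit handler on a background thread (for example a SwingWorker), so that blocking calls like Thread.sleep and invokeAndWait never execute on the dispatch thread in the first place.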

    Read the article

  • Sum of XML duration elements in SQL2008

    - by Matt
    I have an XML column that holds information about my games. Here's a sample of what the information looks like:

        <game xmlns="http://my.name.space" >
          <move>
            <player>PlayerA</player>
            <start movetype="Move">EE5</start>
            <end movetype="Move">DF6</end>
            <movetime>PT1S</movetime>
          </move>
          <move>
            <player>PlayerB</player>
            <start movetype="Move">CG7</start>
            <end movetype="Move">DE6</end>
            <movetime>PT3S</movetime>
          </move>
          <move>
            <player>PlayerA</player>
            <start movetype="Move">FD3</start>
            <end movetype="Move">EG8</end>
            <movetime>PT4S</movetime>
          </move>
        </game>

    I'm trying to design an XML query to take the sum of the movetime elements. Basically I need the sum of each player's move time. Using the above sample, PlayerA would have a total move time of 5 seconds and PlayerB would have a total move time of 3 seconds. Here's the XML query I've currently been working with:

        SELECT GameHistory.query('declare default element namespace "http://my.name.space"; data(/game/move/movetime)') AS Value
        FROM Games WHERE Id=560

    I'm a newbie to XSLT / XPATH functions :P
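
    One approach is to shred the move elements with nodes()/CROSS APPLY and aggregate in T-SQL. The sketch below is not a tested answer; it assumes every movetime is a simple whole-second duration of the form PTnS (as in the sample), so the ISO-8601 prefix and suffix can just be stripped before casting to int.

        WITH XMLNAMESPACES (DEFAULT 'http://my.name.space'),
        moves AS (
            SELECT
                mv.value('(player/text())[1]', 'varchar(64)')   AS Player,
                mv.value('(movetime/text())[1]', 'varchar(20)') AS MoveTime
            FROM Games
            CROSS APPLY GameHistory.nodes('/game/move') AS m(mv)
            WHERE Id = 560
        )
        SELECT Player,
               SUM(CAST(REPLACE(REPLACE(MoveTime, 'PT', ''), 'S', '') AS int)) AS TotalSeconds
        FROM moves
        GROUP BY Player;

    The intermediate CTE is there because SQL Server does not allow XML data type methods directly in a GROUP BY clause; grouping on the already-extracted Player column avoids that restriction.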

    Read the article

  • Merging Passed Parameters

    - by Josh Crowder
    I have a two data arrays sent in from a form, one called transloaded and the other video which is the actual form for the model. I need to get [:video_encoded][:url] and save that to [:video][:flash_url] This is the passed arguments or transloaded, when I try and access [:transload][:results][:video_encode] I get nil. print params[:transload] { "assembly_id":"d59b4293b3d79d2ccd1948c02421c6a6", "status":"success", "uploads":{ "video":{ "name":"bbc_one.mp4", "mime":"video/mp4", "ext":"mp4", "size":601104, "meta":{ "width":720, "height":404, "video_fps":25, "video_bitrate":null, "video_format":"avc1", "video_codec":"ffh264", "audio_bitrate":"128k", "audio_codec":"faad", "duration":3.07, "device_vendor":null, "device_name":null, "device_software":null, "latitude":null, "longitude":null }, "url":"http://tmp.transloadit.com/" } }, "results":{ "video_encode":{ "name":"bbc_one.flv", "mime":"video/x-flv", "steps":["encode","export"], "ext":"flv", "size":388317, "meta":{ "width":480, "height":320, "video_fps":25, "video_bitrate":"512k", "video_format":"FLV1", "video_codec":"ffflv", "audio_bitrate":"64k", "audio_codec":"mp3", "duration":3.11, "device_vendor":null, "device_name":null, "device_software":null, "latitude":null, "longitude":null }, "url":"http://s3.transloadit.com/b7deac9c96af6c745e914e25d0350baa/7a/2b09e822265ac2328789b40dcc02ae/bbc_one.flv" }, "video_encode_iphone":{ "name":"bbc_one.qt", "mime":"video/quicktime", "steps":["encode_iphone","export"], "ext":"qt", "size":218236, "meta":{ "width":480, "height":320, "video_fps":25, "video_bitrate":null, "video_format":"avc1", "video_codec":"ffh264", "audio_bitrate":"128k", "audio_codec":"faad", "duration":3.04, "device_vendor":null, "device_name":null, "device_software":null, "latitude":null, "longitude":null }, "url":"http://s3.transloadit.com/31/58bcc80d5345e52a42c9773125e8f0/bbc_one.qt" } } } Here is what I am trying to use video_links = { :flash_url => params[:transload][:results][:video_encode][:url], :mp4_url => params[:transload][:results][:video_encode_iphone][:url] } params[:video].merge(video_links)
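
    If params[:transload] is actually arriving as a JSON-encoded string rather than a nested hash (which would explain why indexing into [:results] yields nil), it has to be parsed before the URLs can be read; after JSON.parse the keys are strings, not symbols. This is a hypothetical controller sketch under that assumption, not code from the question (JSON is available by default in a Rails app):

        # Parse the payload if it came through as a raw JSON string.
        transload = params[:transload].is_a?(String) ? JSON.parse(params[:transload]) : params[:transload]

        video_links = {
          :flash_url => transload["results"]["video_encode"]["url"],
          :mp4_url   => transload["results"]["video_encode_iphone"]["url"]
        }

        # merge returns a new hash; use merge! (or reassign) if params[:video] itself must change.
        video_attributes = params[:video].merge(video_links)

    Whatever hash ends up holding the merged attributes is then what gets passed to the Video model, since the original params[:video] is left untouched by a plain merge.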

    Read the article

  • Why does this Object wonk out & get deleted ?

    - by brainydexter
    Stepping through the debugger, the BBox object is okay at the entry of the function, but as soon as it enters the function, the vfptr member points to 0xccccc. I don't get it. What is causing this? Why is there a virtual table reference in there when the object is not derived from another class? (It resides in GameObject, from which my Player class inherits, and I retrieve the BBox from within Player. But why does the BBox have the reference? Shouldn't it be Player that maintains that reference?) For 1, some code for reference:

    A. I retrieve the bounding box from the player. This returns a bounding box as expected. I then send its address to GetGridCells.

        const BoundingBox& l_Bbox = l_pPlayer->GetBoundingBox();
        boost::unordered_set<Cell*, CellPHash> & l_GridCells = GetGridCells( &l_Bbox );

    B. This is where a_pBoundingBox goes crazy and gets that garbage value.

        boost::unordered_set<Cell*, CellPHash> CollisionMgr::GetGridCells(const BoundingBox *a_pBoundingBox)
        {

    I think the following code is also pertinent, so I'm sticking it in here anyway:

        const BoundingBox& Player::GetBoundingBox(void)
        {
            return BoundingBox( &GetBoundingSphere() );
        }

        const BoundingSphere& Player::GetBoundingSphere(void)
        {
            BoundingSphere& l_BSphere = m_pGeomMesh->m_BoundingSphere;
            l_BSphere.m_Center = GetPosition();
            return l_BSphere;
        }

        // BoundingBox Constructor
        BoundingBox(const BoundingSphere* a_pBoundingSphere);

    Can anyone please give me some idea as to why this is happening? Also, if you want me to post more code, please do let me know. Thanks!
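
    The symptom is consistent with a dangling reference: Player::GetBoundingBox constructs a temporary BoundingBox in its return statement and returns a const reference to it, so the temporary is destroyed before the caller can use it, and MSVC's debug fill pattern (0xCC bytes) is what then shows up in the debugger. A minimal sketch of the difference, using hypothetical simplified types rather than the poster's classes:

        #include <iostream>

        struct BoundingSphere { float radius = 1.0f; };

        struct BoundingBox {
            explicit BoundingBox(const BoundingSphere* s) : halfExtent(s->radius) {}
            float halfExtent;
        };

        // BUGGY: returns a reference to a temporary that dies when the function returns,
        // leaving the caller with a dangling reference (garbage such as 0xCC.. in MSVC debug builds).
        const BoundingBox& GetBoundingBoxDangling(const BoundingSphere& s) {
            return BoundingBox(&s);
        }

        // FIXED: return by value; the copy is cheap and eligible for copy elision / RVO.
        BoundingBox GetBoundingBoxByValue(const BoundingSphere& s) {
            return BoundingBox(&s);
        }

        int main() {
            BoundingSphere sphere;
            BoundingBox box = GetBoundingBoxByValue(sphere);  // safe: the caller owns its own copy
            std::cout << box.halfExtent << "\n";
        }

    Changing Player::GetBoundingBox to return a BoundingBox by value (and the caller to store one by value) along these lines would remove the dangling reference; the vfptr shown in the watch window is simply part of whatever the destroyed object contained, which is why it reads as 0xCC-filled garbage.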

    Read the article
