Search Results

Search found 11050 results on 442 pages for 'video cards'.

Page 158/442

  • Embed YouTube in UIWebView behind transparent img. Wmode transparent and z-index don't work

    - by Allisone
    I'm using this code: - (void)embedYouTube:(NSString *)urlString frame:(CGRect)frame { NSString *embedHTML = @"\ <html><head>\ <style type=\"text/css\">\ body {\ background-color: black;\ }\ #container{\ position: relative;\ z-index:1;\ }\ #video,#videoc{\ position:absolute;\ z-index: 1;\ border: none;\ }\ #tv{\ background: transparent url(tv.png) no-repeat;\ width: 320px;\ height: 205px;\ position: absolute;\ top: 0;\ z-index: 999;\ }\ </style>\ </head><body style=\"margin:0\">\ <div id=\"tv\"></div>\ <object id=\"videoc\" width=\"240\" height=\"160\">\ <param name=\"movie\" value=\"%@\"></param>\ <param name=\"wmode\" value=\"transparent\"></param>\ <embed wmode=\"transparent\" id=\"video\" src=\"%@\" type=\"application/x-shockwave-flash\" \ width=\"240\" height=\"160\"></embed>\ </object>\ </body></html>"; NSString *path = [[NSBundle mainBundle] bundlePath]; NSURL *baseURL = [NSURL fileURLWithPath:path]; NSString *html = [NSString stringWithFormat:embedHTML, urlString,urlString]; UIWebView *videoView = [[UIWebView alloc] initWithFrame:frame]; [videoView loadHTMLString:html baseURL:baseURL]; [self.view addSubview:videoView]; [videoView release]; } It's the first time I've used UIWebView and the first time I've used video on the iPhone. The video plays, so that's working, BUT: I want to have an old-school tv (round corners) in the foreground, with switches and so on. The tv is an image with transparent pixels in the middle, so that a video lying behind the tv will shine through as if the video were being shown on the tv. But first of all, the video has a border that I can't remove, and second, it's always in the foreground. In Safari and Firefox on the Mac it's working. So is it an iPhone thing - could it be that it simply won't work on the iPhone? Or do I have some CSS/HTML typos?

    Read the article

  • Detecting which MCUs to connect on an incoming conference

    - by Fábio Batista
    Hello, SO. I'm working with the OCS UCCAPI, developing a custom OCS client. I'm currently having a hard time detecting what "kind" of conference my client is being invited to. Using the Office Communicator client, I can start "IM conferences" (by inviting more than 1 person and selecting "start an IM conversation") or "video conferences" (by selecting more than 1 person and selecting "start a video call"). The Office Communicator client, on the invitees' end, correctly starts the appropriate session (just IM, just Video, or IM+Video). However, when receiving the conference invite on my custom client, there's no data about the kind of session I'm being invited to. I need this information in order to decide whether or not to connect to the AV MCU and capture/show video. I've already tried: When handling _IUccSessionManagerEvents.OnIncomingSession, parsing the RemoteSessionDescription property on the UccIncomingInvitationEvent object: no luck, the only data about the conference modality is an element in the XML about IM being enabled or not (<im available="true"> or <im available="false">), but nothing about whether the session has video available. When handling _IUccConferenceSessionEvents.OnEnter, checking the Media property on the UccConferenceSession. Doesn't work: all media types are present (MESSAGE, AUDIO, VIDEO, DATA and TELEPHONY), regardless of the type of conference I'm being invited to. Also when handling _IUccConferenceSessionEvents.OnEnter, checking the Entities collection on the UccConferenceView object, to see which MCUs are enabled for this conference. Doesn't work either: all MCUs are listed as available (IM, AV, DATA and CONTROL), regardless of the type of conference I'm being invited to. I'm running out of ideas. Some references I'm using: http://msdn.microsoft.com/en-us/library/bb664307.aspx http://msdn.microsoft.com/en-us/library/dd170830.aspx Thanks a lot.

    Read the article

  • Merging Passed Parameters

    - by Josh Crowder
    I have two data arrays sent in from a form, one called transload and the other video, which is the actual form for the model. I need to get [:video_encode][:url] and save that to [:video][:flash_url]. These are the passed arguments for transload; when I try to access [:transload][:results][:video_encode] I get nil. print params[:transload] { "assembly_id":"d59b4293b3d79d2ccd1948c02421c6a6", "status":"success", "uploads":{ "video":{ "name":"bbc_one.mp4", "mime":"video/mp4", "ext":"mp4", "size":601104, "meta":{ "width":720, "height":404, "video_fps":25, "video_bitrate":null, "video_format":"avc1", "video_codec":"ffh264", "audio_bitrate":"128k", "audio_codec":"faad", "duration":3.07, "device_vendor":null, "device_name":null, "device_software":null, "latitude":null, "longitude":null }, "url":"http://tmp.transloadit.com/" } }, "results":{ "video_encode":{ "name":"bbc_one.flv", "mime":"video/x-flv", "steps":["encode","export"], "ext":"flv", "size":388317, "meta":{ "width":480, "height":320, "video_fps":25, "video_bitrate":"512k", "video_format":"FLV1", "video_codec":"ffflv", "audio_bitrate":"64k", "audio_codec":"mp3", "duration":3.11, "device_vendor":null, "device_name":null, "device_software":null, "latitude":null, "longitude":null }, "url":"http://s3.transloadit.com/b7deac9c96af6c745e914e25d0350baa/7a/2b09e822265ac2328789b40dcc02ae/bbc_one.flv" }, "video_encode_iphone":{ "name":"bbc_one.qt", "mime":"video/quicktime", "steps":["encode_iphone","export"], "ext":"qt", "size":218236, "meta":{ "width":480, "height":320, "video_fps":25, "video_bitrate":null, "video_format":"avc1", "video_codec":"ffh264", "audio_bitrate":"128k", "audio_codec":"faad", "duration":3.04, "device_vendor":null, "device_name":null, "device_software":null, "latitude":null, "longitude":null }, "url":"http://s3.transloadit.com/31/58bcc80d5345e52a42c9773125e8f0/bbc_one.qt" } } } Here is what I am trying to use: video_links = { :flash_url => params[:transload][:results][:video_encode][:url], :mp4_url => params[:transload][:results][:video_encode_iphone][:url] } params[:video].merge(video_links)
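    One possibility, sketched below on the assumption that params[:transload] is arriving as a single raw JSON string (which the print output above suggests) rather than as already-nested parameters: the string has to be parsed before the nested keys exist. The JSON.parse call and the use of merge! are illustrative, not the poster's code.

      # Sketch only: parse the payload first, then pull out the URLs and merge
      # them into the video params. merge! is used because plain merge returns a
      # new hash and leaves params[:video] unchanged.
      require 'json'

      transload = params[:transload]
      transload = JSON.parse(transload) if transload.is_a?(String)

      video_links = {
        :flash_url => transload['results']['video_encode']['url'],
        :mp4_url   => transload['results']['video_encode_iphone']['url']
      }

      params[:video].merge!(video_links)

    If the parameter is in fact already a nested hash, the same lookups with string keys (params[:transload]['results']['video_encode']['url']) would be the first thing to check instead.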

    Read the article

  • Why does my MPMoviePlayerController disappear when I press play?

    - by Digital Robot
    I have an MPMoviePlayerController in a view, something like myMovie = [[MPMoviePlayerController alloc] initWithContentURL:URLfilme]; if (myMovie) { [myMovie setRepeatMode:MPMovieRepeatModeNone]; [myMovie setShouldAutoplay: NO]; [myMovie setScalingMode:MPMovieScalingModeAspectFit]; myMovie.view.frame = vFilme.bounds; [vFilme addSubview:[myMovie view]]; } The movie appears fine, I can scrub it, but when I press play, boooom, it vanishes. I have tried to retain myMovie but nothing changed. I have tried to play a video fullscreen, and even to use MPMoviePlayerViewController, and it still disappears once I tap play. Even Apple's video player sample is not working. Is this a bug or what? EDIT: Things are getting more interesting. If, instead of playing the video manually by tapping the play button, I insert two timers, one to play the video and another one to pause it after 3 seconds, what I see is this: when the play fires, the video disappears, and when the pause fires, the video reappears, but when it does it has no controls. It is totally frozen, but the app continues to run normally. It is not anything related to video encoding, because I have tried with different videos, including one shot on the iPhone 4 and another shot on a 3GS.

    Read the article

  • Multiple marker icons, how to add to google mashup

    - by user351189
    I have created a Google Maps mashup where, with a bit of input, I have managed to have a sidebar that links to a video icon/marker that then opens up an info window showing virtual tours. I would, however, like to put different coloured marker icons on the map depending on the category that the video is in. This would be easy enough to do, but my page is made up of a mixture of jQuery and JavaScript, all calling the individual Flash files. Could someone help me with the code for adding extra marker icons for different categories? Here is the code: So, after the initial 'var camera;' point, there comes this: function addMarker(point, title, video, details) { var marker = new GMarker(point, {title: title, icon:camera}); GEvent.addListener(marker, "click", function() { if (details) { marker.openInfoWindowTabsHtml([new GInfoWindowTab("Video", video), new GInfoWindowTab("More", details)]); } else { marker.openInfoWindowHtml(video); } }); Then further down is the code for calling the individual marker image. I would like to add another image to this list - would I start out by calling the new object 'camera-red.image' or something similar? function initialize() { if (GBrowserIsCompatible()) { map = new GMap2(document.getElementById("mapDiv")); map.setCenter(new GLatLng(51.52484592590448, -0.13345599174499512), 17); map.setUIToDefault(); var uclvtSatMapType = createUclVTSatMapType() map.addMapType(uclvtSatMapType); map.setMapType(uclvtSatMapType); camera = new GIcon(G_DEFAULT_ICON); camera.image = "ucl-video.png"; camera.iconSize = new GSize(32,37); camera.iconAnchor = new GPoint(16,35); camera.infoWindowAnchor = new GPoint(16,2); addMarkersToMap(); } The actual map can be found here: link text Thanks.
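    As a rough sketch of one way to do it with the Maps API v2 objects already used above (the cameraRed name, the camera-red.png filename and the extra category argument are illustrative assumptions, not part of the original code):

      // Build a second GIcon the same way the existing "camera" icon is built,
      // then pick an icon per marker based on the video's category.
      var cameraRed = new GIcon(G_DEFAULT_ICON);
      cameraRed.image = "camera-red.png";        // assumed red marker image
      cameraRed.iconSize = new GSize(32, 37);
      cameraRed.iconAnchor = new GPoint(16, 35);
      cameraRed.infoWindowAnchor = new GPoint(16, 2);

      function addMarker(point, title, video, details, category) {
        var icon = (category == "tour") ? cameraRed : camera;   // choose per category
        var marker = new GMarker(point, {title: title, icon: icon});
        GEvent.addListener(marker, "click", function() {
          if (details) {
            marker.openInfoWindowTabsHtml([new GInfoWindowTab("Video", video),
                                           new GInfoWindowTab("More", details)]);
          } else {
            marker.openInfoWindowHtml(video);
          }
        });
        return marker;
      }

    The callers that add markers would then pass a category string (or the icon itself) for each video.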

    Read the article

  • Website: VoteUp or VoteDown videos. How to restrict users from voting multiple times?

    - by DJDonaL3000
    I'm working on a website (HTML, CSS, JavaScript, AJAX, PHP, MySQL), and I want to restrict the number of times a particular user votes for a particular video. It's similar to the YouTube system where you can voteUp or voteDown a particular video. Each vote involves adding a row to the video.votes table, which logs the time, the vote direction (up or down), the client IP address (using PHP: $ip = $_SERVER['REMOTE_ADDR']; ), and of course the ID of the video in question. Adding votes is as simple as (pseudocode): JavaScript: onClick( vote( a,b,c,d ) ), which passes variables to a PHP insertion script via AJAX, and finally we replace the voting buttons with a "Thank You For Voting" message. THE PROBLEM: If you reload/refresh the page after voting, you can vote again, and again, and again - you get the point. MY QUESTION: How do you limit the number of times a particular user votes for a particular video? MY THOUGHTS: Do you use cookies, adding a new cookie with the ID of the video and checking for that cookie before you insert a new vote? OR Before you insert the vote, do you use the IP address and the video ID to see if this same user (IP) has voted for this same video (vidID) in the past 24hrs (mktime), and either allow or disallow the vote insertion based on this query? OR Do you just not care, and assume that most users are sane and have better things to do than refresh pages and vote repeatedly? Any suggestions or ideas welcome.
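    As a rough illustration of the IP-plus-video check described above (the table and column names, the $videoId / $direction variables and the use of PDO are assumptions for the sketch):

      <?php
      // Only insert the vote if this IP has not voted on this video in the last
      // 24 hours.
      $ip = $_SERVER['REMOTE_ADDR'];

      $check = $db->prepare(
          "SELECT COUNT(*) FROM votes
            WHERE video_id = ? AND ip = ?
              AND created_at > DATE_SUB(NOW(), INTERVAL 24 HOUR)");
      $check->execute(array($videoId, $ip));

      if ($check->fetchColumn() == 0) {
          $insert = $db->prepare(
              "INSERT INTO votes (video_id, ip, direction, created_at)
               VALUES (?, ?, ?, NOW())");
          $insert->execute(array($videoId, $ip, $direction));
      } else {
          // this IP already voted on this video in the last 24 hours
      }

    Cookies are trivial to clear, so the IP check (or requiring a login) is the stronger of the two options, at the cost of occasionally blocking users behind a shared NAT.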

    Read the article

  • Google maps and J-Query: Individual markers?

    - by user351189
    Hi there, I have created a Google Maps mashup where, with a bit of input, I have managed to have a sidebar that links to a video icon/marker that then opens up an info window showing virtual tours. I would, however, like to put different coloured marker icons on the map depending on the category that the video is in. This would be easy enough to do, but my page is made up of a mixture of jQuery and JavaScript, all calling the individual Flash files. Could someone help me with the code for adding extra marker icons for different categories? Here is the code: So, after the initial 'var camera;' point, there comes this: function addMarker(point, title, video, details) { var marker = new GMarker(point, {title: title, icon:camera}); GEvent.addListener(marker, "click", function() { if (details) { marker.openInfoWindowTabsHtml([new GInfoWindowTab("Video", video), new GInfoWindowTab("More", details)]); } else { marker.openInfoWindowHtml(video); } }); Then further down is the code for calling the individual marker image. I would like to add another image to this list - would I start out by calling the new object 'camera-red.image' or something similar? function initialize() { if (GBrowserIsCompatible()) { map = new GMap2(document.getElementById("mapDiv")); map.setCenter(new GLatLng(51.52484592590448, -0.13345599174499512), 17); map.setUIToDefault(); var uclvtSatMapType = createUclVTSatMapType() map.addMapType(uclvtSatMapType); map.setMapType(uclvtSatMapType); camera = new GIcon(G_DEFAULT_ICON); camera.image = "ucl-video.png"; camera.iconSize = new GSize(32,37); camera.iconAnchor = new GPoint(16,35); camera.infoWindowAnchor = new GPoint(16,2); addMarkersToMap(); } The actual map can be found here: link text Thanks. Gray

    Read the article

  • linq to xml selection/update

    - by gleasonomicon
    If I have the following XML, how would I use LINQ to XML to blank out the date fields in each video node? I want to do this so that I can compare documents in a unit test. <?xml version="1.0" encoding="utf-8" standalone="yes" ?> <main> <videos> <video> <id>00000000-0000-0000-0000-000000000000</id> <title>Video Title</title> <videourl>http://sample.com</videourl> <thumbnail>http://sample.com</thumbnail> <dateCreated>2011-01-12T18:54:56.7318386-05:00</dateCreated> <dateModified>2011-02-12T18:54:56.7318386-05:00</dateModified> <Numbers> <Number>28</Number> <Number>78</Number> </Numbers> </video> <video> <id>00000000-0000-0000-0000-000000000000</id> <title>Video Title</title> <videourl>http://sample.com</videourl> <thumbnail>http://sample.com</thumbnail> <dateCreated>2011-01-12T18:54:56.7318386-05:00</dateCreated> <dateModified>2011-02-12T18:54:56.7318386-05:00</dateModified> <Numbers> <Number>28</Number> <Number>78</Number> </Numbers> </video> </videos>
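    A minimal sketch of one way to do it with LINQ to XML, assuming the document is loaded from a string named xml and the element names are exactly as in the sample above (the BlankDates method name is just illustrative):

      using System.Xml.Linq;

      static string BlankDates(string xml)
      {
          // Blank the date elements in every <video> node so two documents can
          // be compared without the timestamps getting in the way.
          XDocument doc = XDocument.Parse(xml);
          foreach (XElement video in doc.Descendants("video"))
          {
              video.Element("dateCreated").Value = string.Empty;
              video.Element("dateModified").Value = string.Empty;
          }
          return doc.ToString();
      }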

    Read the article

  • Lenovo Wi-Fi Replacement

    - by user22910
    I recently got my T500, which came with a Wi-Fi card that gets a very poor signal: the ThinkPad BGN, a Realtek chipset. I would like to replace my Wi-Fi card with either the Intel WiFi Link 5100 or 5300. However, I read somewhere that Lenovo specifically "whitelist" their Wi-Fi cards to only work with their laptops. I could not find any of the Intel Wi-Fi cards, or indeed any Wi-Fi cards at all, on the Lenovo site. So I went hunting around on Amazon and found several sellers. Plus, what sort of card do I require? There is a difference between mini cards and full-sized cards, and I do not know which one my laptop supports. Here are the specifications for my laptop: http://privatepaste.com/8b0537bce0 I would like confirmation of which of the specific cards posted below will work on my laptop (or which one you recommend): Intel WiFi Link 5300; Intel WiFi Link 5100 - Network adapter - PCI Express Mini Card - 802.11b, 802.11a, 802.11g, 802.11n (draft 2.0); Intel WiFi Link 5100 - Network adapter - PCI Express Half Mini Card - 802.11b, 802.11a, 802.11g, 802.11n (draft); Intel WiFi Link 5300 - Network adapter - PCI Express Mini Card - 802.11b, 802.11a, 802.11g, 802.11n (draft)

    Read the article

  • How to multiseat with HW 3d accel on CentOS 6.3 Final?

    - by user35070
    I would like to set up a multiseat configuration on CentOS 6.3 (two video cards, two keyboards, two mice, two monitors) and have hardware-accelerated 3D on both monitors. 3D HW acceleration rules out Xephyr. I saw somewhere that recent versions of GDM (3.3 and newer?) don't support multiseat, so do I have to install KDM to make this work? If I just create a duplicate section with new device identifiers in my xorg.conf file, will this 'just work'? Using different ports on the same video card and separate keyboards, mice, and displays, the result was a desktop which spanned both monitors, with both keyboards and mice acting as the same input in the GUI. I will power down and put in the new video card and report on the results soon. Both video cards are NVIDIA. UPDATE: After putting in another NVIDIA video card, the default behavior (before changing xorg.conf) is that one screen works normally, and both mice and keyboards are connected to it. After changing xorg.conf and the display manager to KDM and following the directions here https://help.ubuntu.com/community/MultiseatX#Ubuntu_10.04_.28Lucid.29 , I have 2 mirrored screens connected to separate video cards, DRI enabled, and 2 mice both connected to the same pointer. Keyboards don't do anything; however, I probably just need to fix a setting in xorg.conf. I would still like to get multiseat functionality, e.g. separate screens with separate input devices. I have verified that the separate X processes are running (see page above) using 'ps aux | grep X [01]'.
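    For the "duplicate section" question, a heavily simplified xorg.conf fragment of the classic per-seat approach (the BusIDs are placeholders taken from lspci, and this is a sketch of the structure rather than a tested configuration): each seat gets its own Device, Screen and ServerLayout, and the display manager then starts one X server per layout, e.g. one with -layout seat0 and one with -layout seat1, typically together with -sharevts.

      Section "Device"
          Identifier "gpu0"
          Driver     "nvidia"
          BusID      "PCI:1:0:0"    # first card (placeholder BusID)
      EndSection

      Section "Device"
          Identifier "gpu1"
          Driver     "nvidia"
          BusID      "PCI:2:0:0"    # second card (placeholder BusID)
      EndSection

      Section "Screen"
          Identifier "screen0"
          Device     "gpu0"
      EndSection

      Section "Screen"
          Identifier "screen1"
          Device     "gpu1"
      EndSection

      Section "ServerLayout"
          Identifier "seat0"
          Screen     "screen0"
      EndSection

      Section "ServerLayout"
          Identifier "seat1"
          Screen     "screen1"
      EndSection

      # Input devices also have to be tied to each layout (per-seat InputDevice
      # sections, with input hotplug disabled), which is what the MultiseatX
      # guide linked above configures; without that, all keyboards and mice keep
      # feeding the same seat.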

    Read the article

  • What parts should I get for an ASRock x58 Extreme motherboard

    - by Brad Gilbert
    I just received an ASRock X58 Extreme motherboard, for my post on this question. It was a 2009 Tom's Hardware recommended buy. It is a Core i7 motherboard with an X58 Express chipset, and it uses DDR3 RAM. What I want to know is: what parts should I get to finish it off? I'm looking for some good bargains, because of a lack of funds. The most taxing game I will probably play on it is OpenTTD. The only parts I currently have that are compatible: a Dynex 400W power supply (it appears to be an ATX 2.1 power supply, with the addition of a -5 rail, apparently designed to be compatible with most ATX-style motherboards); several PCI add-in cards (mostly 10/100 network cards, some sound cards, and some video cards with a VGA connector); and plenty of PATA drives (8 GB - 80 GB hard drives, a dozen or so CD-ROM drives, only a handful of which are CD-RW, and one DVD-ROM drive). I have one LCD with a 15-pin VGA connector, which I salvaged from the dump. The only thing wrong with it was some dead capacitors. It also has a stuck pixel.

    Read the article

  • Hardware and network infrastructure for running a gaming server on VirtualGL

    - by archer
    Found a nice project, VirtualGL (http://www.virtualgl.org/). I tried to run 3D games (EVE Online, Prototype) on a server and display the output on a thin client over a 100Mbps network. Server: Gentoo Linux on an AMD Phenom II X6 3.4GHz, 8GB RAM, 2x NVIDIA 9800 GTX, in a single session with a display resolution of 1024x768 on the client. Performance is very promising. I am going to increase the network speed to 1Gbps (using either Ethernet or fiber) and run 5-6 clients simultaneously. My questions are: a) What would be better for the network - 1Gbps Ethernet or fiber (clients are distributed within at most 20m of the server)? Is a managed switch a must for better network performance? b) Should I increase the number of video cards in SLI on the server (I am going to use a Gigabyte GA-890FXA-UD7, which has 6 PCI Express slots [2 x4, 2 x8 and 2 x16])? Will it impact performance significantly? If I need to increase the number of video cards, what would be better: 2 banks of video cards with 3 per bank using SLI, or 3 banks with 2 per bank? Would Linux recognize that and properly use all banks of video cards? c) Any suggestions on good thin clients supporting 1920x1080 HDMI video and a 1Gbps network? I understand that my questions can't be answered clearly (unless someone has already managed to use this kind of stuff ;)), although any suggestions would be very helpful.

    Read the article

  • convert decrypted .vobs to .avi with ffmpeg on ubuntu

    - by Arcath
    I have a .vob file that has been ripped from a DVD. When I watch the .vob it's very good quality video with 5.1 English audio, but when I convert it with ffmpeg I get rubbish video and mono French audio. That was using this command: ffmpeg -i /samba/ripping/vobs/12161840#2.vob -f avi /samba/ripping/avis/test.avi I've tried a few different variations on that, but it never comes back with anything good, just bigger files with bad video and the wrong sound. I know the video's good and the correct audio streams exist, so how do I select a 5.1 track and get good video? ffmpeg gives the .vob details as: Input #0, mpeg, from '/samba/ripping/vobs/12161840#2.vob': Duration: 00:42:05.56, start: 0.287267, bitrate: 5738 kb/s Stream #0.0[0x1e0]: Video: mpeg2video, yuv420p, 720x576 [PAR 64:45 DAR 16:9], 8436 kb/s, 25 fps, 25 tbr, 90k tbn, 50 tbc Stream #0.1[0x80]: Audio: ac3, 48000 Hz, 5.1, s16, 384 kb/s Stream #0.2[0x81]: Audio: ac3, 48000 Hz, 5.1, s16, 384 kb/s Stream #0.3[0x82]: Audio: ac3, 48000 Hz, mono, s16, 192 kb/s Output #0, avi, to '/samba/ripping/avis/test.avi': Metadata: ISFT : Lavf52.64.2 Stream #0.0: Video: mpeg4, yuv420p, 720x576 [PAR 64:45 DAR 16:9], q=2-31, 200 kb/s, 25 tbn, 25 tbc Stream #0.1: Audio: mp2, 48000 Hz, mono, s16, 64 kb/s Stream mapping: Stream #0.0 -> #0.0 Stream #0.3 -> #0.1
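    A hedged sketch of the direction to try (stream numbers come from the listing above, input.vob / output.avi stand in for the paths used above, and option names vary between ffmpeg versions, so treat this as a starting point rather than a verified command line): map the streams explicitly and raise the video bitrate so the 200 kb/s default is not used.

      # -map picks streams explicitly: 0.0 is the MPEG-2 video, 0.1 the first 5.1
      # AC-3 track (0.2 would be the other 5.1 track; 0.3 is the mono track the
      # default mapping chose). -b raises the video bitrate; -acodec copy keeps
      # the AC-3 5.1 audio untouched instead of downmixing it to mono mp2.
      ffmpeg -i input.vob \
          -map 0.0 -map 0.1 \
          -vcodec mpeg4 -b 2000k \
          -acodec copy \
          -f avi output.avi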

    Read the article

  • Keeping Xv Overlay configuration throughout an X session.

    - by kriss
    After upgrading my Linux system from Ubuntu 9.04 to Ubuntu 10.10, I succeeded in correcting most problems (all related to Intel 82865G Integrated Graphics Adapter support; compiz is still not working, but that's another matter), but for one problem I only have a partial solution. Whenever I play a video, the colors are much too saturated. This is really a problem for skin tones, which appear reddish (everyone seems to be coming back from a ski vacation with deep sunburn). As this effect only occurs with videos, not with pictures, I finally figured out it was related to the Xv video overlay configuration, and I can correct it by typing: xvattr -a XV_SATURATION -v 120 This changes the default saturation value, which is 500 and much too high in my case; by eye the correct value seems to be between 100 and 150. Now my problem is that I have to type the above command each time I play a video. If I type it before running the video it has no effect; if I close the video and open a new one, I have to type it again, etc. I tried to put it in Xsession and (logically) it has no effect either. How can I get the correct setting whenever I play a video, without typing the above command every time?
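    One workaround sketch, given that the attribute only takes effect while a video is actually playing: wrap the player in a small script that starts playback and then applies the setting once the Xv port is in use. The 2-second delay and the choice of mplayer are assumptions; substitute whatever player and delay work on the machine.

      #!/bin/sh
      # Launch the player, give it a moment to grab the Xv port, then lower the
      # overlay saturation on that port and wait for playback to finish.
      mplayer "$@" &
      sleep 2
      xvattr -a XV_SATURATION -v 120
      wait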

    Read the article

  • PC power supply & normal range for voltages reported in BIOS hardware monitor?

    - by Chris W. Rea
    I'm trying to diagnose whether my computer has an ample power supply. Sometimes when I play a video-intensive game, both monitors lose the video signal, even though the computer stays on and sound keeps playing. A theory I have is: the video card isn't getting sufficient power. I can't imagine it's overheating, because the machine is well-ventilated and the video card isn't hot to the touch when this happens. Anyway, in my PC's BIOS there's a Hardware Monitor page, and among other voltages reported (such as CPU, DRAM, South Bridge, etc.) I can see the following values: 3.3V rail: 3.152V; 5V rail: 4.944V; 12V rail: 11.872V. Are those the voltages used by peripherals? What voltage should I be referencing if I want to know what my video card (PCI Express) is consuming? What is the normal range of values reported for those? My values above appear to be under by approximately 4.5%, 1.1%, and 1.1% respectively. Is that cause for concern? How else should I be determining if my power supply is "right-sized" for my PC and video card, or am I perhaps barking up the wrong tree?

    Read the article

  • ATI firepro will not detect a second DVI-D monitor

    - by John
    OK, so, weird issue here. I have previously been running 6 screens off of 3 of the older ATI FirePro graphics cards, but they had a problem with the heat sink getting too hot and warping the PCB, resulting in total failure of the card. To replace my three dead cards I purchased a newer-type ATI FirePro with the redesigned heat sink. I'm only using one at the moment, to make sure they've fixed the problem before I waste more money on 2 more cards, but this is where things start to get weird. The FirePros only have one port on them; they connect to two monitors via a splitter cable going from the one port to two DVI connectors for the screens. When I plug two identical monitors in via their DVI inputs, no matter what I do Windows and Catalyst will only detect one screen. However, if I use the VGA input on one of the screens, with a VGA-to-DVI adaptor to plug it into the card, it works fine. This confuses me greatly. I'm currently using the ATI FirePro 2270 graphics card with identical Dell U2311H screens. I can post the rest of the system spec as well if needed, but I wouldn't have thought it would make much difference, as it had no problem handling 6 screens before the graphics cards failed. Naturally both Catalyst and the ATI drivers are the most current version. ATI tech support has been absolutely zero help; they seemed to get stumped as soon as I verified that both screens were plugged in and connected properly. Anyone have any ideas?

    Read the article

  • Figuring out which PC part is faulty

    - by Davy8
    I have an odd scenario and I'm having trouble figuring out which is the faulty component. First of all, the video doesn't work; the monitor says it's not getting a signal. The monitor's not faulty (it works on another computer), so the first suspect was the video card. However, 2 things make me think it's not the video card. (I don't have another machine with PCIe around to test definitively.) First, the GPU fan is spinning, so it's getting power. Second, I tried putting in an older PCI video card that is known to be working (pulled out of another working machine) and there's still no video. Normally if it's not the video card I'd suspect the motherboard, but everything's getting power on the mobo, so I'm not sure. The case apparently doesn't have a system speaker, so I can't hear any of the diagnostic beeps either. I'm also not sure whether a faulty CPU would cause no image at all. The parts are brand new, so something's going to get RMA'd, but I'm not sure which component is to blame in this case. (Only slightly related, but I also accidentally put too much thermal paste on the CPU. The fan/heatsink instructions said to use the whole tube, which seemed like a lot compared to previous experience, and as I started squeezing I knew it was definitely too much and stopped at about 1/3, but against my better judgement I didn't wipe any off. I'm not sure whether that would cause problems other than not cooling as effectively as it should.)

    Read the article

  • Moving from 1 Linux Partition to Many over USB Mount

    - by Mistiry
    We have devices which use Compact Flash for storage. They work OK, but we recently got industrial-grade CF cards to start using. One of the major problems we get is corruption of the flash card. As it is now, these flash cards run Debian with everything in a single partition. We want to have multiple partitions on the new industrial CF cards to help avoid some of the corruption problems. I booted up the device and attached a USB CF reader. I then used fdisk to partition the CF card in the USB reader. How can I move the data to these partitions so that it works? I have a partition for each of these directories: /lib /var /root /boot /tmp /home /etc / swap space I imagine I can't just use rsync - do I need to attach a second CF reader with a copy of the CF card, so that it's not active and in use, and then copy from the first reader to the second? How will the system know where to find its files? I know I'd have to change fstab, but that resides in /etc, which will be on a separate partition... how will it find the fstab file if it can't find /etc? And what about grub? I'm at a loss; perhaps it's just because I'm under the weather, or I'm just missing a piece of logic here. Any help is greatly appreciated. This is somewhat urgent, as our existing stock is nearing its end and we don't want to purchase anything but these industrial cards, but we need to get it working with partitions.
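    A rough sketch of the copy step, under the assumption that the new CF card shows up as /dev/sdb in the USB reader (the partition numbers are placeholders; match them to the layout created with fdisk). The running system can be copied with rsync as long as the virtual filesystems are excluded. As for finding the files afterwards: the kernel mounts / from the root= boot parameter, and everything else comes from the /etc/fstab that lives on that new root partition, which is why fstab and grub are updated on the new card before booting from it.

      # Mount the new root, then each target partition underneath it:
      mkdir -p /mnt/newroot
      mount /dev/sdb1 /mnt/newroot                    # future /
      mkdir -p /mnt/newroot/boot /mnt/newroot/etc /mnt/newroot/var
      mount /dev/sdb2 /mnt/newroot/boot
      mount /dev/sdb3 /mnt/newroot/etc
      # ...repeat for /var, /lib, /home, /tmp and /root

      # Copy the live system, skipping virtual filesystems and the mount point
      # (assumes udev repopulates /dev at boot, as on current Debian):
      rsync -aH --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/mnt / /mnt/newroot/

      # Then edit /mnt/newroot/etc/fstab to list the new partitions and reinstall
      # grub onto the new card, e.g.:
      grub-install --root-directory=/mnt/newroot /dev/sdb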

    Read the article

  • 6 Reasons Why You Can’t Move Your Cell Phone To Any Carrier You Want

    - by Chris Hoffman
    You can buy a laptop or Wi-Fi tablet and use it on Wi-Fi anywhere in the world, so why are cell phones and devices with mobile data not portable between different cellular networks in the same country? Unlike with Wi-Fi, there are many different competing cellular network standards — both around the world and within countries. Cellular carriers also like locking you to their specific network and making it difficult to move. That’s what contracts are for.

    Phone Locking: Many phones are sold locked to a specific network. When you buy a phone from a cellular carrier, they often lock that phone to their network so you can’t take it to a competitor’s network. That’s why you’ll often need to unlock a phone before you can move it to a different cellular provider or take it to a different country and use it on a local provider instead of roaming. Cellular carriers will generally unlock your phone for you as long as you’re no longer in a contract with them. However, unlocking a cell phone you’ve paid for without your carrier’s permission is currently a crime in the USA.

    GSM vs. CDMA: Some cellular networks use the GSM (Global System for Mobile Communications) standard, while some use CDMA (Code-division multiple access). Worldwide, most cellular networks use GSM. In the USA, both GSM and CDMA are popular. Verizon, Sprint, and other carriers that use their networks use CDMA. AT&T, T-Mobile, and other carriers that use their networks use GSM. These are two competing standards and are not interoperable. This means you can’t simply take a phone from Verizon to T-Mobile, or from AT&T to Sprint. These carriers have incompatible phones.

    CDMA Restrictions: CDMA is more restricted than GSM. GSM phones have SIM cards. Simply open the phone, pop out the SIM card, and pop in a new SIM card to switch carriers. (In reality, it’s more complicated thanks to phone locking and other factors here.) CDMA phones don’t have removable modules like this. All CDMA phones ship locked to a specific network and you’d have to get both your old carrier and your new carrier to cooperate to switch phones between them. In reality, many people just consider CDMA phones eternally locked to a specific carrier.

    Frequencies: Different cellular networks throughout the USA and the rest of the world use different frequencies. These radio frequencies have to be supported by your phone’s hardware or your phone simply can’t work on a network using those frequencies. Many GSM phones support three or four bands of frequencies — 900/1800/1900 MHz, 850/1800/1900 MHz, or 850/900/1800/1900 MHz. These are sometimes called “world phones” because they allow easier roaming. This allows the manufacturer to produce a phone that will support all GSM networks in the world and allows their customers to travel with those phones. If your phone doesn’t support the appropriate frequencies, it won’t work on certain networks.

    LTE Bands: When it comes to newer, faster LTE networks, different frequencies are still a concern. LTE frequencies are generally known as “LTE bands.” To use a smartphone on a certain LTE network, that smartphone will have to support that LTE network’s frequency. Different models of phones are often created to work on different LTE networks around the world. However, phones are generally supporting more and more LTE networks and becoming more and more interoperable over time.

    SIM Card Sizes: The SIM cards used in GSM phones come in different sizes. Newer phones use smaller SIM cards to save space and be more compact.
This isn’t a big obstacle, as the different sizes of SIM cards — full-size SIM, mini-SIM, micro-SIM, and nano-SIM are actually compatible. The only difference between them is the size of the plastic card surrounding the SIM’s chip. The actual chip is the same size between all the SIM cards. This means you can take an old SIM card and cut the plastic off until it becomes a smaller-size SIM card that fits in a modern phone. Or, you can take a smaller-size SIM card and insert it into a tray so that it becomes a larger-size SIM card that fits in an older phone. Be aware that it’s very possible to damage your SIM card and make it not work properly by cutting it to the wrong dimensions. Your cellular carrier will often be able to cut your SIM card for you or give you a new one if you want to use an old SIM card in a new phone. Hopefully they won’t overcharge you for this service, too. Be sure to check what types of networks, frequencies, and LTE bands your phone supports before trying to move it between networks. You may have to buy a new phone when moving between certain cellular carriers. Image Credit: Morgan on Flickr, 22n on Flickr

    Read the article

  • What is a Delphi version of the C++ header for the DVP7010B video card DLL?

    - by grzegorz1
    I need help with converting c++ header file to delphi. I spent several days on this problem without success. Below is the original header file and my Delphi translation. C++ header #if _MSC_VER > 1000 #pragma once #endif // _MSC_VER > 1000 #ifdef DVP7010BDLL_EXPORTS #define DVP7010BDLL_API __declspec(dllexport) #else #define DVP7010BDLL_API __declspec(dllimport) #endif #define MAXBOARDS 4 #define MAXDEVS 4 #define ID_NEW_FRAME 37810 #define ID_MUX0_NEW_FRAME 37800 #define ID_MUX1_NEW_FRAME 37801 #define ID_MUX2_NEW_FRAME 37802 #define ID_MUX3_NEW_FRAME 37803 typedef enum { SUCCEEDED = 1, FAILED = 0, SDKINITFAILED = -1, PARAMERROR = -2, NODEVICES = -3, NOSAMPLE = -4, DEVICENUMERROR = -5, INPUTERROR = -6, // VERIFYHWERROR = -7 } Res; typedef enum tagAnalogVideoFormat { Video_None = 0x00000000, Video_NTSC_M = 0x00000001, Video_NTSC_M_J = 0x00000002, Video_PAL_B = 0x00000010, Video_PAL_M = 0x00000200, Video_PAL_N = 0x00000400, Video_SECAM_B = 0x00001000 } AnalogVideoFormat; typedef enum { SIZEFULLPAL=0, SIZED1, SIZEVGA, SIZEQVGA, SIZESUBQVGA } VideoSize; typedef enum { STOPPED = 1, RUNNING = 2, UNINITIALIZED = -1, UNKNOWNSTATE = -2 } CapState; class IDVP7010BDLL { public: int AdvDVP_CreateSDKInstence(void **pp); virtual int AdvDVP_InitSDK() PURE; virtual int AdvDVP_CloseSDK() PURE; virtual int AdvDVP_GetNoOfDevices(int *pNoOfDevs) PURE; virtual int AdvDVP_Start(int nDevNum, int SwitchingChans, HWND Main, HWND hwndPreview) PURE; virtual int AdvDVP_Stop(int nDevNum) PURE; virtual int AdvDVP_GetCapState(int nDevNum) PURE; virtual int AdvDVP_IsVideoPresent(int nDevNum, BOOL* VPresent) PURE; virtual int AdvDVP_GetCurFrameBuffer(int nDevNum, int VMux, long* bufSize, BYTE* buf) PURE; virtual int AdvDVP_SetNewFrameCallback(int nDevNum, int callback) PURE; virtual int AdvDVP_GetVideoFormat(int nDevNum, AnalogVideoFormat* vFormat) PURE; virtual int AdvDVP_SetVideoFormat(int nDevNum, AnalogVideoFormat vFormat) PURE; virtual int AdvDVP_GetFrameRate(int nDevNum, int *nFrameRate) PURE; virtual int AdvDVP_SetFrameRate(int nDevNum, int SwitchingChans, int nFrameRate) PURE; virtual int AdvDVP_GetResolution(int nDevNum, VideoSize *Size) PURE; virtual int AdvDVP_SetResolution(int nDevNum, VideoSize Size) PURE; virtual int AdvDVP_GetVideoInput(int nDevNum, int* input) PURE; virtual int AdvDVP_SetVideoInput(int nDevNum, int input) PURE; virtual int AdvDVP_GetBrightness(int nDevNum, int input, long *pnValue) PURE; virtual int AdvDVP_SetBrightness(int nDevNum, int input, long nValue) PURE; virtual int AdvDVP_GetContrast(int nDevNum, int input, long *pnValue) PURE; virtual int AdvDVP_SetContrast(int nDevNum, int input, long nValue) PURE; virtual int AdvDVP_GetHue(int nDevNum, int input, long *pnValue) PURE; virtual int AdvDVP_SetHue(int nDevNum, int input, long nValue) PURE; virtual int AdvDVP_GetSaturation(int nDevNum, int input, long *pnValue) PURE; virtual int AdvDVP_SetSaturation(int nDevNum, int input, long nValue) PURE; virtual int AdvDVP_GPIOGetData(int nDevNum, int DINum, BOOL* value) PURE; virtual int AdvDVP_GPIOSetData(int nDevNum, int DONum, BOOL value) PURE; }; Delphi unit IDVP7010BDLL_h; interface uses Windows, Messages, SysUtils, Classes; //{$if _MSC_VER > 1000} //pragma once //{$endif} // _MSC_VER > 1000 {$ifdef DVP7010BDLL_EXPORTS} //const DVP7010BDLL_API = __declspec(dllexport); {$else} //const DVP7010BDLL_API = __declspec(dllimport); {$endif} const MAXDEVS = 4; MAXMUXS = 4; ID_NEW_FRAME = 37810; ID_MUX0_NEW_FRAME = 37800; ID_MUX1_NEW_FRAME = 37801; ID_MUX2_NEW_FRAME = 37802; ID_MUX3_NEW_FRAME = 
37803; // TRec SUCCEEDED = 1; FAILED = 0; SDKINITFAILED = -1; PARAMERROR = -2; NODEVICES = -3; NOSAMPLE = -4; DEVICENUMERROR = -5; INPUTERROR = -6; // TRec // TAnalogVideoFormat Video_None = $00000000; Video_NTSC_M = $00000001; Video_NTSC_M_J = $00000002; Video_PAL_B = $00000010; Video_PAL_M = $00000200; Video_PAL_N = $00000400; Video_SECAM_B = $00001000; // TAnalogVideoFormat // TCapState STOPPED = 1; RUNNING = 2; UNINITIALIZED = -1; UNKNOWNSTATE = -2; // TCapState type TCapState = Longint; TRes = Longint; TtagAnalogVideoFormat = DWORD; TAnalogVideoFormat = TtagAnalogVideoFormat; PAnalogVideoFormat = ^TAnalogVideoFormat; TVideoSize = ( SIZEFULLPAL, SIZED1, SIZEVGA, SIZEQVGA, SIZESUBQVGA); PVideoSize = ^TVideoSize; P_Pointer = ^Pointer; TIDVP7010BDLL = class function AdvDVP_CreateSDKInstence(pp: P_Pointer): integer; virtual; stdcall; abstract; function AdvDVP_InitSDK():Integer; virtual; stdcall; abstract; function AdvDVP_CloseSDK():Integer; virtual; stdcall; abstract; function AdvDVP_GetNoOfDevices(pNoOfDevs : PInteger) :Integer; virtual; stdcall; abstract; function AdvDVP_Start(nDevNum : Integer; SwitchingChans : Integer; Main : HWND; hwndPreview: HWND ) :Integer; virtual; stdcall; abstract; function AdvDVP_Stop(nDevNum : Integer ):Integer; virtual; stdcall; abstract; function AdvDVP_GetCapState(nDevNum : Integer ):Integer; virtual; stdcall; abstract; function AdvDVP_IsVideoPresent(nDevNum : Integer; VPresent : PBool) :Integer; virtual; stdcall; abstract; function AdvDVP_GetCurFrameBuffer(nDevNum : Integer; VMux : Integer; bufSize : PLongInt; buf : PByte) :Integer; virtual; stdcall; abstract; function AdvDVP_SetNewFrameCallback(nDevNum : Integer; callback : Integer ) :Integer; virtual; stdcall; abstract; function AdvDVP_GetVideoFormat(nDevNum : Integer; vFormat : PAnalogVideoFormat) :Integer; virtual; stdcall; abstract; function AdvDVP_SetVideoFormat(nDevNum : Integer; vFormat : TAnalogVideoFormat ) :Integer; virtual; stdcall; abstract; function AdvDVP_GetFrameRate(nDevNum : Integer; nFrameRate : Integer) :Integer; virtual; stdcall; abstract; function AdvDVP_SetFrameRate(nDevNum : Integer; SwitchingChans : Integer; nFrameRate : Integer) :Integer; virtual; stdcall; abstract; function AdvDVP_GetResolution(nDevNum : Integer; Size : PVideoSize) :Integer; virtual; stdcall; abstract; function AdvDVP_SetResolution(nDevNum : Integer; Size : TVideoSize ) :Integer; virtual; stdcall; abstract; function AdvDVP_GetVideoInput(nDevNum : Integer; input : PInteger) :Integer; virtual; stdcall; abstract; function AdvDVP_SetVideoInput(nDevNum : Integer; input : Integer) :Integer; virtual; stdcall; abstract; function AdvDVP_GetBrightness(nDevNum : Integer; input: Integer; pnValue : PLongInt) :Integer; virtual; stdcall; abstract; function AdvDVP_SetBrightness(nDevNum : Integer; input: Integer; nValue : LongInt) :Integer; virtual; stdcall; abstract; function AdvDVP_GetContrast(nDevNum : Integer; input: Integer; pnValue : PLongInt) :Integer; virtual; stdcall; abstract; function AdvDVP_SetContrast(nDevNum : Integer; input: Integer; nValue : LongInt) :Integer; virtual; stdcall; abstract; function AdvDVP_GetHue(nDevNum : Integer; input: Integer; pnValue : PLongInt) :Integer; virtual; stdcall; abstract; function AdvDVP_SetHue(nDevNum : Integer; input: Integer; nValue : LongInt) :Integer; virtual; stdcall; abstract; function AdvDVP_GetSaturation(nDevNum : Integer; input: Integer; pnValue : PLongInt) :Integer; virtual; stdcall; abstract; function AdvDVP_SetSaturation(nDevNum : Integer; input: Integer; nValue : 
LongInt) :Integer; virtual; stdcall; abstract; function AdvDVP_GPIOGetData(nDevNum : Integer; DINum:Integer; value : PBool) :Integer; virtual; stdcall; abstract; function AdvDVP_GPIOSetData(nDevNum : Integer; DONum:Integer; value : Boolean) :Integer; virtual; stdcall; abstract; end; function IDVP7010BDLL : TIDVP7010BDLL ; stdcall; implementation function IDVP7010BDLL; external 'DVP7010B.dll'; end.

    Read the article

  • MPMoviePlayerController fullscreen movie inside a UIWebView

    - by Wakazors
    Hi, I'm having a problem with UIWebView and MPMoviePlayerController: my UIWebView has a movie inside the HTML (it's a local HTML file); I'm using HTML5 and a video tag for the video. The problem is: the user can set the video to play inline, directly in the HTML, or he can tap the fullscreen button, but I need to know if the video is playing fullscreen. I've tried to use MPMoviePlayerDidEnterFullscreenNotification but with no success. Does anybody know how to get this notification from the web view? Thanks in advance

    Read the article

  • Android - Buffering in MediaPlayer

    - by Chris
    I am using MediaPlayer to play a video in my app. The video takes a while to buffer, and the VideoView is blank during that time. Is there a way to start the buffering while the user is still on the previous screen, so that when he comes to the video-playing screen the video is ready to play? Thanks Chris
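    A sketch of one way to approach it (the VideoHolder class and its names are illustrative, not an existing API): create the MediaPlayer and call prepareAsync() while the user is still on the previous screen, keep it somewhere both screens can reach, and attach it to the playback screen's surface when that screen appears.

      import android.media.MediaPlayer;

      import java.io.IOException;

      // Illustrative holder: starts buffering early and hands the player over later.
      public final class VideoHolder {
          public static MediaPlayer player;

          public static void preload(String url) {
              player = new MediaPlayer();
              try {
                  player.setDataSource(url);
                  player.prepareAsync();   // buffering starts in the background
              } catch (IOException e) {
                  player = null;           // fall back to preparing on the video screen
              }
          }
      }

      // Later, in the video Activity, once its SurfaceView is ready:
      //   VideoHolder.player.setDisplay(surfaceHolder);
      //   VideoHolder.player.start();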

    Read the article

  • Ruby on Rails has_one Model Not Supplying ID Column

    - by Metric Scantlings
    I have a legacy rails (version 1.2.3) app which runs without issue on a number of servers (not to mention my local environment). Deployed to its newest server, though, and I now get ActiveRecord::StatementInvalid: Mysql::Error: #23000Column 'video_id' cannot be null errors. Below are the models/relationships, simplified: class Video < ActiveRecord::Base has_one(:user, :dependent => :destroy) end class User < ActiveRecord::Base belongs_to(:video) end And below is a rails console transcript of the relationships failing: >> video = Video.create(:title => 'New Video') => #<Video:0xb6d5e31c>... >> video.id => 5 >> video.user = User.create(:name => 'Tester') ActiveRecord::StatementInvalid: Mysql::Error: #23000Column 'video_id' cannot be null: INSERT INTO users (`name`, `video_id`) VALUES('Tester', NULL) from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/connection_adapters/abstract_adapter.rb:128:in `log' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/connection_adapters/mysql_adapter.rb:243:in `execute' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/connection_adapters/mysql_adapter.rb:253:in `insert' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/base.rb:1811:in `create_without_callbacks' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/callbacks.rb:254:in `create_without_timestamps' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/timestamp.rb:39:in `create' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/base.rb:1789:in `create_or_update_without_callbacks' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/callbacks.rb:242:in `create_or_update' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/base.rb:1545:in `save_without_validation' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/validations.rb:752:in `save_without_transactions' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/transactions.rb:129:in `save' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/connection_adapters/abstract/database_statements.rb:59:in `transaction' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/transactions.rb:95:in `transaction' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/transactions.rb:121:in `transaction' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/transactions.rb:129:in `save' from /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/base.rb:451:in `create' from (irb):3 from :0 Has anyone else come across ActiveRecord not sending an ID when it clearly knows it?
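    One possible explanation and workaround, sketched below (it assumes the new server's MySQL enforces a NOT NULL constraint on users.video_id, for example via strict mode, which the other servers did not): with video.user = User.create(...), the INSERT for the user runs before the association can supply the foreign key, so building the user through the association sets video_id up front.

      video = Video.create(:title => 'New Video')

      # create_user / build_user come from has_one :user and include video_id in
      # the INSERT, instead of inserting first and assigning the id afterwards.
      user = video.create_user(:name => 'Tester')

      # or, to defer the save:
      # user = video.build_user(:name => 'Tester')
      # user.save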

    Read the article
