Search Results

Search found 6616 results on 265 pages for 'media coverage'.

Page 57/265 | < Previous Page | 53 54 55 56 57 58 59 60 61 62 63 64  | Next Page >

  • Use a media player in Linux just to play files from an iPod device (no sync, no manage, just play)?

    - by Somebody still uses you MS-DOS
    I have a 160 GB iPod Classic that I sync with my machine at home. I use Linux at work and want to just plug in my iPod and listen to the tracks, with all the playlists and such. I don't want to sync anything; I just want to listen to the tracks as if I were using the iPod itself. Why? Because this way all I need is the USB port. So I don't want to manage my iPod in Linux, I just want to play what's on it, as if it were a local library that happens to live on the iPod. (I've tried gtkpod: it shows my files, but I can't play, shuffle, etc. It would be nice to have a full audio player that handles everything as if it were a local library.)
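    A possible starting point, not an answer from the thread: an iPod Classic shows up as plain USB mass storage, and the audio files sit under iPod_Control/Music in folders with obfuscated names but intact tags, so any player that can scan a directory can play them. A rough shell sketch, where the mount point and the choice of player are assumptions:

        # tracks have scrambled names like F00/XQET.mp3, but their tags are readable
        ls /media/IPOD/iPod_Control/Music/
        # play everything on the device, shuffled; any directory-scanning player works here
        mpv --shuffle /media/IPOD/iPod_Control/Music/F*/*

    Playlists live in the iTunesDB database rather than as files, so reproducing them still needs an iPod-aware player.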

    Read the article

  • Quality-wise, is Windows Media Audio 10 Professional equivalent to WMA 9.2?

    - by Louis
    I noticed that for encoding CD rips, Zune still uses WMA 9.2 instead of WMA 10 Pro. For a given file, the highest-quality VBR setting looks like this: VBR Quality 98, 44 kHz, stereo, 1-pass VBR. If I encode the same file with WMA 10 Pro at the same settings, the resulting file is about 20% smaller. Using my ears, I'm unable to tell the difference, but I'm wondering if that was the goal of WMA 10 Pro (to be as good as WMA 9.2 at a lower bitrate). Is the quality of a WMA 10 Pro file equal to that of a WMA 9.2 file encoded with the same settings?

    Read the article

  • How would I write a terminal command to download a folder with wget from a Media Temple (gs) server?

    - by racl101
    I'm trying to download a folder using wget in the Terminal (I'm using a Mac, if that matters) because my FTP client sucks and keeps timing out; it doesn't stay connected for long. So I was wondering if I could use wget over the FTP protocol to download the directory in question from the server. I have searched around on the internet for this and have attempted to write the command, but it keeps failing. So, assuming the following:

        ftp username: [email protected]
        ftp host: ftp.s12345.gridserver.com
        ftp password: somepassword

    I have tried to write the command in the following ways:

        wget -r ftp://[email protected]:[email protected]/path/to/desired/folder/
        wget -r ftp://serveradmin:[email protected]/path/to/desired/folder/

    When I try the first way I get this error: Bad port number. When I try the second way I get a little further, but then I get this error:

        Resolving s12345.gridserver.com... 71.46.226.79
        Connecting to s12345.gridserver.com|71.46.226.79|:21... connected.
        Logging in as serveradmin ... Login incorrect.

    What could I be doing wrong?
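    A hedged workaround, not from the thread: the first attempt fails because the extra "@" inside the username confuses wget's URL parsing, and the second fails because "serveradmin" alone is not the full Media Temple (gs) FTP username. Passing the credentials as options sidesteps both problems; the host, path, and credentials below are simply copied from the question:

        # quote the user name so the shell and the URL parser never see the '@' characters
        wget -r --ftp-user='[email protected]' --ftp-password='somepassword' \
             ftp://ftp.s12345.gridserver.com/path/to/desired/folder/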

    Read the article

  • Random Access Violation Exception in WPF Application

    - by PT1984
    Hi, I am facing a weird problem while running regression tests on my WPF application: I am getting an AccessViolationException with a different stack trace each time.

    First:

        Message: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
        StackTrace: at MS.Win32.PresentationCore.UnsafeNativeMethods.MILUnknown.Release(IntPtr pIUnkown) at MS.Win32.PresentationCore.UnsafeNativeMethods.MILUnknown.ReleaseInterface(IntPtr& ptr) at System.Windows.Media.SafeMILHandle.ReleaseHandle() at System.Runtime.InteropServices.SafeHandle.InternalFinalize() at System.Runtime.InteropServices.SafeHandle.Dispose(Boolean disposing) at System.Runtime.InteropServices.SafeHandle.Finalize()
        Source: PresentationCore
        Type: System.AccessViolationException

    Second:

        Message: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
        StackTrace: at MS.Win32.PresentationCore.UnsafeNativeMethods.IMILBitmapEffect.GetOutput(SafeHandle THIS_PTR, UInt32 uiIndex, SafeMILHandle pContext, BitmapSourceSafeMILHandle& ppBitmapSource) at System.Windows.Media.Effects.BitmapEffect.GetOutput(SafeHandle unmanagedEffect, Int32 index, BitmapEffectRenderContext context) at System.Windows.Media.Effects.BitmapEffect.GetOutput(BitmapEffectInput input) at System.Windows.Media.Effects.BitmapEffectState.GetEffectOutput(Visual visual, RenderTargetBitmap& renderBitmap, Matrix worldTransform, Rect windowClip, Matrix& finalTransform) at System.Windows.Media.Effects.BitmapEffectVisualState.RenderBitmapEffect(Visual visual, Channel channel) at System.Windows.Media.Effects.BitmapEffectContent.ExecuteRealizationsUpdate() at System.Windows.Media.RealizationContext.RealizationUpdateSchedule.Execute() at System.Windows.Media.MediaContext.Render(ICompositionTarget resizedCompositionTarget) at System.Windows.Media.MediaContext.RenderMessageHandlerCore(Object resizedCompositionTarget) at System.Windows.Media.MediaContext.AnimatedRenderMessageHandler(Object resizedCompositionTarget) at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Boolean isSingleParameter) at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Boolean isSingleParameter, Delegate catchHandler) at System.Windows.Threading.Dispatcher.WrappedInvoke(Delegate callback, Object args, Boolean isSingleParameter, Delegate catchHandler) at System.Windows.Threading.DispatcherOperation.InvokeImpl() at System.Windows.Threading.DispatcherOperation.InvokeInSecurityContext(Object state) at System.Threading.ExecutionContext.runTryCode(Object userData) at System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode code, CleanupCode backoutCode, Object userData) at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state) at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state) at System.Windows.Threading.DispatcherOperation.Invoke() at System.Windows.Threading.Dispatcher.ProcessQueue() at System.Windows.Threading.Dispatcher.WndProcHook(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled) at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled) at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o) at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Boolean isSingleParameter) at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Boolean isSingleParameter, Delegate catchHandler) at System.Windows.Threading.Dispatcher.WrappedInvoke(Delegate callback, Object args, Boolean isSingleParameter, Delegate catchHandler) at System.Windows.Threading.Dispatcher.InvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Boolean isSingleParameter) at System.Windows.Threading.Dispatcher.Invoke(DispatcherPriority priority, Delegate method, Object arg) at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam) at MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg) at System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame) at System.Windows.Threading.Dispatcher.PushFrame(DispatcherFrame frame) at System.Windows.Threading.Dispatcher.Run() at System.Windows.Application.RunDispatcher(Object ignore) at System.Windows.Application.RunInternal(Window window) at System.Windows.Application.Run(Window window) at System.Windows.Application.Run() at Main()
        Source: PresentationCore
        Type: System.AccessViolationException

    In the Application event log I found the following entries:

        Dispatcher processing has been suspended, but messages are still being processed.
        Faulting application **.exe, version 1.0.0.*, stamp 4c08d288, faulting module wpfgfx_v0300.dll, version 3.0.6920.1427, stamp 488f3056, debug? 0, fault address 0x0012ec36.

    My application uses the Dispatcher from another thread to change control values, enable/disable controls, change visibility, etc.; that thread runs multiple times per second. Has anybody else faced this problem? Thanks in advance, -Prasad
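    A hedged note, not from the original thread: both traces involve the legacy BitmapEffect pipeline and MIL handle finalization, and BitmapEffect was deprecated in .NET 3.5 SP1 in favor of the hardware-accelerated Effect classes, so removing or replacing those effects is worth testing. It is also worth confirming that the background thread never touches controls directly and always marshals through the Dispatcher. A minimal C# sketch of that pattern; the control and member names are hypothetical, not the application's actual code:

        // requires: using System; using System.Windows.Threading;
        private void UpdateStatus(string text, bool enabled)
        {
            if (!statusLabel.Dispatcher.CheckAccess())
            {
                // called from the worker thread: queue the update on the UI thread and return
                statusLabel.Dispatcher.BeginInvoke(
                    DispatcherPriority.Background,
                    new Action(() => UpdateStatus(text, enabled)));
                return;
            }

            // now running on the UI thread, so touching the controls is safe
            statusLabel.Content = text;
            refreshButton.IsEnabled = enabled;
        }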

    Read the article

  • PHPUnit: Xdebug required

    - by poru
    Hello, I finished installing PHPUnit. It works, but I don't get a code coverage report. I'm working on Windows. My phpunit.xml:

        <phpunit bootstrap="./application/bootstrap.php" colors="false">
          <testsuite name="Application">
            <directory>./</directory>
          </testsuite>
          <filter>
            <whitelist>
              <directory suffix=".php">./application</directory>
              <directory suffix=".php">./library/Application</directory>
              <exclude>
                <directory suffix=".php">../application/libraries/Zend</directory>
                <directory suffix=".php">../application/controllers</directory>
                <directory suffix=".phtml">./application/</directory>
                <file>./application/Bootstrap.php</file>
              </exclude>
            </whitelist>
          </filter>
          <logging>
            <log type="coverage-html" target="./log/report" charset="UTF-8" yui="true"
                 highlight="true" lowUpperBound="50" highLowerBound="80" />
            <log type="testdox" target="./log/testdox.html" />
          </logging>
        </phpunit>

    If I run phpunit --configuration phpunit.xml on cmd, it works so far, but PHPUnit doesn't create a code coverage report. If I run phpunit --configuration phpunit.xml --coverage-html \log or phpunit --configuration phpunit.xml --coverage-html log, I get the error "The Xdebug extension is not loaded." But I installed it (version 2.0.5)! phpinfo() says it is installed, and var_dump(extension_loaded('xdebug')) returns true. I installed it as a Zend extension and also tried it as a normal extension. Both worked, but PHPUnit says every time that Xdebug is not loaded!
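    A hedged hint, not from the original post: phpinfo() in a browser reports on the web server's PHP, while PHPUnit runs against the command-line PHP, which on Windows often loads a different php.ini that has no Xdebug entry. Checking from the same console used for phpunit, and adding the extension to that php.ini, is the usual fix; the DLL path and file name below are assumptions:

        :: run in the same cmd window used for phpunit
        php --ini
        php -m | findstr /i xdebug

        ; in the php.ini reported above (thread-safe builds on PHP < 5.3 use zend_extension_ts)
        zend_extension = "C:\php\ext\php_xdebug-2.0.5-5.2.dll"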

    Read the article

  • Ask the Readers: Backing Your Files Up – Local Storage versus the Cloud

    - by Asian Angel
    Backing up important files is something that all of us should do on a regular basis, but may not give as much thought to as we should. This week we would like to know if you use local storage, cloud storage, or a combination of both to back up your files. Photo by camknows.

    For some people, local storage media may be the most convenient and/or affordable way to back up their files. Having those files stored on media under your control can also provide a sense of security and peace of mind. But storing your files locally may also have drawbacks if something happens to your storage media. So how do you know whether the benefits outweigh the disadvantages? Here are some possible pros and cons that may affect your decision to use local storage to back up your files:

    Local Storage

    Pros
      - You are in control of your data
      - Your files are portable and can go with you when needed if using external or flash drives
      - Files are accessible without an internet connection
      - You can easily add more storage capacity as needed (additional drives, etc.)

    Cons
      - You need to arrange room for your storage media (if you have multiple external drives, etc.)
      - Possible hardware failure
      - No access to your files if you forget to bring your storage media with you or it is too bulky to bring along
      - Theft and/or loss of home with all contents due to circumstances like fire

    If you are someone who is always on the go and needs to travel as lightly as possible, cloud storage may be the perfect way for you to back up and access your files. Perhaps your laptop has a hard-drive failure or gets stolen…unhappy events to be sure, but you will still have a copy of your files available. Perhaps a company wants to make sure their records, files, and other information are backed up off site in case of a major hardware or system failure…expensive and/or frustrating to fix if it happens, but once again there is a nice backup ready to go once things are fixed. As with local storage, here are some possible pros and cons that may influence your choice of cloud storage to back up your files:

    Cloud Storage

    Pros
      - No need to carry around flash or bulky external drives
      - All of your files are accessible wherever there is an internet connection
      - No need to deal with local storage media (or its upkeep)
      - Your files are still safe if your home is broken into or other unfortunate circumstances occur

    Cons
      - Your files and data are not 100% under your control
      - Possible hardware failure or loss of files on the part of your cloud storage provider (this could include a disgruntled employee wreaking havoc)
      - No access to your files if you do not have an internet connection
      - The cloud storage provider may eventually shut down due to financial hardship or other unforeseen circumstances
      - The possibility of your files and data being stolen by hackers due to a security breach on the part of your cloud storage provider

    You may also prefer to try to cover all of the possibilities by using both local and cloud storage to back up your files. If something happens to one, you always have the other to fall back on. Need access to those files at or away from home? As long as you have access to either your storage media or an internet connection, you are good to go.

    Maybe you are getting ready to choose a backup solution but are not sure which one would work better for you. Here is your chance to ask your fellow HTG readers which one they would recommend. Got a great backup solution already in place? Then be sure to share it with your fellow readers!

    Read the article

  • Samba smb.conf read only and read/write accounts

    - by Pieter
    Below you can see my smb.conf. pieter is my admin user; read/write on the shares works fine with that account. I also have a leecher account, added to the Samba users with smbpasswd -a leecher, which is set up so that it only has read access to the shares. This works on MegaSam and on Thumbnails, but not on my other drives: leecher gets no access at all to the other shares.

        [global]
        security = user

        [MegaSam]
        comment = MegaSam
        path = /media/MegaSam
        browsable = yes
        guest ok = no
        read list = leecher
        write list = pieter
        create mask = 0755

        [SilentBob]
        comment = SilentBob
        path = /media/SilentBob
        browsable = yes
        guest ok = no
        read list = leecher
        write list = pieter
        create mask = 0755

        [Thumbnails]
        comment = Thumbnails
        path = /media/Thumbnails
        browsable = yes
        guest ok = no
        read list = leecher
        write list = pieter
        create mask = 0755

        [Downloads]
        comment = Downloads
        path = /media/Downloads
        browsable = yes
        guest ok = no
        read list = leecher
        write list = pieter
        create mask = 0755
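    A hedged thing to check, not from the thread: since all four share definitions are identical, the difference is probably at the filesystem level rather than in smb.conf, because a Samba read list cannot grant more access than the underlying directory permissions allow the leecher user. A quick sketch, with paths copied from the question and the chmod shown only as an example:

        # compare permissions on the mount points of the working and failing shares
        ls -ld /media/MegaSam /media/Thumbnails /media/SilentBob /media/Downloads
        # if the failing ones deny "others" read/execute, open them up (example only)
        sudo chmod o+rx /media/SilentBob /media/Downloads
        # validate the config and reload Samba (the service name may differ by release)
        testparm -s
        sudo service smbd reload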

    Read the article

  • ubuntu automount: only mounting drives as root?

    - by glisignoli
    I'm sharing the /mount dir with smb so users on my network can access drives added to my Linux box. Users are able to read files but not write, modify, or delete files or directories. I'm using Ubuntu 10.04 Server Edition with halevt installed for USB auto-mounting. AFAIK halevt is auto-mounting the drives to /media/, but the drives are showing up as:

        drwxrwxr-x 1 root root 20480 2010-12-29 20:40 disk
        drwxrwxr-x 1 root root 24576 2010-12-21 17:20 Sparta

    mount gives me:

        /dev/sda1 on /boot type ext2 (rw)
        /dev/sdb1 on /media/disk type fuseblk (rw,nosuid,nodev,sync,allow_other,blksize=4096,default_permissions)
        /dev/sdc1 on /media/Sparta type fuseblk (rw,nosuid,nodev,sync,allow_other,blksize=4096,default_permissions)

    When I umount the drives, the folders /media/disk and /media/Sparta are both removed. I tried changing the permissions with chown to nobody:nogroup, but it doesn't work (which I assume is because they are NTFS drives).
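    A hedged pointer, not from the thread: chown has no effect here because NTFS volumes mounted through ntfs-3g (shown as fuseblk) take their ownership and permissions from mount options, not from the filesystem. A sketch of remounting one drive with user-writable options; the uid/gid value 1000 is an assumption for the account doing the sharing:

        # ownership/permissions for ntfs-3g come from mount options, not chown
        sudo umount /media/disk
        sudo mount -t ntfs-3g -o uid=1000,gid=1000,umask=002 /dev/sdb1 /media/disk

    Making halevt apply the same options at auto-mount time would need the equivalent options in its configuration, or an /etc/fstab entry for the drive.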

    Read the article

  • Ubuntu confuses my partitions

    - by Diego
    I have 3 relevant partitions split between 2 disks:

        sda2: Windows partition
        sda3: Ubuntu partition
        sdb1: Data partition

    I was using pysdm to add a label to my partitions and somehow I seem to have screwed up my installation. Now, every time I access the Data partition mounted in /media/Data, I see the files in my Windows partition, and vice versa. I've tried unmounting and remounting correctly to no avail; it seems that wherever I mount sda2, if I access that folder I get the files in sdb1, and vice versa. Does anyone know what may have happened and how to solve this?

    Update: this is the output of blkid:

        /dev/sda1: LABEL="System Reserved" UUID="C62603F02603E073" TYPE="ntfs"
        /dev/sda2: LABEL="Windows" UUID="00A6D498A6D49010" TYPE="ntfs"
        /dev/sda5: UUID="033cac3b-6f77-4f09-a629-495dc866866a" TYPE="ext4"
        /dev/sdb1: LABEL="Data" UUID="BCD83AE3D83A9B98" TYPE="ntfs"

    These are the contents of my fstab file:

        UUID=033cac3b-6f77-4f09-a629-495dc866866a / ext4 errors=remount-ro,user_xattr 0 1
        /dev/sda1 /media/Boot_old ntfs defaults 0 0
        /dev/sda2 /media/Windows  ntfs defaults 0 0
        /dev/sdb1 /media/Data     ntfs nls=iso8859-1,ro,users,umask=000 0 0
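    A hedged suggestion, not from the thread: since the blkid output already shows the correct labels and UUIDs, switching the NTFS entries in /etc/fstab from device names to UUIDs removes any dependence on device ordering (or on whatever pysdm rewrote). A sketch built from the UUIDs above, leaving the root entry untouched; back up /etc/fstab before editing:

        UUID=C62603F02603E073 /media/Boot_old ntfs defaults 0 0
        UUID=00A6D498A6D49010 /media/Windows  ntfs defaults 0 0
        UUID=BCD83AE3D83A9B98 /media/Data     ntfs nls=iso8859-1,ro,users,umask=000 0 0

    Then unmount the affected mount points and run sudo mount -a to re-mount them from the updated file.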

    Read the article

  • PartCover code details

    - by user329814
    I am using PartCover 2.2/2.3 (trying with both) on Windows 7 x64. After generating the report and selecting "view coverage details", I can see the code coverage for each method. When I click on a method, I see a list on the right with block, block length, visit count, and has source (set to yes). However, it doesn't show the highlighted source code like the example here: http://www.csharpcity.com/using-partcover-and-nunit-for-code-coverage/. I haven't changed anything; everything is at its defaults. Can you tell me how I can see the covered code? Thank you

    Read the article

  • NTFS partitions mount as root instead of user as set in /etc/fstab

    - by G1bs0n
    I recently upgraded a server to Ubuntu 12.04 with a fresh install, and my NTFS partitions won't mount as my user at boot, but I can mount them as my user manually from the console with $ sudo mount -a. Running ntfsfix reports no problems, and chkdsk sees nothing wrong under Windows 7. Are the drives not ready to be mounted at boot and defaulting to root instead of my user for some reason? Here is my /etc/fstab:

        UUID=E4E6B30CE6B2DDCC /media/Bowles       ntfs-3g defaults,uid=1000,gid=1000,umask=022 0 0
        UUID=A040C42340C3FDD2 /media/Burroughs    ntfs-3g defaults,uid=1000,gid=1000,umask=022 0 0
        UUID=EA022C73022C46C3 /media/DoctorGonzo  ntfs-3g defaults,uid=1000,gid=1000,umask=022 0 0
        UUID=BA425A384259FA19 /media/Geist        ntfs-3g defaults,uid=1000,gid=1000,umask=022 0 0
        UUID=E87CFAE57CFAAE06 /media/DouglasAdams ntfs-3g defaults,uid=1000,gid=1000,umask=022 0 0

    Here is the output of ls -l after boot:

        drwxr-xr-x 1 xbmc xbmc   4096 Oct 31 21:46 Bowles
        drwxrwxr-x 1 root users  8192 Oct 31 21:46 Burroughs
        drwxrwxr-x 1 root users  4096 Oct 28 21:45 DoctorGonzo
        drwxrwxr-x 1 root users 12288 Oct 31 19:56 DouglasAdams
        drwxrwxr-x 1 root users  4096 Nov 3 01:03 Geist

    If I unmount and mount again with $ sudo mount -a from the console, the output of ls -l is:

        drwxr-xr-x 1 xbmc xbmc  4096 Oct 31 21:46 Bowles
        drwxr-xr-x 1 xbmc xbmc  8192 Oct 31 21:46 Burroughs
        drwxr-xr-x 1 xbmc xbmc  4096 Oct 28 21:45 DoctorGonzo
        drwxr-xr-x 1 xbmc xbmc 12288 Oct 31 19:56 DouglasAdams
        drwxr-xr-x 1 xbmc xbmc  4096 Nov 3 01:03 Geist

    Update: I was fooling myself. I had a custom udev rule set up to auto-mount filesystems by label for USB drives, borrowed from here, but I didn't update the rule to accommodate my additional hard drives. Updating the rule to auto-mount only drives after /dev/sde solved my problem. Thank you again for your reply, cartoonist.

    Read the article

  • StorageTek SL8500 Release 8.3 available

    - by uwes
    Boosting Performance and Enhancing Reliability with StorageTek SL8500 Release 8.3! We're pleased to announce the availability of SL8500 8.3 firmware, which supports partitioning for library complexes, library media validation, drive tray serial number reporting, and StorageTek T10000D tape drives. StorageTek SL8500 8.3 supports the following:

        Library Complex Partitioning:
          - Provides support for partitioning across an SL8500 library complex
          - Supports up to 16 partitions per library complex

        Library Media Validation:
          - Using the StorageTek Library Console, users can initiate media verification with StorageTek T10000C/D tape drives on StorageTek T10000 T1 and T2 media
          - Supports 3 scan options: basic verify, standard verify, and complete verify

    Please read the Sales Bulletin (Firmware release 8.31) on Oracle HW TRC for more details. (If you are not registered on Oracle HW TRC, click here ... and follow the instructions.) For more information go to: Oracle.com Tape Page, Oracle Technology Network Tape Page.

    Read the article

  • Identify Codecs & Technical Information About Video Files

    - by DigitalGeekery
    Have you ever wanted to play an audio or video file but didn't have the proper codec installed? Today we'll show you how to determine codecs, along with a host of other technical details about your media files, with MediaInfo.

    Installation

    Download and install MediaInfo. You can find the download link at the bottom of the page. Note: when installing MediaInfo there is a recommended software bundle, which you can opt out of by selecting the Do not install option. Each recommended software choice may be different; in this example it offers Spyware Terminator. The cool thing, though, is that the installer uses Open Candy, which opts you out of the install. Just double-check to make sure you're not installing extra crapware.

    Using MediaInfo

    The first time you run MediaInfo it will display the Preferences window. There are various options, such as language, output format, and whether or not you want MediaInfo to check for new versions. Click OK. Select a file or folder to analyze by clicking on the File or Folder icons on the left of the application window or by selecting File > Open from the menu. You can also drag and drop a file directly onto the application. MediaInfo will display details of your media file.

    In Basic view, you'll see basic information. Notice in the example below the video and audio codecs, along with the file size, running time of the media file, and even the application used to create the video file (Writing application). You can switch to some of the other views by selecting View from the menu and choosing from the dropdown list.

    Sheet view presents the information a bit more clearly. You can see in the example below that the video and audio codecs are listed in clearly identified columns. (AVC is often more commonly referred to as H.264.)

    Tree view is perhaps the most detailed. You can see from the example below that the codec used for this AVI file is XviD. Scrolling down even further you'll see additional information like video and audio bit rates, frame rate, aspect ratio, and more.

    In Basic view (and also in Sheet view) you can click to find a player for your file. In this instance, with an MP4 file, it took me to the download page for QuickTime. This is by no means the only media player for this file, but if you are stuck on how to play a media file, this will forward you to a solution that works. You can do the same thing with the video codec: click "Go to the web site of this video codec" to find a download.

    MediaInfo is a simple but powerful tool that can be used to discover the details of a media file, or just to find a compatible codec. It works with almost any video file type and is available for Windows, Mac, and Linux. Some Mac and Linux versions, however, are currently command line only.
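    For the command-line builds mentioned above, here is a brief hedged sketch; the file name is just an example:

        # full report for a file
        mediainfo video.mkv
        # or pull out a few specific fields with an output template
        mediainfo --Inform="Video;%Format% %Width%x%Height% %FrameRate% fps" video.mkv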
    Download MediaInfo

    Read the article

  • Where did my free space go?

    - by Ari B. Friedman
    I have a storage drive (2 TB) and an OS drive (90 GB SSD). I've run out of space on the OS drive:

        /$ df -h
        Filesystem   Size  Used Avail Use% Mounted on
        /dev/sdb1     72G   72G     0 100% /
        udev         5.9G   12K  5.9G   1% /dev
        tmpfs        2.4G  1.2M  2.4G   1% /run
        none         5.0M     0  5.0M   0% /run/lock
        none         5.9G  428K  5.9G   1% /run/shm
        /dev/sda1    1.9T  639G  1.2T  37% /media/StorageDrive

    So be it. But when I attempt to figure out where the space has gone, I cannot find anything remotely approaching the capacity of the drive:

        /$ sudo du -h -d 1
        du: cannot access `./media/StorageDrive/home/ari/.gvfs': Permission denied
        675G    ./media
        2.3G    ./var
        0       ./proc
        7.0M    ./tmp
        27M     ./boot
        4.0K    ./lib64
        12K     ./dev
        44M     ./home
        16K     ./lost+found
        8.0M    ./sbin
        223M    ./lib
        4.0K    ./selinux
        1.4M    ./run
        140K    ./root
        8.8M    ./bin
        4.0K    ./mnt
        38M     ./etc
        8.0K    ./srv
        4.8G    ./usr
        65M     ./opt
        0       ./sys
        682G    .

    Note that the difference between the total (682G) and the mounted drives in /media (675G) is only about 9G. How are 72G being used? Where is this dark matter hiding?
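    A hedged way to narrow this down, not from the thread: keep du on the root filesystem so the 675G under /media isn't counted, then check for space held by files that were deleted while still open, or by files hidden underneath a mount point:

        # -x stays on one filesystem, so /media and the other mounts are excluded
        sudo du -xh --max-depth=1 / | sort -h
        # deleted-but-still-open files keep their blocks until the owning process exits
        sudo lsof +L1
        # files written to /media/StorageDrive while the drive was NOT mounted are hidden
        # under the mount point; they reappear if the drive is unmounted first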

    Read the article

  • Google I/O 2010 - Advanced Android audio techniques

    Google I/O 2010 - Advanced Android audio techniques (Android 301, Dave Sparks). In this session, we will explore advanced techniques that you can employ in your apps when working with media. This includes using Android's low-level audio APIs, selecting the appropriate format for your media files, and what's now possible using the new media framework APIs introduced in Android 2.2. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 57:16.

    Read the article

  • Obtaining positional information in the IEnumerable Select extension method

    - by Kyle Burns
    This blog entry is intended to provide a narrow and brief look into a way to use the Select extension method that I had until recently overlooked. Every developer who is using IEnumerable extension methods to work with data has been exposed to the Select extension method, because it is a pretty critical piece of almost every query over a collection of objects. The method is defined on the IEnumerable type and takes as its argument a function that accepts an item from the collection and returns an object which will be an item within the returned collection. This allows you to perform transformations on the source collection. A somewhat contrived example would be the following code, which transforms a collection of strings into a collection of anonymous objects:

        var media = new[] { "book", "cd", "tape" };
        var transformed = media.Select(item => new
        {
            Media = item
        });

    This code transforms the array of strings into a collection of objects which each have a string property called Media. If every developer using the LINQ extension methods already knows this, why am I blogging about it? I'm blogging about it because the method has another overload that I hadn't seen before I needed it a few weeks back, and I thought I would share a little about it with whoever happens upon my blog. In the first overload, the function is defined as:

        Func<TSource, TResult>

    In the other overload it is instead defined as:

        Func<TSource, int, TResult>

    The additional parameter is an integer representing the current element's position in the enumerable sequence. I used this information in what I thought was a pretty cool way to compare collections, and I'll probably blog about that sometime in the near future, but for now we'll continue with the contrived example I've already started to keep things simple and show how this works. The following code sample shows how the positional information could be used in an alternating color scenario. I'm using a foreach loop because IEnumerable doesn't have a ForEach extension, but many libraries do add the ForEach extension to IEnumerable, so you can update the code if you're using one of these libraries or have created your own.

        var media = new[] { "book", "cd", "tape" };
        foreach (var result in media.Select(
            (item, index) => new { Item = item, Index = index }))
        {
            Console.ForegroundColor = result.Index % 2 == 0
                ? ConsoleColor.Blue : ConsoleColor.Yellow;
            Console.WriteLine(result.Item);
        }
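    A small follow-up sketch, not from the original post, showing another common use of the indexed overload: finding the position of the first element that matches a condition.

        // requires: using System.Linq;
        var media = new[] { "book", "cd", "tape" };
        var indexOfCd = media
            .Select((item, index) => new { item, index })
            .First(x => x.item == "cd")
            .index;                        // 1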

    Read the article

  • Recovering data from /

    - by Abhijit Gavas
    I accidentally installed Ubuntu to one of my data drives from Windows. The drive was an NTFS drive and contained about 80 GB of important data; the size of the drive is 110 GB, and its new file system is ext4. In an attempt to recover the data, I downloaded foremost and tried the following commands (sda7 is the drive in question):

        foremost -i / -o /media/281C8DB01C8D7998/Recovery/ -T -v
        foremost -i /dev/sda7 -o /media/281C8DB01C8D7998/Recovery/ -T -v

    It appears that with either command, foremost gets stuck reading some file. Here is the console output:

        abhi@abi-PC:/dev$ foremost -i /dev/sda7 -o /media/281C8DB01C8D7998/Recovery/ -T -v
        Foremost version 1.5.7 by Jesse Kornblum, Kris Kendall, and Nick Mikus
        Audit File
        Foremost started at Fri Sep 28 20:58:00 2012
        Invocation: foremost -i /dev/sda7 -o /media/281C8DB01C8D7998/Recovery/ -T -v
        Output directory: /media/281C8DB01C8D7998/Recovery_Fri_Sep_28_20_58_00_2012
        Configuration file: /etc/foremost.conf
        Processing: stdin
        |------------------------------------------------------------------
        File: stdin
        Start: Fri Sep 28 20:58:00 2012
        Length: Unknown
        Num  Name (bs=512)  Size  File Offset  Comment
        Killed

    As you can see, I have to kill it from the System Monitor. This approach does not seem to be working. What else could I try to recover the files? Please help. The files are very important and I will be devastated if I cannot recover them.
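    A hedged sketch of what else could be tried, not from the thread, and with no guarantee of success since the new ext4 filesystem has already overwritten part of the old NTFS volume: carve only the file types actually needed (scanning every built-in type with -T is very slow), and try PhotoRec, another widely used file carver, as a second opinion. The output directory is copied from the question and must stay on a different drive:

        # carve only selected types from the partition (the source is read-only)
        sudo foremost -t jpg,png,doc,pdf,zip -i /dev/sda7 -o /media/281C8DB01C8D7998/Recovery/ -v
        # PhotoRec ships in the testdisk package and recognizes many more formats
        sudo apt-get install testdisk
        sudo photorec /dev/sda7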

    Read the article
