Search Results

Search found 9688 results on 388 pages for 'external hdd'.


  • iTunes for Ubuntu Studio

    - by soundblastdj
    I have finally gotten my old Mac HDD sorted out, and now I would like to know if anybody has either: a) a way to run iTunes without Wine, as it did not work out well for me the last time I tried it, or b) any other media player that will sync with an iPod and, more importantly, use the same file system. When my Mac died, I started to get into open source. I bought a MacBook Air, only out of necessity. For almost two years now, I have not once backed up or synced my iPod. I am getting nervous that it may give up on its life soon and would like to find a solution. I don't have enough room on my Air, and it would just erase my iPod anyway... Another thing that I am having trouble with is the way iTunes arranged the music. Right now it is arranged by artist, then album, then song, and I would like to have a media library, but somewhere around 400GB of music is a lot to sift through (I have attempted in the past). Thus I am looking for something that will use the same library format. A side note: As I was writing this I started to wonder; is a Hackintosh in order here? If somebody will give me instructions on how to install Mac OS X for free (maybe Mavericks?) in a dual boot with Ubuntu, I will be ever grateful. :) Thanks, soundblastdj

    Read the article

  • Google PageSpeed, optimizing Google's own elements

    - by mowgli
    I'm trying Google's PageSpeed online service. Ironically, it's primarily highlighting Google's own services as something that needs improvement on my site. 1) jQuery from Google: blocking. So I moved all JavaScript from <head> to the end of the document before </body>. That helped. 2) Linking to external Google Font CSS (in <head>): blocking. But the font is critical to the design of the page and should load before much else. 3) Google Analytics: caching is not good. (Google has set it internally to a 2-hour expiration.) I don't know how to change this (this is also placed at the bottom of the page). The Google Font is highlighted as a big priority to change. How can I fix this? Where/how should I call the font?
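
    One common way to keep the font CSS from blocking rendering is Google's asynchronous Web Font Loader; a minimal sketch is below (the 'Open Sans' family is only a placeholder for whatever font the site actually uses):

        <script>
          // Load the Google Font CSS asynchronously so it does not block first render.
          WebFontConfig = { google: { families: ['Open Sans'] } };
          (function() {
            var wf = document.createElement('script');
            wf.src = 'https://ajax.googleapis.com/ajax/libs/webfont/1/webfont.js';
            wf.async = true;
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(wf, s);
          })();
        </script>

    The trade-off is a possible flash of unstyled text while the font arrives, which is usually preferable to delaying the whole page.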

    Read the article

  • Using Unity – Part 3

    - by nmarun
    The previous blog was about registering and invoking different types dynamically. In this one I’d like to show how Unity manages/disposes the instances – say hello to Lifetime Managers. When a type gets registered, either through the config file or when the RegisterType method is explicitly called, the default behavior is that the container uses a transient lifetime manager. In other words, the Unity container creates a new instance of the type when the Resolve or ResolveAll method is called. In contrast, when you register an existing object using the RegisterInstance method, the container uses a container-controlled lifetime manager – a singleton pattern. It does this by storing the reference of the object, and that means that as long as the container is ‘alive’, your registered instance does not go out of scope and will be disposed only when the container goes out of scope or the code explicitly disposes the container. Let’s see how we can use these and test if something is a singleton or a transient instance. Continuing on the same solution used in the previous blogs, I have made the following changes: The first is to add a typeAlias element for the TransientLifetimeManager type: 1: <typeAlias alias="transient" type="Microsoft.Practices.Unity.TransientLifetimeManager, Microsoft.Practices.Unity"/> You then need to specify which type(s) you want to be transient by nature: 1: <type type="IProduct" mapTo="Product2"> 2: <lifetime type="transient" /> 3: </type> 4: <!--<type type="IProduct" mapTo="Product2" />--> The lifetime element’s type attribute matches the alias attribute of the typeAlias element. Now since ‘transient’ is the default behavior, you can use the more concise version shown on line 4. Also note that I’ve changed the mapTo attribute from ‘Product’ to ‘Product2’. I’ve done this to help illustrate the transient nature of the instance of the type Product2. By making this change, you are basically saying that when a type of IProduct needs to be resolved, Unity should create an instance of Product2 by default. 1: public string WriteProductDetails() 2: { 3: return string.Format("Name: {0}<br/>Category: {1}<br/>Mfg Date: {2}<br/>Hash Code: {3}", 4: Name, Category, MfgDate.ToString("MM/dd/yyyy hh:mm:ss tt"), GetHashCode()); 5: } Again, the above change is purely for the purpose of making the example easier to understand. The display will show the full date and also the hash code of the current instance. The GetHashCode() method returns an integer for each instance – a different integer for every instance. When you run the application, you’ll see something like the below: Now when you click on the ‘Get Product2 Instance’ button, you’ll see that the Mfg Date (which is set in the constructor) and the Hash Code are different from the ones created on page load. This proves to us that a new instance is created every single time. To make this a singleton, we need to add a type alias for the ContainerControlledLifetimeManager class and then change the type attribute of the lifetime element to singleton. 1: <typeAlias alias="singleton" type="Microsoft.Practices.Unity.ContainerControlledLifetimeManager, Microsoft.Practices.Unity"/> 2: ... 
3: <type type="IProduct" mapTo="Product2"> 4: <lifetime type="singleton" /> 5: </type> Running the application now gets me the following output: Click on the button below and you’ll see that the Mfg Date and the Hash code remain unchanged => the unity container is storing the reference the first time it is created and then returns the same instance every time the type needs to be resolved. Digging more deeper into this, Unity provides more than the two lifetime managers. ExternallyControlledLifetimeManager – maintains a weak reference to type mappings and instances. Unity returns the same instance as long as the some code is holding a strong reference to this instance. For this, you need: 1: <typeAlias alias="external" type="Microsoft.Practices.Unity.ExternallyControlledLifetimeManager, Microsoft.Practices.Unity"/> 2: ... 3: <type type="IProduct" mapTo="Product2"> 4: <lifetime type="external" /> 5: </type> PerThreadLifetimeManager – Unity returns a unique instance of an object for each thread – so this effectively is a singleton behavior on a  per-thread basis. 1: <typeAlias alias="perThread" type="Microsoft.Practices.Unity.PerThreadLifetimeManager, Microsoft.Practices.Unity"/> 2: ... 3: <type type="IProduct" mapTo="Product2"> 4: <lifetime type="perThread" /> 5: </type> One thing to note about this is that if you use RegisterInstance method to register an existing object, this instance will be returned for every thread, making this a purely singleton behavior. Needless to say, this type of lifetime management is useful in multi-threaded applications (duh!!). I hope this blog provided some basics on lifetime management of objects resolved in Unity and in the next blog, I’ll talk about Injection. Please see the code used here.

    Read the article

  • UCM 11g is 4 days old!

    - by kyle.hatlestad
    Ok...so I missed posting a blog entry when UCM 11g and the entire ECM suite released on Tuesday. Hopefully you've already seen the announcements on any number of the Oracle ECM blogs out there such as ECM Alerts, Fusion ECM, bex huff, or C4. So I won't bore you with the same talking points like 179 million check-ins per day or 124 web site page hits per second. Instead, I thought I'd show some screenshots of the new features in UCM and URM 11g. WebLogic Server and Enterprise Manager So probably the biggest change in 11g is UCM and URM now run on top of the WebLogic Server application server. This is a huge step as ECM is now on a standard platform with the rest of Oracle Fusion Middleware which makes installation, configuration, and integration consistent among all the products. From a feature perspective, it's also beneficial because it's now integrated with Oracle Enterprise Manager. Enterprise Manager provides a lot of provisioning control over servers as well as performance monitoring and access to logs and debugging information. Desktop Integration Suite Desktop Integration Suite got a complete overhaul for 11g. It exposes a lot more features within Windows Explorer such as saved searches, workflow queue, and checked-out items. It also now support metadata pop-up screens to let users fill in additional metadata when they drag-n-drop files in! And the integration within Office applications has changed significantly by introducing a dedicated UCM menu to do open, save, compare, etc. Site Studio for External Applications In UCM Site Studio 10gR4, a major architectural shift was introduced which brought several new objects such as elements, region definitions, region templates, and placeholder definitions. This truly separated the content from the display and from the definition. It also allowed separation of the content from needing to be rendered on a complete Site Studio page. Well, the new Site Studio for External Applications takes advantage of that architecture and introduces pre-built tags and plug-ins to JDeveloper to allow to go from simply adding a content area to your web application page to building an entire web site, just like you would have done in Site Studio Designer. In addition to these changes, enhancements to the core Site Studio have been added as well. One of the big ones is called Designer Mode which allows power-users to bypass the standard rules defined by the placeholder definition or template and perform any number of additional actions. This reduces the need to go back to Site Studio Designer or JDeveloper to make more advanced changes to the site. Dashboards As part of the updated records management functionality in both UCM and URM, users can now set a dashboard view on their home page to surface common functions in a single view. It has pre-built "portlets" users can choose from to display and organize they way they want. Behind the scenes, these dashboards are stored as Content Folios. So the dashboards themselves are content items that can be revisioned and shared between users. And new dashboard portlets can be easily added (like the User Profile one in the screenshots) by getting a copy of an existing one, modifying the display, and then checking it in as a new one to select from. URM Interface Enhancements URM includes several new UI and usability enhancements in 11g. There is a new view for physical records, a place to configure "favorite" items to quickly get to, and new placement of the records management menu. 
BI Publisher Reports Records management in UCM and URM now offer reports generated through embedded BI Publisher. Templates are controlled by rich text files checked directly into the repository, so they can be easily modified. Other Features A new Inbound Refinery conversion option is available that does native Microsoft Office HTML conversion. If your IBR is on Windows and you have the native applications loaded, the IBR can use them to produce HTML. A new GUI template editor for Dynamic Converter is available. It's written in Java so is available through all the supported browsers and platforms. The original ActiveX based editor is also still available. The Component Manager interface has changed to help provide an easier and more descriptive way to enable core components that are installed along with UCM. All of the supported components are immediately available to turn on and do not have to be installed separately as in previous versions. My Downloads is located in the My Content Server menu and provides for easy download of client installs including Desktop Integration Suite and Site Studio Designer. Well, hopefully that gives you a taste for some of the new things in 11g. We're all pretty excited here at Oracle about all the new changes and enhancements. Over the next few months I hope to highlight some of these features more in-depth, so keep your eye out for those posts.

    Read the article

  • Sensitive Data Storage - Best Practices

    - by Kenneth
    I recently started working on a personal project where I was connecting to a database using Java. This got me thinking. I have to provide the login information for a database account on the DB server in order to access the database. But if I hard-code it, then it would be possible for someone to decompile the program and extract that login info. If I store it in an external setup file, then the same problem exists, only it would be even easier for them to get it. I could encrypt the data before storing it in either place, but it seems like that's not really fail-safe either, and I'm no encryption expert by any means. So what are some best practices for storing sensitive setup data for a program?
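
    One widely used practice is to keep the credentials out of the code and the repository entirely and supply them through the environment (or a protected config file) at run time; here is a minimal sketch in Java, with placeholder variable names (DB_URL, DB_USER, DB_PASSWORD are illustrative, not a standard):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.SQLException;

        public class Db {
            // Reads connection details from environment variables set on the server,
            // so nothing sensitive is compiled into the jar or checked into source control.
            public static Connection open() throws SQLException {
                String url  = System.getenv("DB_URL");      // e.g. jdbc:postgresql://host/db
                String user = System.getenv("DB_USER");
                String pass = System.getenv("DB_PASSWORD");
                if (url == null || user == null || pass == null) {
                    throw new IllegalStateException("Database credentials not configured");
                }
                return DriverManager.getConnection(url, user, pass);
            }
        }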

    Read the article

  • 8 Backup Tools Explained for Windows 7 and 8

    - by Chris Hoffman
    Backups on Windows can be confusing. Whether you’re using Windows 7 or 8, you have quite a few integrated backup tools to think about. Windows 8 made quite a few changes, too. You can also use third-party backup software, whether you want to back up to an external drive or back up your files to online storage. We won’t cover third-party tools here — just the ones built into Windows. Backup and Restore on Windows 7 Windows 7 has its own Backup and Restore feature that lets you create backups manually or on a schedule. You’ll find it under Backup and Restore in the Control Panel. The original version of Windows 8 still contained this tool, and named it Windows 7 File Recovery. This allowed former Windows 7 users to restore files from those old Windows 7 backups or keep using the familiar backup tool for a little while. Windows 7 File Recovery was removed in Windows 8.1. System Restore System Restore on both Windows 7 and 8 functions as a sort of automatic system backup feature. It creates backup copies of important system and program files on a schedule or when you perform certain tasks, such as installing a hardware driver. If system files become corrupted or your computer’s software becomes unstable, you can use System Restore to restore your system and program files from a System Restore point. This isn’t a way to back up your personal files. It’s more of a troubleshooting feature that uses backups to restore your system to its previous working state. Previous Versions on Windows 7 Windows 7′s Previous Versions feature allows you to restore older versions of files — or deleted files. These files can come from backups created with Windows 7′s Backup and Restore feature, but they can also come from System Restore points. When Windows 7 creates a System Restore point, it will sometimes contain your personal files. Previous Versions allows you to extract these personal files from restore points. This only applies to Windows 7. On Windows 8, System Restore won’t create backup copies of your personal files. The Previous Versions feature was removed on Windows 8. File History Windows 8 replaced Windows 7′s backup tools with File History, although this feature isn’t enabled by default. File History is designed to be a simple, easy way to create backups of your data files on an external drive or network location. File History replaces both Windows 7′s Backup and Previous Versions features. Windows System Restore won’t create copies of personal files on Windows 8. This means you can’t actually recover older versions of files until you enable File History yourself — it isn’t enabled by default. System Image Backups Windows also allows you to create system image backups. These are backup images of your entire operating system, including your system files, installed programs, and personal files. This feature was included in both Windows 7 and Windows 8, but it was hidden in the preview versions of Windows 8.1. After many user complaints, it was restored and is still available in the final version of Windows 8.1 — click System Image Backup on the File History Control Panel. Storage Space Mirroring Windows 8′s Storage Spaces feature allows you to set up RAID-like features in software. For example, you can use Storage Space to set up two hard disks of the same size in a mirroring configuration. They’ll appear as a single drive in Windows. When you write to this virtual drive, the files will be saved to both physical drives. If one drive fails, your files will still be available on the other drive. 
This isn’t a good long-term backup solution, but it is a way of ensuring you won’t lose important files if a single drive fails. Microsoft Account Settings Backup Windows 8 and 8.1 allow you to back up a variety of system settings — including personalization, desktop, and input settings. If you’re signing in with a Microsoft account, OneDrive settings backup is enabled automatically. This feature can be controlled under OneDrive > Sync settings in the PC settings app. This feature only backs up a few settings. It’s really more of a way to sync settings between devices. OneDrive Cloud Storage Microsoft hasn’t been talking much about File History since Windows 8 was released. That’s because they want people to use OneDrive instead. OneDrive — formerly known as SkyDrive — was added to the Windows desktop in Windows 8.1. Save your files here and they’ll be stored online tied to your Microsoft account. You can then sign in on any other computer, smartphone, tablet, or even via the web and access your files. Microsoft wants typical PC users “backing up” their files with OneDrive so they’ll be available on any device. You don’t have to worry about all these features. Just choose a backup strategy to ensure your files are safe if your computer’s hard disk fails you. Whether it’s an integrated backup tool or a third-party backup application, be sure to back up your files.
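
    For reference, a system image backup can also be started from an elevated command prompt with the built-in wbadmin tool; a minimal sketch (the target drive letter is a placeholder):

        rem Back up all critical volumes (the OS image) to drive E: without prompting.
        wbadmin start backup -backupTarget:E: -allCritical -quiet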

    Read the article

  • Java ME SDK 3.2 is now live

    - by SungmoonCho
    Hi everyone, It has been a while since we released the last version. We have been very busy integrating new features and making lots of usability improvements in this new version. The datasheet is available here. Please visit the Java ME SDK 3.2 download page to get the latest and best version yet! Some of the new features in this version are described below. Embedded Application Support: Oracle Java ME SDK 3.2 now supports the new Oracle® Java ME Embedded. This includes support for JSR 228, the Information Module Profile-Next Generation API (IMP-NG). You can test and debug applications either on the built-in device emulators or on your device. Memory Monitor: The Memory Monitor shows memory use as an application runs. It displays a dynamic detailed listing of the memory usage per object in table form, and a graphical representation of the memory use over time. Eclipse IDE Support: Oracle Java ME SDK 3.2 now officially supports the Eclipse IDE. Once you install the Java ME SDK plugins in Eclipse, you can start developing, debugging, and profiling your mobile or embedded application. Skin Creator: With the Custom Device Skin Creator, you can create your own skins. The appearance of the custom skins is generic, but the functionality can be tailored to your own specifications. Here are the release highlights. Implementation and support for the new Oracle® Java Wireless Client 3.2 runtime and the Oracle® Java ME Embedded runtime. The AMS in the CLDC emulators has a new look and new functionality (Install Application, Manage Certificate Authorities and Output Console). Support for JSR 228, the Information Module Profile-Next Generation API (IMP-NG). The IMP-NG platform is implemented as a subset of CLDC. Support includes: A new emulator for headless devices. Javadocs for the following Oracle APIs: Device Access API, Logging API, AMS API, and AccessPoint API. New demos for IMP-NG features can be run on the emulator or on a real device running the Oracle® Java ME Embedded runtime. New Custom Device Skin Creator. This tool provides a way to create and manage custom emulator skins. The skin appearance is generic, but the functionality, such as the JSRs supported or the device properties, is up to you. This utility is only supported in NetBeans. Eclipse plugin for CLDC/MIDP. For the first time Oracle Java ME SDK is available as an Eclipse plugin. The Eclipse version does not support CDC, the Memory Monitor, and the Custom Device Skin Creator in this release. All Java ME tools are implemented as NetBeans plugins. As of this release, the plugin integrates Java ME utilities into the standard NetBeans menus. The Tools > Java ME menu is the place to launch Java ME utilities, including the new Skin Creator. The Profile > Java ME menu is the place to work with the Network Monitor and the Memory Monitor. Use the standard NetBeans tools for debugging. Profiling, network monitoring, and memory monitoring are integrated with the NetBeans profiling tools. New network monitoring protocols are supported in this release: WMA, SIP, Bluetooth and OBEX, SATSA APDU and JCRMI, and server sockets. Java ME SDK Update Center. Oracle Java ME SDK can be updated or extended by new components. The Update Center can download, install, and uninstall plugins specific to the Java ME SDK. A plugin consists of runtime components and skins. Bug fixes and enhancements. This version comes with a few known problems. All of them have workarounds, so I hope you don't get stuck on these issues when you are using the product. 
    You cannot watch static variables during an Eclipse debugging session, and sometimes the Variables view cannot show data. In the source code, move the mouse over the required variable to inspect the variable value. A real device shown in the Device Selector is deleted from the Device Manager, yet it still appears. Kill the device manager in the system tray, and relaunch it. Then you will see the device removed from the list. On-device profiling does not work on a device. CPU profiling, network monitoring, and memory monitoring do not work on the device, since the device runtime does not yet support it. Please do the profiling with your emulator first, and then test your application on the device. In the Device Selector, using Clean Database on a real external device causes a null pointer exception. External devices do not have a database recognized by the SDK, so you can disregard this exception message. Suspending the Emulator during a Memory Monitor session hangs the emulator. Do not use the Suspend option (F5) while the Memory Monitor is running. If the emulator is hung, open the Windows task manager and stop the emulator process (javaw). To switch to another application while the Memory Monitor is running, choose Application > AMS Home (F4), and select a different application. Please let us know how we can make it even better by sending us your feedback. -Java ME SDK Team

    Read the article

  • Difference between LASTDATE and MAX for semi-additive measures in #DAX

    - by Marco Russo (SQLBI)
    I recently wrote an article on SQLBI about the semi-additive measures in DAX. I included the formulas for common calculations, and there is an interesting point that is worth a longer digression: the difference between LASTDATE and MAX (which is similar to FIRSTDATE and MIN – I just describe the former; for the latter just replace the corresponding names). LASTDATE is a DAX function that receives an argument that has to be a date column and returns the last date active in the current filter context. Apparently, it is the same value returned by MAX, which returns the maximum value of the argument in the current filter context. Of course, MAX can receive any numeric type (including date), whereas LASTDATE only accepts a column of type date. But overall, they seem identical in their result. However, the difference is a semantic one. In fact, this expression: LASTDATE ( 'Date'[Date] ) could also be rewritten as: FILTER ( VALUES ( 'Date'[Date] ), 'Date'[Date] = MAX ( 'Date'[Date] ) ) LASTDATE is a function that returns a table with a single column and one row, whereas MAX returns a scalar value. In DAX, any expression with one row and one column can be automatically converted into the corresponding scalar value of the single cell returned. The opposite is not true. So you can use LASTDATE in any expression where a table or a scalar is required, but MAX can be used only where a scalar expression is expected. Since LASTDATE returns a table, you can use it in any expression that expects a table as an argument, such as COUNTROWS. In fact, you can write this expression: COUNTROWS ( LASTDATE ( 'Date'[Date] ) ) which will always return 1 or BLANK (if there are no dates active in the current filter context). You cannot pass MAX as an argument of COUNTROWS. You can pass to LASTDATE a reference to a column or any table expression that returns a column. The following two syntaxes are semantically identical: LASTDATE ( 'Date'[Date] ) LASTDATE ( VALUES ( 'Date'[Date] ) ) The result is the same, and the use of VALUES is not required because it is implicit in the first syntax, unless you have a row context active. In that case, be careful: using the LASTDATE function with a direct column reference in a row context will produce a context transition (the row context is transformed into a filter context) that hides the external filter context, whereas using VALUES in the argument preserves the existing filter context without applying the context transition of the row context (see the LastDate and Values columns in the following query and result). You can use any other table expression (including a FILTER) as the LASTDATE argument. For example, the following expression will always return the last date available in the Date table, regardless of the current filter context: LASTDATE ( ALL ( 'Date'[Date] ) ) The following query recaps the results produced by the different syntaxes described. EVALUATE     CALCULATETABLE(         ADDCOLUMNS(              VALUES ('Date'[Date] ),             "LastDate", LASTDATE( 'Date'[Date] ),             "Values", LASTDATE( VALUES ( 'Date'[Date] ) ),             "Filter", LASTDATE( FILTER ( VALUES ( 'Date'[Date] ), 'Date'[Date] = MAX ( 'Date'[Date] ) ) ),             "All", LASTDATE( ALL ( 'Date'[Date] ) ),             "Max", MAX( 'Date'[Date] )         ),         'Date'[Calendar Year] = 2008     ) ORDER BY 'Date'[Date] The LastDate column repeats the current date, because the context transition happens within the ADDCOLUMNS. 
    The Values column preserves the existing filter context (it is not replaced by the context transition), so the result corresponds to the last day in year 2008 (which is filtered in the external CALCULATETABLE). The Filter column works like the Values one, even though we use the FILTER approach instead of the LASTDATE one. The All column shows the result of LASTDATE ( ALL ( ‘Date’[Date] ) ), which ignores the filter on Calendar Year (in fact the date returned is in year 2010). Finally, the Max column shows the result of the MAX formula, which is the easiest to use; it just doesn’t return a table when you need one (as in a filter argument of CALCULATE or CALCULATETABLE, where using LASTDATE is shorter). I know that using LASTDATE in complex expressions might create some issues. In my experience, the fact that a context transition happens automatically in the presence of a row context is the main source of confusion and unexpected results in DAX formulas using this function. For a reference of DAX formulas using MAX and LASTDATE, read my article about semi-additive measures in DAX.
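
    To connect this back to semi-additive measures, here is a minimal sketch of a closing-balance measure that uses LASTDATE as a filter argument of CALCULATE (the Balances table and Amount column are placeholder names, not from the article):

        Last Balance :=
        CALCULATE (
            SUM ( Balances[Amount] ),          -- sum only the rows...
            LASTDATE ( 'Date'[Date] )          -- ...for the last date visible in the current filter context
        )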

    Read the article

  • I have Ubuntu alongside Windows Vista and I cannot reboot Windows Vista

    - by railguage48
    I cannot get into Windows Vista .... I was working in Vista, then I restarted and booted up Ubuntu, and when I finished in Ubuntu I restarted, this time into Vista, and all I get is the Microsoft box with the vertical stripes running interminably. I ran sudo update-grub; this is the result of that command: generating grub.cfg found linux image: /boot/vmlinuz-3.2.0-24-generic found initrd image: /boot/initrd.img-3.2.0-24-generic found linux image: /boot/vmlinuz-3.0.0-19-generic found initrd image: /boot/initrd.img-3.0.0-19-generic found windows recovery environment (loader) on /dev/sda1 skipping windows recovery environment (loader) on Wubi system found windows vista (loader) on /dev/sda2 skipping windows vista (loader) on wubi system I do have a backup of my Windows environment on an external hard drive and I can get to it through Ubuntu, but I am not sure if I can restore Windows Vista from Ubuntu or even if I need to. Thanks for any help.

    Read the article

  • Project Dashboards

    - by EightyEight
    I'm attempting to create a dashboard so that people not intimately involved with the project can get an indication of the project's health. I'm struggling with determining what to put on said dashboard. I think it needs to be brief to be useful, yet complete. The project I'm working on depends on 3rd-party contractors, external hardware, and of course my team's effort. Are there any suggestions or guidelines on how to encapsulate it all in a relatively easy manner? Mods, I believe this question falls squarely between development methodologies and business concerns as outlined in the FAQ. Thank you!

    Read the article

  • What's a good open source java project for students to hack on?

    - by Evan Grim
    I'm working with a professor to develop a course teaching practical software development tools and methodology. We're looking for a sample code base that we can use for hands-on experience in each of the topics and as the basis for a semester-long project where students will work in a team to implement a feature or fix bugs. Here are some basic guidelines for the project that we'd like to come close to meeting: Java-based, ~50K SLOC, uses Ant, depends upon some external library, has a test suite (preferably JUnit), friendly for development within Eclipse, actively developed with a substantial history available within a version control system (such as Subversion), the more "coolness" factor the better (to motivate the students), and preferably with some kind of user interface (e.g.: not just a library).

    Read the article

  • Minidlna Directory Issues

    - by Somnambulist
    I've done my searching and can't find an answer to THIS specific issue. I have my minidlna set up and running - but it's not really done properly. First off, when I open the server on my bluray player, all of my movies are listed twice - when they are certainly not saved on my external twice. Second, when I open the server - rather than reading "Movies" "TV" "Music", etc - It just mashes all of my movies, tv, and some other folders all together with no real organization. I never had this problem when I had my Windows set up, so I know it's something configured improperly more-so than my external drive giving me gruff. Here's my minidlna.conf file: # This is the configuration file for the MiniDLNA daemon, a DLNA/UPnP-AV media # server. # # Unless otherwise noted, the commented out options show their default value. # # On Debian, you can also refer to the minidlna.conf(5) man page for # documentation about this file. media_dir=/media/somnambulist/Ghost In You # This option can be specified more than once if you want multiple directories # scanned. # # If you want to restrict a media_dir to a specific content type, you can # prepend the directory name with a letter representing the type (A, P or V), # followed by a comma, as so: # * "A" for audio (eg. media_dir=A,/var/lib/minidlna/music) # * "P" for pictures (eg. media_dir=P,/var/lib/minidlna/pictures) # * "V" for video (eg. media_dir=V,/var/lib/minidlna/videos) # # WARNING: After changing this option, you need to rebuild the database. Either # run minidlna with the '-R' option, or delete the 'files.db' file # from the db_dir directory (see below). # On Debian, you can run, as root, 'service minidlna force-reload' instead. #media_dir=/var/lib/minidlna media_dir=V,/media/somnambulist/Ghost In You/Movies media_dir=V,/media/somnambulist/Ghost In You/TV media_dir=P,/home/somnambulist/Pictures # Path to the directory that should hold the database and album art cache. db_dir=/home/somnambulist/serverart # Path to the directory that should hold the log file. log_dir=/home/somnambulist/serverlog # Minimum level of importance of messages to be logged. # Must be one of "off", "fatal", "error", "warn", "info" or "debug". # "off" turns of logging entirely, "fatal" is the highest level of importance # and "debug" the lowest. #log_level=warn # Use a different container as the root of the directory tree presented to # clients. The possible values are: # * "." - standard container # * "B" - "Browse Directory" # * "M" - "Music" # * "P" - "Pictures" # * "V" - "Video" # if you specify "B" and client device is audio-only then "Music/Folders" will be used as root root_container=B # Network interface(s) to bind to (e.g. eth0), comma delimited. #network_interface= # IPv4 address to listen on (e.g. 192.0.2.1). #listening_ip= # Port number for HTTP traffic (descriptions, SOAP, media transfer). port=8200 # URL presented to clients. # The default is the IP address of the server on port 80. #presentation_url=http://example.com:80 # Name that the DLNA server presents to clients. friendly_name=Somnambulist Media Server # Serial number the server reports to clients. serial=12345678 # Model name the server reports to clients. #model_name=Windows Media Connect compatible (MiniDLNA) # Model number the server reports to clients. model_number=1 # Automatic discovery of new files in the media_dir directory. #inotify=yes # List of file names to look for when searching for album art. Names should be # delimited with a forward slash ("/"). 
album_art_names=Cover.jpg/cover.jpg/AlbumArtSmall.jpg/albumartsmall.jpg/AlbumArt.jpg/albumart.jpg/Album.jpg/album.jpg/Folder.jpg/folder.jpg/Thumb.jpg/thumb.jpg # Strictly adhere to DLNA standards. # This allows server-side downscaling of very large JPEG images, which may # decrease JPEG serving performance on (at least) Sony DLNA products. #strict_dlna=no # Support for streaming .jpg and .mp3 files to a TiVo supporting HMO. #enable_tivo=no # Notify interval, in seconds. #notify_interval=895 # Path to the MiniSSDPd socket, for MiniSSDPd support. #minissdpdsocket=/run/minissdpd.sock` And here's the error I get in terminal when I run: sudo service minidlna restart sudo service minidlna force-reload Force restart error: Restarting DLNA/UPnP-AV media server minidlna [2013/08/12 21:19:27] minidlna.c:474: error: Media directory "/media/somnambulist/Ghost In You/Movies" not accessible! [Permission denied] [2013/08/12 21:19:27] minidlna.c:474: error: Media directory "/media/somnambulist/Ghost In You/TV" not accessible! [Permission denied] Force-reload error: Restarting DLNA/UPnP-AV media server minidlna [2013/08/12 21:19:46] minidlna.c:474: error: Media directory "/media/somnambulist/Ghost In You/Movies" not accessible! [Permission denied] [2013/08/12 21:19:46] minidlna.c:474: error: Media directory "/media/somnambulist/Ghost In You/TV" not accessible! [Permission denied] rm: cannot remove ‘/home/somnambulist/serverart/files.db’: Permission denied rm: cannot remove ‘/home/somnambulist/serverart/art_cache/media/somnambulist/Ghost In You/Movies/Slumdog Millionaire/Slumdog.Millionaire.Cover.jpg’: Permission denied rm: cannot remove ‘/home/somnambulist/serverart/art_cache/media/somnambulist/Ghost In You/Movies/Zack and Miri Make a Porno/ZackAndMiriMakeAPornoCover.jpg’: Permission denied [2013/08/12 21:19:46] minidlna.c:744: warn: Failed to clean old file cache. [ OK ] I've spent hours on this at this point, read through various files - and even had a friend who is relatively Ubuntu-savvy try to help me via chat - no such luck. Thanks in advance for any help.
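
    The "Permission denied" errors suggest the minidlna service account cannot read the media folders and cannot write to the db/log directories it was pointed at. Below is a minimal sketch of one way to fix that, assuming the external drive uses a Linux filesystem with ACL support and minidlna runs as the default minidlna user (both assumptions are worth verifying first):

        # Let the minidlna user traverse the mount point and read the media folders.
        sudo setfacl -m u:minidlna:rx /media/somnambulist
        sudo setfacl -R -m u:minidlna:rX "/media/somnambulist/Ghost In You"

        # Give minidlna ownership of its database and log directories.
        sudo chown -R minidlna:minidlna /home/somnambulist/serverart /home/somnambulist/serverlog

        # Rebuild the database (this should also clear the duplicate entries).
        sudo service minidlna force-reload

    If the drive is NTFS or FAT, permissions are set at mount time instead, so the mount options (or the mount point itself) would need adjusting rather than running chmod/setfacl.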

    Read the article

  • Regular Expressions. Remember it, write it, test it.

    - by outcoldman
    I should say that I’m a fan of regular expressions. Whenever I see a problem I can solve with a regex, I feel a burning desire to do it and to write a new test for the new regex. Previously I had installed SharpDevelop Studio just for the good regular expression tool in it (why doesn’t VS have one?). But now I’m a little wiser, and for each regex I write a separate test. I find it difficult to remember the syntax of regular expressions (I don’t write them very often); I always forget which character is responsible for the beginning of the line, etc. So I use small, easy external articles like this “Regular expressions - An introduction”. Now I want to show you a few small samples of regular expressions and how to test them. Read more... (redirect to http://outcoldman.ru)
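
    As an illustration of the "one test per regex" habit described above, here is a minimal sketch of such a test in C# (NUnit is assumed; the pattern and inputs are purely illustrative, not from the article):

        using System.Text.RegularExpressions;
        using NUnit.Framework;

        [TestFixture]
        public class EmailRegexTests
        {
            // A deliberately simple pattern used only to show the testing pattern.
            private static readonly Regex Email = new Regex(@"^[^@\s]+@[^@\s]+\.[^@\s]+$");

            [Test]
            public void Matches_simple_address()
            {
                Assert.IsTrue(Email.IsMatch("user@example.com"));
            }

            [Test]
            public void Rejects_missing_at_sign()
            {
                Assert.IsFalse(Email.IsMatch("example.com"));
            }
        }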

    Read the article

  • Where do we put "asking the world" code when we separate computation from side effects?

    - by Alexey
    According to the Command-Query Separation principle, as well as the Thinking in Data and DDD with Clojure presentations, one should separate side effects (modifying the world) from computations and decisions, so that it is easier to understand and test both parts. This leaves an unanswered question: where, relative to the boundary, should we put "asking the world"? On the one hand, requesting data from external systems (like databases, external services' APIs, etc.) is not referentially transparent and thus should not sit together with pure computational and decision-making code. On the other hand, it's problematic, or maybe impossible, to tease this apart from the computational part and pass the data in as an argument, because we may not know in advance which data we will need to request.

    Read the article

  • Can't download web photo albums to Picasa

    - by Arcadie
    Someone has shared a Picasa web album (Limited, anyone with the link), but I can't download it to Picasa. The following alert appears: Firefox doesn't know how to open this address, because the protocol (picasa) isn't associated with any program. I have Picasa 3.0.0 installed on Ubuntu 11.04; I remember it saying something about registering the picasa protocol with Firefox during the installation. I have Firefox 6.0.2, and these settings are present in about:config network.protocol-handler.app.picasa;/usr/bin/picasa network.protocol-handler.expose.picasa;true network.protocol-handler.external.picasa;true Picasa is located here: $ which picasa /usr/bin/picasa Is there something I can do to make this work? PS: I hope this is not off-topic here, and I can't find the "picasa" tag. Could someone please add it, if appropriate?

    Read the article

  • Deleting Team Project in Team Foundation Server 2010

    - by Hosam Kamel
    I’m seeing a lot of people still using old techniques carried over from TFS 2008 to delete a team project, such as the TFSDeleteProject utility.   In TFS 2010 many administration tasks have been made much easier. To delete a team project, open the Administration Console, select Team Project Collections, select the project collection that contains the project you want to delete, and then navigate to Team Projects. Select the project, then click Delete; you will have the option to delete any external artifacts and workspace data too.   Hope it helps. Originally posted at "Hosam Kamel| Developer & Platform Evangelist"

    Read the article

  • The JRockit Performance Counters

    - by Marcus Hirt
    Every now and then I get a question regarding what the attributes in the PerfCounters dynamic MBean represent. Now, all the MBeans under the oracle.jrockit.management (bea.jrockit.management pre R28) domain are part of what we call JMXMAPI (the JRockit JMX-based Management API), which is unsupported. Therefore there is no official documentation for the API. I did, however, write a bit about JMXMAPI in my recent JRockit book, Oracle JRockit: The Definitive Guide. The information in the table below is from that book:
    java.cls.loadedClasses – The number of classes loaded since the start of the JVM.
    java.cls.unloadedClasses – The number of classes unloaded since the start of the JVM.
    java.property.java.class.path – The class path of the JVM.
    java.property.java.endorsed.dirs – The endorsed dirs. See the Endorsed Standards Override Mechanism.
    java.property.java.ext.dirs – The ext dirs, which are searched for jars that should be automatically put on the classpath. See the Java documentation for java.ext.dirs.
    java.property.java.home – The root of the JDK or JRE installation.
    java.property.java.library.path – The library path used to find user libraries.
    java.property.java.vm.version – The JRockit version.
    java.rt.vmArgs – The list of VM arguments.
    java.threads.daemon – The number of running daemon threads.
    java.threads.live – The total number of running threads.
    java.threads.livePeak – The peak number of threads that have been running since JRockit was started.
    java.threads.nonDaemon – The number of non-daemon threads running.
    java.threads.started – The total number of threads started since the start of JRockit.
    jrockit.gc.latest.heapSize – The current heap size in bytes.
    jrockit.gc.latest.nurserySize – The current nursery size in bytes.
    jrockit.gc.latest.oc.compaction.time – How long, in ticks, the last compaction lasted. Reset to 0 if compaction is skipped.
    jrockit.gc.latest.oc.heapUsedAfter – Used heap at the end of the last OC, in bytes.
    jrockit.gc.latest.oc.heapUsedBefore – Used heap at the start of the last OC, in bytes.
    jrockit.gc.latest.oc.number – The number of OCs that have occurred so far.
    jrockit.gc.latest.oc.sumOfPauses – The paused time for the last OC, in ticks.
    jrockit.gc.latest.oc.time – The time the last OC took, in ticks.
    jrockit.gc.latest.yc.sumOfPauses – The paused time for the last YC, in ticks.
    jrockit.gc.latest.yc.time – The time the last YC took, in ticks.
    jrockit.gc.max.oc.individualPause – The longest OC pause so far, in ticks.
    jrockit.gc.max.yc.individualPause – The longest YC pause so far, in ticks.
    jrockit.gc.total.oc.compaction.externalAborted – Number of aborted external compactions so far.
    jrockit.gc.total.oc.compaction.internalAborted – Number of aborted internal compactions so far.
    jrockit.gc.total.oc.compaction.internalSkipped – Number of skipped internal compactions so far.
    jrockit.gc.total.oc.compaction.time – The total time spent doing compaction so far, in ticks.
    jrockit.gc.total.oc.compaction.externalSkipped – Number of skipped external compactions so far.
    jrockit.gc.total.oc.pauseTime – The sum of all OC pause times so far, in ticks.
    jrockit.gc.total.oc.time – The total time spent doing OC so far, in ticks.
    jrockit.gc.total.pageFaults – The number of page faults that have occurred during GC so far.
    jrockit.gc.total.yc.pauseTime – The sum of all YC pause times, in ticks.
    jrockit.gc.total.yc.promotedObjects – The number of objects that all YCs have promoted.
    jrockit.gc.total.yc.promotedSize – The total number of bytes that all YCs have promoted, in bytes.
    jrockit.gc.total.yc.time – The total time spent doing YC, in ticks.
    oracle.ci.jit.count – The number of methods JIT compiled.
    oracle.ci.jit.timeTotal – The total time spent JIT compiling, in ticks.
    oracle.ci.opt.count – The number of methods optimized.
    oracle.ci.opt.timeTotal – The total time spent optimizing, in ticks.
    oracle.rt.counterFrequency – Used to convert ticks values to seconds.
    Note that many of these counters are excellent choices for attributes to plot in the Management Console. Also note that many values are in ticks – to convert them to seconds, divide by the value in the oracle.rt.counterFrequency counter.
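
    As an illustration of that tick conversion, here is a minimal sketch that reads a counter over JMX from inside a JRockit JVM and divides by oracle.rt.counterFrequency; the ObjectName used for the PerfCounters MBean is an assumption and may differ between releases (and, as noted above, the API is unsupported):

        import java.lang.management.ManagementFactory;
        import javax.management.MBeanServer;
        import javax.management.ObjectName;

        public class OcTime {
            public static void main(String[] args) throws Exception {
                MBeanServer server = ManagementFactory.getPlatformMBeanServer();
                // Assumed name of the PerfCounters dynamic MBean; verify on your release.
                ObjectName perf = new ObjectName("oracle.jrockit.management:type=PerfCounters");
                double ticks = ((Number) server.getAttribute(perf, "jrockit.gc.total.oc.time")).doubleValue();
                double freq  = ((Number) server.getAttribute(perf, "oracle.rt.counterFrequency")).doubleValue();
                System.out.println("Total OC time: " + (ticks / freq) + " s");
            }
        }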

    Read the article

  • Configuring Expert Search in Communicator 14 and SharePoint 2010

    Communicator 14 provides functionality to be able to search for contacts not just by name, but by skill.  For example, a customer service agent at an airline can search for other agents with Travel Advisory experience by typing the search criteria into the Communicator search box and performing a search by keyword.  The search results will return users who have specified that skill in their profile on their SharePoint My Site.  This is actually pretty easy to configure; I'll show you how. Create Search and People Search Results Pages in SharePoint: Communicator 14 Expert Search works by using the SharePoint 2010 Search Service to search SharePoint for user profiles with matching keywords.  This requires that you have an Enterprise Search site in your site collection, which includes the search service and also the People Results pages.  The easiest way to do this is to create a Search Center site in your site collection. Note: I get an error when trying to create an Enterprise Search site in a Team Site in the SharePoint 2010 RTM bits, so I created it as a site collection, which is evident in the URLs you see below. In the screenshots below, you can see that the URL of the SharePoint search service in the Search site collection is http://sps2010/sites/search/_vti_bin/search.asmx, and the URL of the People Search Results page is http://sps2010/sites/Search/Pages/peopleresults.aspx. Point Communications Server 14 to the Search and People Search Results Pages: For Communicator 14 to be able to perform an Expert Search, you need to configure Communications Server 14 to point to the Search Service and People Search Results page URLs. From a server with the OCS Core bits installed, fire up the Communications Server Management Shell and type Get-CsClientPolicy. Scroll down to the bottom of the output; we're interested in setting the values of SPSearchInternalURL, SPSearchExternalURL, SPSearchCenterInternalURL, and SPSearchCenterExternalURL. SPSearchInternalURL and SPSearchExternalURL correspond to the internal and external URLs of the SharePoint search service in the Search site collection, while SPSearchCenterInternalURL and SPSearchCenterExternalURL correspond to the internal and external URLs of the people search results pages. We'll use the Communications Server Management Shell to set the values of these CS policy properties. For simplicity, I'm only going to set the internal URLs here. Set-CsClientPolicy -SPSearchInternalURL http://sps2010/sites/search/_vti_bin/search.asmx     -SPSearchCenterInternalURL http://sps2010/sites/Search/Pages/peopleresults.aspx Log out and back into Communicator.  You can verify that these settings were applied by running the Get-CsClientPolicy cmdlet again from the Communications Server Management Shell. However, there's another super-secret ninja trick to verify that the settings were applied: find the Communicator icon in the Windows System Tray; hold down the Ctrl button; click (left) the Communicator icon in the Windows System Tray; do not depress the Ctrl button. You should now see an extra menu item called Configuration Information; click it. Scroll down and locate the Expert Search URL and SharePoint Search Center URL keys and verify that their values correspond to those you set using the Set-CsClientPolicy PowerShell cmdlet. Configure a SharePoint User Profile Import: I'm not going to provide detailed steps here except to say that you need to configure the SharePoint 2010 User Profile Service Application to import user account details from Active Directory on a scheduled basis.
    This is a critical step because there are several user profile properties, e.g. SipAddress, that are only populated by a user profile import.  When performing an Expert Search, Communicator can only render results for users who have a SipAddress specified. Add Skills to User Profiles: Navigate to your My Site and click on My Profile.  This page allows you to set many contact details that are searchable in SharePoint.  We're particularly interested in the Ask Me About property of a user's profile.  Expert Search searches against this property to find users with matching skills. Configure a SharePoint Search Crawl: Ensure that you have a scheduled job to crawl your Local SharePoint Sites content source.  Depending on how you have this configured, it will also crawl the My Site site collection and add user properties such as Ask Me About to the search index. That's it! SharePoint 2010 provides new social and collaboration features to help users find other users with similar skills or interests. Expert Search extends this functionality directly into Microsoft Communicator 14, allowing you to interact with the users directly from the search results.

    Read the article

  • Thanks to NxtGenUG Manchester - Hyper-V for Developers presentation now available for download

    - by Liam Westley
    Thanks to Steve and Andy at NxtGenUG Manchester for making me very welcome, and to the guys who didn't head down the pub for a Guinness for St Patrick's Day and came to NxtGen instead.  I hope you all got something from the presentation, if not technical insights, at least a can of Guinness or a Tunnocks caramel wafer as swag. As promised, here is the presentation in both PowerPoint and Adobe PDF format (with speaker notes), http://www.tigernews.co.uk/blog-twickers/nxtgenugmanc/hyperv4devs-ppt.zip http://www.tigernews.co.uk/blog-twickers/nxtgenugmanc/hyperv4devs-pdf.zip Since I gave the presentation, Microsoft has released XP Mode (Windows Virtual PC for use under Windows 7) without the requirement for hardware virtualisation. Read more about that here, http://blogs.msdn.com/Virtual_PC_Guy/ For anyone who has seen this presentation at other user groups, there is a new section at the end of the presentation dealing with the various networking configurations under Hyper-V; not connected, private network, internal network and external network.  This includes details of what these mean, and a Venn diagram to aid understanding of the implications.

    Read the article

  • Adding additional users to Ubuntu server and configuring Samba

    - by Ben
    I have installed Ubuntu Server 12.10 and during the install created a user for myself, ben; I now wish to add a second user, bill. I have an external drive that I have mounted to /media/storage and created a shared folder called Share; the owner of the folder is ben:ben. How do I grant bill access to the folder? I don't want to put him in the group ben. Once that is set up, I need to configure Samba & NFS; here is my Samba configuration: [Share] path = /media/storage/Share read only = no public = yes writeable = yes force user = ben How do I give both bill and ben access to the share via Samba? Thank you.
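
    One common approach is to use a shared group rather than either user's primary group; a minimal sketch, assuming the external drive is formatted with a Linux filesystem (the group name media is just an example):

        # Create a shared group and add both users to it.
        sudo addgroup media
        sudo adduser ben media
        sudo adduser bill media

        # Hand the folder to that group and keep new files in the group (setgid bit).
        sudo chgrp -R media /media/storage/Share
        sudo chmod -R 2775 /media/storage/Share

    In smb.conf you could then replace force user = ben with force group = media and add valid users = ben bill, so both accounts can read and write through the share.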

    Read the article

  • Technical Screencast Series

    - by Ben Griswold
    Noah and I have started to produce a series of technical screencasts. In the spirit of Dimecasts.net, we’re limiting each episode to ten minutes as we thought the development community could benefit from short, focused episodes. We’re just getting started, but I’m really pleased with our progress and I’m very excited about what’s to come.  The first three episodes are focused on the .NET stack (specifically around Visual Studio Solution Setup, Managing .NET External Dependencies and Working with the ASP.NET Membership Provider) but since we work for a mixed shop of .NET and Java development, I’m sure we’ll eventually introduce all sorts of topics. We’re currently putting together a list of shows. If you have suggestions, please let me know. I plan to post the episodes to johnnycoder as they roll out and who knows?  Maybe your screencast idea will show up next.

    Read the article

  • SSIS Snack: Data Flow Source Adapters

    - by andyleonard
    Introduction Configuring a Source Adapter in a Data Flow Task couples (binds) the Data Flow to an external schema. This has implications for dynamic data loads. "Why Can't I...?" I'm often asked a question similar to the following: "I have 17 flat files with different schemas that I want to load to the same destination database - how many Data Flow Tasks do I need?" I reply "17 different schemas? That's easy, you need 17 Data Flow Tasks." In his book Microsoft SQL Server 2005 Integration Services...(read more)

    Read the article

  • Oracle Consulting North America is now live on PeopleSoft Services Procurement and PeopleSoft Resource Management

    - by Howard Shaw
    Last month, Oracle's own internal consulting group (OCS North America) went live on PeopleSoft Services Procurement and PeopleSoft Resource Management to manage all aspects of identifying, recruiting, and deploying billable subcontractors on North America Applications customer consulting projects. The primary goals were to enhance the subcontractor staffing process, improve operational and informational processes, and improve collaboration between the Oracle NA Consulting Subcontractor Program and subcontractor suppliers. Over 200 registered external suppliers access the tool, review open needs and competitively bid their resources to work on NA Applications projects. This implementation highlights the usage of Oracle’s own solutions to streamline and enhance business operations, as the PeopleSoft 9.1 applications (Services Procurement and Resource Management) were deployed using Sun hardware, Oracle Enterprise Linux, and Oracle Virtual Machines.For more information, please navigate to the following web pages: PeopleSoft Services Procurement PeopleSoft Resource Management

    Read the article

  • How do I recover a BTRFS filesystem with "parent transid verify failed" errors?

    - by Evan P.
    I've got an external USB disk running btrfs. I use it for backups, and each time I do a backup I take a snapshot. However, it's giving me this error now: parent transid verify failed on 109973766144 wanted 1823 found 1821 parent transid verify failed on 109973766144 wanted 1823 found 1821 Obviously, this is a non-critical disk, but I have a few files on here that aren't available elsewhere. Is there any way to recover data from this disk? Maybe by mounting one of the snapshots as root?
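
    For what it's worth, a couple of non-destructive first steps are often suggested for this situation; a minimal sketch (/dev/sdX1 and the paths are placeholders for the actual device and destination):

        # Try mounting read-only using an older tree root.
        # Older kernels call the option "recovery"; newer ones use "usebackuproot".
        sudo mount -o ro,recovery /dev/sdX1 /mnt

        # If mounting fails entirely, btrfs restore can copy files out without mounting.
        sudo btrfs restore /dev/sdX1 /path/to/safe/location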

    Read the article

  • Introducing the ADF Desktop Integration Troubleshooting Guide

    - by Juan Camilo Ruiz
    Since the addition of ADF Desktop Integration to the ADF Framework, a number of customers, internal and external, have started extending their applications in use cases around integration with MS Excel. In an effort to share the knowledge collected since the product came out, we are happy to launch the ADF Desktop Integration Troubleshooting Guide, where users can find an active collection of best practices for figuring out how best to approach issues while using ADF Desktop Integration. Be sure to bookmark this link and check it out; plenty of scenarios are covered, and more will be added as we continue identifying them. 

    Read the article
