Search Results

Search found 29245 results on 1170 pages for 'self service'.

Page 502/1170 | < Previous Page | 498 499 500 501 502 503 504 505 506 507 508 509  | Next Page >

  • Friday Fun: Sydney Shark

    - by Asian Angel
    Another long week is finally coming to an end, so why not have a little fun to finish the week out before going home? In today’s game you become a powerful shark determined to turn the Sydney coastline into one long smorgasbord while causing as much mayhem and destruction as possible along the way. Note: You will most likely encounter a small video ad as the game is loading and then again when your final score is displayed. Both come with a small clickable link for skipping the ads.

    Read the article

  • Partner Webcast - Oracle SOA Suite 12c: Connect 4 Cloud, Mobile, IoT with On-premise

    - by Thanos Terentes Printzios
    The pace of new business projects continues to grow, from increasing customer self-service to seamlessly connecting all your back office and in-the-field applications. At the same time, increased integration complexity may seem inevitable as organizations are suddenly faced with the requirement to support three new integration challenges: » Cloud Integration - rapidly integrate a growing list of cloud applications with existing applications » Mobile Integration - the urgency to mobile-enable existing applications » IoT Integration - begin development on the latest trend of connecting Internet of Things (IoT) devices to your existing infrastructure. Oracle SOA Suite 12c, the latest version of the industry’s most complete and unified application integration and SOA solution, aims to simplify, accelerate and optimize integrations. Oracle SOA Suite 12c and its associated products - Oracle Managed File Transfer, Oracle Cloud and Application Adapters, B2B and healthcare integration - offer the industry’s most highly integrated platform for solving the increased integration challenges. Oracle SOA Suite 12c is a complete, integrated and best-of-breed platform. It enables next-generation integration capabilities through: · A unified toolset for the development of services and composite applications. · A standards-based platform that is service enabled and easily consumable by modern web applications, allowing enterprises to quickly and easily adapt to changes in their business and IT environments. · Greater visibility, controls and analytics to govern how services and processes are deployed, reused and changed across their entire lifecycle. Join us to find out more about the new features of Oracle SOA Suite 12c and how it enables you to reduce time to market for new project integration and to reduce integration cost and complexity. Oracle SOA Suite 12c simplifies integration by bringing together the disparate requirements of cloud, mobile, and IoT devices with existing on-premise applications. Agenda: Oracle SOA Suite 12c new Features Cloud Integration Mobile Enablement Internet of Things (IoT) Summary - Q&A Delivery Format This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. Presenter: Heba Fouad – FMW Specialist, Technology Adoption, ECEMEA Partner Business Development Date: Thursday, August 28th, 10am CEST (8am UTC/11am EEST) Duration: 1 hour Register Here For any questions please contact us at partner.imc-AT-beehiveonline.oracle-DOT-com

    Read the article

  • Hack Apart a Highlighter to Create UV-Reactive Flowers [Science]

    - by ETC
    College students have long been hacking apart highlighters to create glowing bottles of booze to line their dorm room walls. Far more interesting, however, is the application of the hack to flowers. Many of you may remember a science class experiment from years gone by wherein you put food coloring in a beaker along with some freshly cut white flowers; returning to the experiment a day later yielded flowers colored to match the dye you added. This little experiment relies on the same technique, only instead of blue food coloring the flowers suck up UV-reactive highlighter dye. Check out the video below to see the experiment in action: Have a fun science experiment to share? Let’s hear about it in the comments. Make Flowers Glow in the Dark (with Highlighter Fluid and UV Light) [YouTube via Make]

    Read the article

  • NTPD issue - syncs then slowly loses ground

    - by ethrbunny
    RHEL 5 workstation. Has been running smoothly for years. I did a 'pup' recently and followed with a nice, cleansing reboot. Afterwards the system had some startup issues: namely MySQL refused to start. It just went "...." for 5-10 minutes before I did another boot and skipped that step (using 'interactive'). This was the only service that didn't want to start normally. So now that the system is booted I've found that it doesn't want to stay in sync with the NTP master, and after 48 hours it is refusing any SSH other than root. NTPD: this service starts normally and gets a lock on 4 servers. Almost immediately it starts to lose ground and now (after 3 days) is almost 40 hours behind. If I stop/start the service it gets the lock, resets the system clock and starts losing ground again. The 'hwclock' is set properly and maintains its time. Login: when I (re)start the ntp server I am able to login normally. I assume this problem is due to losing sync with LDAP. This appears to be verified by LDAP errors in /var/log/messages. Suggestions on where to look? ADDENDUM: Tried deleting the 'drift' file. After a bit it gets recreated with 0.000. from /var/log/messages: Jan 17 06:54:01 aeolus ntpdate[5084]: step time server 129.95.96.10 offset 30.139216 sec Jan 17 06:54:01 aeolus ntpd[5086]: ntpd [email protected] Tue Oct 25 12:54:17 UTC 2011 (1) Jan 17 06:54:01 aeolus ntpd[5087]: precision = 1.000 usec Jan 17 06:54:01 aeolus ntpd[5087]: Listening on interface wildcard, 0.0.0.0#123 Disabled Jan 17 06:54:01 aeolus ntpd[5087]: Listening on interface wildcard, ::#123 Disabled Jan 17 06:54:01 aeolus ntpd[5087]: Listening on interface lo, ::1#123 Enabled Jan 17 06:54:01 aeolus ntpd[5087]: Listening on interface eth0, fe80::213:72ff:fe20:4080#123 Enabled Jan 17 06:54:01 aeolus ntpd[5087]: Listening on interface lo, 127.0.0.1#123 Enabled Jan 17 06:54:01 aeolus ntpd[5087]: Listening on interface eth0, 10.127.24.81#123 Enabled Jan 17 06:54:01 aeolus ntpd[5087]: kernel time sync status 0040 Jan 17 06:54:02 aeolus ntpd[5087]: frequency initialized 0.000 PPM from /var/lib/ntp/drift Jan 17 06:54:02 aeolus ntpd[5087]: system event 'event_restart' (0x01) status 'sync_alarm, sync_unspec, 1 event, event_unspec' (0xc010) You can see the 30 second offset. This was after about one minute of operation.
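    For reference, the standard first diagnostics for an ntpd that keeps losing sync (a sketch, not an answer from the thread; the server IP is taken from the log above):

        # Peer status: the '*' prefix marks the peer ntpd is actually syncing to;
        # no '*' after several minutes means ntpd never selected a source.
        ntpq -p

        # Kernel clock discipline state; an UNSYNC status here means ntpd
        # is not steering the clock at all.
        ntptime

        # Query a server's offset without stepping the local clock.
        ntpdate -q 129.95.96.10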

    Read the article

  • Establishing WebLogic Server HTTPS Trust of IIS Using a Microsoft Local Certificate Authority

    - by user647124
    Everyone agrees that self-signed and demo certificates for SSL and HTTPS should never be used in production, and are best avoided elsewhere too. Most self-signed and demo certificates are provided by vendors with the intention that they are used only to integrate within the same environment. In a vendor’s perfect world all application servers in a given enterprise are from the same vendor, which makes this lack of interoperability in a non-production environment an advantage. For those of us working in the real world, where not only do we not use a single vendor everywhere but have to make do with self-signed certificates for all but production, testing HTTPS between an IIS ASP.NET service provider and a WebLogic J2EE consumer application can be very frustrating to set up. It was for me, especially having found many blogs and discussion threads where various solutions were described but did not quite work and were all mostly similar but just a little bit different. To save both you and my future self (who always seems to forget the hardest-won lessons) all of the pain and suffering, I am recording the steps that finally worked here for reference and sanity. How You Know You Need This The first cold clutch of dread that tells you it is going to be a long day is when you attempt to read a WSDL published by IIS into WebLogic over HTTPS and you see the following: <Jul 30, 2012 2:51:31 PM EDT> <Warning> <Security> <BEA-090477> <Certificate chain received from myserver.mydomain.com - 10.555.55.123 was not trusted causing SSL handshake failure.> weblogic.wsee.wsdl.WsdlException: Failed to read wsdl file from url due to -- javax.net.ssl.SSLKeyException: [Security:090477]Certificate chain received from myserver02.mydomain.com - 10.555.55.123 was not trusted causing SSL handshake failure. The above is what started a three-day sojourn into searching for a solution. Even people who had solved it before would tell me how they did it, and then shrug when I demonstrated that the steps did not end in the success they claimed I would experience. Rather than torture you with the details of everything I did that did not work, here is what finally did work. Export the Certificates from IE First, take the offending WSDL URL and paste it into IE (if you have an internal Microsoft CA, you have IE, even if you don’t use it in favor of some other browser). To state the semi-obvious, if you received the error above there is a certificate configured for the IIS host of the service and the SSL port has been configured properly. Otherwise there would be a different error, usually about the site not being found or the connection failing. Once the WSDL loads, to the right of the address bar there will be a lock icon. Click the lock and then click View Certificates in the resulting dialog (if you do not have a lock icon but do have a Certificate Error message, see http://support.microsoft.com/kb/931850 for steps to install the certificate, then continue from the point of finding the lock icon). Figure 1: View Certificates in IE Next, select the Details tab in the resulting dialog. Figure 2: Use Certificate Details to Export Certificate Click Copy to File, then Next, then select the Base-64 encoded option for the format. Figure 3: Select the Base-64 encoded option for the format For the sake of simplicity, I chose to save this to the root of the WebLogic domain. It will work from anywhere, but later you will need to type in the full path rather than just the certificate name if you save it elsewhere. 
Figure 4: Browse to Save Location Figure 5: Save the Certificate to the Domain Root for Convenience This is the point where I ran into some confusion. Some articles mentioned exporting the entire chain of certificates. This supposedly works for some types of certificates, or if you have a few other tools and the time to learn them. The SSL experts out there already have these tools, know how to use them well, and should not be wasting their time reading this article meant for folks who just want to get things wired up and back to unit testing and development. For the rest of us, the easiest way to make sure things will work is to just export all the links in the chain individually and let WebLogic Server worry about re-assembling them into a chain (which it does quite nicely). While perhaps not the most elegant solution, the multi-step process is easy to repeat and uses only tools that are immediately available and require no learning curve. So… Next, go to Tools, then Internet Options, then the Content tab, and click Certificates. Go to the Trusted Root Certification Authorities tab and find the certificate root for your Microsoft CA cert (look for the Issuer of the certificate you exported earlier). Figure 6: Trusted Root Certification Authorities Tab Export this one the same way as before, with a different name. Figure 7: Use a Unique Name for Each Certificate Repeat this once more for the Intermediate Certification Authorities tab. Import the Certificates to the WebLogic Domain Now, open a command prompt, navigate to [WEBLOGIC_DOMAIN_ROOT]\bin and execute setDomainEnv. You should then be in the root of the domain. If not, CD to the domain root. Assuming you saved the certificate in the domain root, execute the following: keytool -importcert -alias [ALIAS-1] -trustcacerts -file [FULL PATH TO .CER 1] -keystore truststore.jks -storepass [PASSWORD] An example with the variables filled in is: keytool -importcert -alias IIS-1 -trustcacerts -file microsftcert.cer -keystore truststore.jks -storepass password After several lines of output you will be prompted with: Trust this certificate? [no]: The correct answer is ‘yes’ (minus the quotes, of course). You’ll know you were successful if the response is: Certificate was added to keystore If not, check your typing, as that is generally the source of an error at this point. Repeat this for all three of the certificates you exported, changing the [ALIAS-1] and [FULL PATH TO .CER 1] values each time. For example: keytool -importcert -alias IIS-1 -trustcacerts -file microsftcert.cer -keystore truststore.jks -storepass password keytool -importcert -alias IIS-2 -trustcacerts -file microsftcertRoot.cer -keystore truststore.jks -storepass password keytool -importcert -alias IIS-3 -trustcacerts -file microsftcertIntermediate.cer -keystore truststore.jks -storepass password In the above we created a new JKS keystore. You can re-use an existing one by changing the name of the JKS file to one you already have and changing the password to the one that matches that JKS file. For the DemoTrust.jks that is included with WebLogic the password is DemoTrustKeyStorePassPhrase. 
An example here would be: keytool -importcert -alias IIS-1 -trustcacerts -file microsoft.cer -keystore DemoTrust.jks -storepass DemoTrustKeyStorePassPhrase keytool -importcert -alias IIS-2 -trustcacerts -file microsoftRoot.cer -keystore DemoTrust.jks -storepass DemoTrustKeyStorePassPhrase keytool -importcert -alias IIS-3 -trustcacerts -file microsoftInter.cer -keystore DemoTrust.jks -storepass DemoTrustKeyStorePassPhrase Whichever keystore you use, you can check your work with: keytool -list -keystore truststore.jks -storepass password Where “truststore.jks” and “password” can be replaced appropriately if necessary. The output will look something like this: Figure 8: Output from keytool -list -keystore Update the WebLogic Keystore Configuration If you used an existing keystore rather than creating a new one, you can restart your WebLogic Server and skip the rest of this section. For those of us who created a new one because those were the instructions we found online… Next, we need to tell WebLogic to use the JKS file (truststore.jks) we just created. Log in to the WebLogic Server Administration Console and navigate to Servers > AdminServer > Configuration > Keystores. Scroll down to “Custom Trust Keystore:” and change the value to “truststore.jks”, then set “Custom Trust Keystore Passphrase:” and “Confirm Custom Trust Keystore Passphrase:” to the password you used earlier, then save your changes. You will get a nice message similar to the following: Figure 9: To Be Safe, Restart Anyways The “No restarts are necessary” is somewhat of an exaggeration. To be able to use the keystore you may need to restart the server(s). To save myself aggravation, I always do. Your mileage may vary. Conclusion That should get you there. If some of the steps included here turn out to be unnecessary for your situation in particular, I will offer up a semi-apology, as the process described above does not take long at all, and even if a step could be dropped from it, it is still much faster than trying to figure this out from other sources.
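    A side note beyond the original steps: on a machine with OpenSSL available, the server’s certificate chain can usually be captured without the IE export dance. A sketch:

        # Prints every certificate the IIS host presents during the TLS handshake.
        # Save each -----BEGIN CERTIFICATE----- block to its own .cer file and
        # import them with keytool exactly as above.
        openssl s_client -connect myserver.mydomain.com:443 -showcerts </dev/null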

    Read the article

  • MySQL Connect in 4 Days - Sessions From Users and Customers

    - by Bertrand Matthelié
    Let’s review today the conference sessions where users and customers will describe their use of MySQL as well as best practices. Remember you can plan your schedule with Schedule Builder. Saturday, 11.30 am, Room Golden Gate 7: MySQL and Hadoop—Chris Schneider, Ning.com Saturday, 1.00 pm, Room Golden Gate 7: Thriving in a MySQL Replicated World—Ed Presz and Andrew Yee, Ticketmaster Saturday, 1.00 pm, Room Golden Gate 8: Rick’s RoTs (Rules of Thumb)—Rick James, Yahoo! Saturday, 2.30 pm, Room Golden Gate 3: Scaling Pinterest—Yashwanth Nelapati and Evrhet Milam, Pinterest Saturday, 4.00 pm, Room Golden Gate 3: MySQL Pool Scanner: An Automated Service for Host Management—Levi Junkert, Facebook Sunday, 10.15 am, Room Golden Gate 3: Big Data Is a Big Scam (Most of the Time)—Daniel Austin, PayPal Sunday, 11.45 am, Room Golden Gate 3: MySQL at Twitter: Development and Deployment—Jeremy Cole and Davi Arnaut, Twitter Sunday, 1.15 pm, Room Golden Gate 3: CERN’s MySQL-as-a-Service Deployment with Oracle VM: Empowering Users—Dawid Wojcik and Eric Grancher, DBA, CERN Sunday, 2.45 pm, Room Golden Gate 3: Database Scaling at Mozilla—Sheeri Cabral, Mozilla Sunday, 5.45 pm, Room Golden Gate 4: MySQL Cluster Carrier Grade Edition @ El Chavo, Latin America’s #1 Facebook Game—Carlos Morales, Playful Play You can check out the full program here as well as in the September edition of the MySQL newsletter. Not registered yet? You can still save US$ 300 over the on-site fee – Register Now!

    Read the article

  • Hosting custom domains with IP address flexibility

    - by F21
    I am building a small service where users will be assigned a subdomain such as: myusername.myservice.com anotheruser.myservice.com I know that I can set up a wildcard vhost and, using some configuration regex, serve the files like so: myusername.myservice.com ===> /var/www/myusername anotherusername.myservice.com ===> /var/www/anotherusername The problem is that I would like to allow users to alias their own domain names to their service. I understand that for the webserver, once the user adds the domain via my web interface, I can easily create a vhost for the domain in nginx and then reload the webserver (a sketch of that step follows below). The problem is that I would prefer NOT to have users add an A record pointing at my webserver's IP address, as I would prefer to keep things flexible (for when we upgrade our infrastructure to something more complex in order to scale). What is the best way to achieve this?
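    For illustration, a minimal sketch of the vhost-generation step mentioned above (the paths, domain, and username are hypothetical; assumes a Debian-style nginx layout):

        #!/bin/bash
        # Generate and enable an nginx vhost for a user-supplied custom domain.
        DOMAIN="$1"      # e.g. www.customerdomain.com (illustrative)
        USERNAME="$2"    # e.g. myusername (illustrative)

        cat > "/etc/nginx/sites-available/${DOMAIN}" <<EOF
        server {
            listen 80;
            server_name ${DOMAIN};
            root /var/www/${USERNAME};
        }
        EOF

        ln -sf "/etc/nginx/sites-available/${DOMAIN}" "/etc/nginx/sites-enabled/${DOMAIN}"
        nginx -s reload   # pick up the new vhost without a full restart

    As for the IP-flexibility concern: the usual pattern is to have users add a CNAME record pointing at a hostname you control (say, proxy.myservice.com) rather than an A record, so the underlying IP can change by updating a single record on your side. The caveat is that a bare apex domain (customerdomain.com without www) cannot carry a CNAME under standard DNS rules, so apex users still need an A record or a provider-specific ALIAS/ANAME workaround.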

    Read the article

  • 10 Windows Tweaking Myths Debunked

    - by Chris Hoffman
    Windows is big, complicated, and misunderstood. You’ll still stumble across bad advice from time to time when browsing the web. These Windows tweaking, performance, and system maintenance tips are mostly just useless, but some are actively harmful. Luckily, most of these myths have been stomped out on mainstream sites and forums. However, if you start searching the web, you’ll still find websites that recommend you do these things. Erase Cache Files Regularly to Speed Things Up You can free up disk space by running an application like CCleaner, another temporary-file-cleaning utility, or even the Windows Disk Cleanup tool. In some cases, you may even see an old computer speed up when you erase a large number of useless files. However, running CCleaner or similar utilities every day to erase your browser’s cache won’t actually speed things up. It will slow down your web browsing as your web browser is forced to redownload the files all over again and reconstruct the cache you regularly delete. If you’ve installed CCleaner or a similar program and run it every day with the default settings, you’re actually slowing down your web browsing. Consider at least preventing the program from wiping out your web browser cache. Enable ReadyBoost to Speed Up Modern PCs Windows still prompts you to enable ReadyBoost when you insert a USB stick or memory card. On modern computers, this is completely pointless — ReadyBoost won’t actually speed up your computer if you have at least 1 GB of RAM. If you have a very old computer with a tiny amount of RAM — think 512 MB — ReadyBoost may help a bit. Otherwise, don’t bother. Open the Disk Defragmenter and Manually Defragment On Windows 98, users had to manually open the defragmentation tool and run it, ensuring no other applications were using the hard drive while it did its work. Modern versions of Windows are capable of defragmenting your file system while other programs are using it, and they automatically defragment your disks for you. If you’re still opening the Disk Defragmenter every week and clicking the Defragment button, you don’t need to do this — Windows is doing it for you unless you’ve told it not to run on a schedule. Modern computers with solid-state drives don’t have to be defragmented at all. Disable Your Pagefile to Increase Performance When Windows runs out of empty space in RAM, it swaps out data from memory to a pagefile on your hard disk. If a computer doesn’t have much memory and it’s running slow, it’s probably moving data to the pagefile or reading data from it. Some Windows geeks seem to think that the pagefile is bad for system performance and disable it completely. The argument seems to be that Windows can’t be trusted to manage a pagefile and won’t use it intelligently, so the pagefile needs to be removed. As long as you have enough RAM, it’s true that you can get by without a pagefile. However, if you do have enough RAM, Windows will only use the pagefile rarely anyway. Tests have found that disabling the pagefile offers no performance benefit. Enable CPU Cores in MSConfig Some websites claim that Windows may not be using all of your CPU cores or that you can speed up your boot time by increasing the number of cores used during boot. They direct you to the MSConfig application, where you can indeed select an option that appears to increase the number of cores used. In reality, Windows always uses the maximum number of processor cores your CPU has. 
(Technically, only one core is used at the beginning of the boot process, but the additional cores are quickly activated.) Leave this option unchecked. It’s just a debugging option that allows you to set a maximum number of cores, so it would be useful if you wanted to force Windows to only use a single core on a multi-core system — but all it can do is restrict the number of cores used. Clean Your Prefetch To Increase Startup Speed Windows watches the programs you run and creates .pf files in its Prefetch folder for them. The Prefetch feature works as a sort of cache — when you open an application, Windows checks the Prefetch folder, looks at the application’s .pf file (if it exists), and uses that as a guide to start preloading data that the application will use. This helps your applications start faster. Some Windows geeks have misunderstood this feature. They believe that Windows loads these files at boot, so your boot time will slow down due to Windows preloading the data specified in the .pf files. They also argue you’ll build up useless files as you uninstall programs and .pf files will be left over. In reality, Windows only loads the data in these .pf files when you launch the associated application and only stores .pf files for the 128 most recently launched programs. If you were to regularly clean out the Prefetch folder, not only would programs take longer to open because they won’t be preloaded, but Windows would also have to waste time recreating all the .pf files. You could also modify the PrefetchParameters setting to disable Prefetch, but there’s no reason to do that. Let Windows manage Prefetch on its own. Disable QoS To Increase Network Bandwidth Quality of Service (QoS) is a feature that allows your computer to prioritize its traffic. For example, a time-critical application like Skype could choose to use QoS and prioritize its traffic over a file-downloading program so your voice conversation would work smoothly, even while you were downloading files. Some people incorrectly believe that QoS always reserves a certain amount of bandwidth and this bandwidth is unused until you disable it. This is untrue. In reality, 100% of bandwidth is normally available to all applications unless a program chooses to use QoS. Even if a program does choose to use QoS, the reserved space will be available to other programs unless the program is actively using it. No bandwidth is ever set aside and left empty. Set DisablePagingExecutive to Make Windows Faster The DisablePagingExecutive registry setting is set to 0 by default, which allows drivers and system code to be paged to the disk. When set to 1, drivers and system code will be forced to stay resident in memory. Once again, some people believe that Windows isn’t smart enough to manage the pagefile on its own and that changing this option will force Windows to keep important files in memory rather than stupidly paging them out. If you have more than enough memory, changing this won’t really do anything. If you have little memory, changing this setting may force Windows to push programs you’re using to the pagefile rather than push unused system files there — this would slow things down. This is an option that may be helpful for debugging in some situations, not a setting to change for more performance. Process Idle Tasks to Free Memory Windows does things, such as creating scheduled system restore points, when you step away from your computer. 
It waits until your computer is “idle” so it won’t slow your computer and waste your time while you’re using it. Running the “Rundll32.exe advapi32.dll,ProcessIdleTasks” command forces Windows to perform all of these tasks while you’re using the computer. This is completely pointless and won’t help free memory or anything like that — all you’re doing is forcing Windows to slow your computer down while you’re using it. This command only exists so benchmarking programs can force idle tasks to run before performing benchmarks, ensuring idle tasks don’t start running and interfere with the benchmark. Delay or Disable Windows Services There’s no real reason to disable Windows services anymore. There was a time when Windows was particularly heavy and computers had little memory — think Windows Vista and those “Vista Capable” PCs Microsoft was sued over. Modern versions of Windows like Windows 7 and 8 are lighter than Windows Vista and computers have more than enough memory, so you won’t see any improvements from disabling system services included with Windows. Some people argue for not disabling services, however — they recommend setting services from “Automatic” to “Automatic (Delayed Start)”. By default, the Delayed Start option just starts services two minutes after the last “Automatic” service starts. Setting services to Delayed Start won’t really speed up your boot time, as the services will still need to start — in fact, it may lengthen the time it takes to get a usable desktop as services will still be loading two minutes after booting. Most services can load in parallel, and loading the services as early as possible will result in a better experience. The “Delayed Start” feature is primarily useful for system administrators who need to ensure a specific service starts later than another service. If you ever find a guide that recommends you set a little-known registry setting to improve performance, take a closer look — the change is probably useless. Want to actually speed up your PC? Try disabling useless startup programs that run on boot, increase your boot time, and consume memory in the background. This is a much better tip than doing any of the above, especially considering most Windows PCs come packed to the brim with bloatware.
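    As a quick aside (not from the original article): Windows has a built-in way to list those startup programs from a command prompt, which is a reasonable first step before reaching for third-party tools:

        wmic startup get Caption,Command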

    Read the article

  • Continuous Integration using Docker

    - by Leon Mergen
    One of the main advantages of Docker is the isolated environment it brings, and I want to leverage that advantage in my continuous integration workflow. A "normal" CI workflow goes something like this: Poll repository for changes Pull from repository Install dependencies Run tests In a Dockerized workflow, it would be something like this: Poll repository for changes Pull from repository Build docker image Run docker image as container Run tests Kill docker container My problem is with the "run tests" step: since Docker is an isolated environment, intuitively I would like to treat it as one; this means the preferred method of communication is sockets. However, this only works well in certain situations (a webapp, for example). When testing different kinds of services (for example, a background service that only communicates with a database), a different approach is required. What is the best way to approach this problem? Is it a problem with my application's design, and should I design it in a more TDD, service-oriented way that always listens on some socket? Or should I just give up on isolation, and do something like this: Poll repository for changes Pull from repository Build docker image Run docker image as container Open SSH session into container Run tests Kill docker container SSH'ing into the container seems like an ugly solution to me, since it requires deep knowledge of the contents of the container, and thus breaks the isolation. I would love to hear SO's different approaches to this problem.
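    For reference, one middle-ground sketch (assuming a Docker version recent enough to have `docker exec`, which runs a command inside an already-running container without any SSH daemon; the image name and test script are illustrative):

        #!/bin/bash
        # Build the image, start the container, run the tests inside it, clean up.
        docker build -t myapp-ci .            # image tag is illustrative
        CID=$(docker run -d myapp-ci)         # start the service under test

        docker exec "$CID" ./run-tests.sh     # run the suite inside the container
        RC=$?                                 # preserve the test exit code

        docker rm -f "$CID"                   # kill and remove the container
        exit $RC

    This keeps the test process inside the container's environment without baking an SSH server into the image, though it still assumes the test runner knows where the test script lives in the container.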

    Read the article

  • Reclaim Vertical UI Space by Moving Your Tabs to the Side in Firefox

    - by Asian Angel
    Are you looking for a way to move your tabs to the side in Firefox and gain access to more vertical UI space? The Vertical Tabs extension for Firefox lets you accomplish both in a matter of moments. As soon as you install the extension and restart Firefox the Tab Bar will be automatically converted into a shiny new Vertical Tabs Sidebar. All that you have to do is start enjoying the extra vertical UI space. Some things to keep in mind when using the extension are: You can easily adjust the width of the sidebar to suit your needs using the mouse (very nice!) The Firefox Menu Button, Panorama Button, and Tab Control controls move to the bottom of the sidebar (see screenshot above) You can group tabs if needed or desired There is no option available to move the sidebar to the right side of the browser at the moment The use of Personas themes (or other themes) may affect how the text for the tabs will look (i.e. a slightly fuzzy shadow effect when not selected as seen in the screenshot above) Note: Works with Firefox 4.0b7 – 4.0.* Install the Vertical Tabs Extension [Mozilla Add-ons]

    Read the article

  • iOS 5: View Details Of Used and Unused Space Of Your iCloud Account

    - by Gopinath
    Apple’s iCloud is an awesome free storage service that lets you store music, photos, apps, calendars, documents, and more in the cloud, and it wirelessly pushes them to all your iOS devices automatically. The free iCloud service offers everyone 5 GB of space to start with, and once you reach the cap you can subscribe to a premium account for a few dollars. If you would like to monitor your iCloud usage, here are the steps to follow on an iOS device: 1. Tap on the Settings app 2. Choose iCloud from the list of available options 3. From the list of iCloud settings tap on the Storage & Backup option 4. Under the Storage section you will find the details of iCloud usage – Total Storage and Available Storage via tech-recipes This article, titled iOS 5: View Details Of Used and Unused Space Of Your iCloud Account, was originally published at Tech Dreams.

    Read the article

  • links for 2011-03-02

    - by Bob Rhubart
    Oracle Technology Network Architect Day: Denver Registration is now open. Sessions will cover IT optimization and consolidation, cloud computing, the evolving role of enterprise IT, and more. (tags: oracle otn entarch event denver) SOA Suite Integration: Part 2: A basic BPEL process (The Shorten Spot) The latest post in Anthony Shorten's series about SOA Suite integration with Oracle Utilities Application Framework. (tags: oracle otn soa bpel soasuite) ADF: How to create web service based ADF pages The first in a promised series of three posts on the topic by Marianne Horsch. (tags: oracle soa webservices adf) David Butler: MDM Poised for Growth (Oracle Master Data Management) David says: "Businesses are talking about the need to fix master data before they can successfully move forward on SOA initiatives. And the growing demands for compliance continue to be a major driver." (tags: oracle otn mdm) Cloud governance is about more than security | The Pervasive Data Center - CNET News Legal and regulatory procedures, transparency, service levels, indemnification, and more are all part of a broader governance landscape that requires IT to work closely with business users. Read this blog post by Gordon Haff on The Pervasive Data Center. (tags: ping.fm) Senthilkumar Rajendran's Blog: Horizontal Scaling OBIEE 11g (tags: ping.fm) InfoQ: Searching Without Objectives Kenneth O. Stanley considers that innovation is stifled when we strictly follow a high goal, and that we progress more when we are inclined toward discovery rather than chasing an objective. (tags: ping.fm) InfoQ: Brownfield Software - Industrial Waste or Business Fertilizer? Josh Graham addresses 10 myths related to working on legacy software, attempting to prove that one can make good use of legacy code without having to rewrite the entire thing. (tags: ping.fm)

    Read the article

  • What happened to VM-based deployments?

    - by user128670
    I watched some MountainWest RubyConf 2014 talks and noticed an interesting theme. Many dynamic programming environments back in the old days used to be self-contained VM images, e.g. Smalltalk, GemStone/S. One could checkpoint, modify, and ship these images wholesale and have them up and running with very little effort. Fast forward to now, and I'm still using Makefiles to configure and install binaries. What happened?

    Read the article

  • CentOS iptables blocks after reboot

    - by bilal
    I have a CentOS 5 installation with KVM on my server. I am using NAT port forwarding to SSH into my virtual machines. I have several iptables rules and saved them in /etc/sysconfig/iptables. After reboot, I see all these rules when I type service iptables status, but I get a connection refused error. After typing service iptables restart everything works. I don't understand: why do I need to restart iptables again? Doesn't it start on reboot?
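    For reference, the usual first checks on CentOS 5 (a sketch of diagnostics, not a confirmed fix for this particular box):

        # Verify the iptables init script is actually enabled for the boot runlevels
        chkconfig --list iptables

        # If it is not, enable it for the standard multi-user runlevels
        chkconfig iptables on

        # While the working (post-restart) rules are loaded, persist them so the
        # saved file matches the rule set that is known to work
        service iptables save

    It is also worth comparing the output of `iptables -L -n -t nat` right after boot with the output right after the manual restart; if they differ, something later in the boot sequence (libvirt's network scripts, for example, which insert their own NAT rules) is modifying the tables after the saved rules load.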

    Read the article

  • Where to Perform Authentication in REST API Server?

    - by David V
    I am working on a set of REST APIs that needs to be secured so that only authenticated calls will be performed. There will be multiple web apps to service these APIs. Is there a best-practice approach as to where the authentication should occur? I have thought of two possible places. Have each web app perform the authentication by using a shared authentication service. This seems to be in line with tools like Spring Security, which is configured at the web app level. Protect each web app with a "gateway" for security. In this approach, the web app never receives unauthenticated calls. This seems to be the approach of Apache HTTP Server Authentication. With this approach, would you use Apache or nginx to protect it, or something else in between Apache/nginx and your web app? For additional reference, the authentication is similar to services like AWS that have a non-secret identifier combined with a shared secret key. I am also considering using HMAC. Also, we are writing the web services in Java using Spring. Update: To clarify, each request needs to be authenticated with the identifier and secret key. This is similar to how AWS REST requests work.
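    For illustration only (the header format, path, and keys below are hypothetical, loosely modeled on the AWS-style scheme mentioned above): the HMAC approach signs a canonical string with the shared secret and sends the non-secret identifier plus the signature on every request. A minimal sketch of computing and sending such a signature:

        ACCESS_KEY="AKID-EXAMPLE"                 # non-secret identifier (hypothetical)
        SECRET_KEY="super-secret"                 # shared secret (hypothetical)
        STRING_TO_SIGN="GET\n/api/v1/widgets\n2014-05-01T12:00:00Z"

        # HMAC-SHA256 over the canonical string, base64-encoded
        SIGNATURE=$(printf '%b' "$STRING_TO_SIGN" \
          | openssl dgst -sha256 -hmac "$SECRET_KEY" -binary | base64)

        curl -H "Authorization: HMAC ${ACCESS_KEY}:${SIGNATURE}" \
             "https://api.example.com/api/v1/widgets"

    Wherever the check lives (in each web app via a shared library, or centralized in a gateway), the server side recomputes the same HMAC from the stored secret and compares; the gateway placement keeps the apps free of unauthenticated traffic at the cost of another moving part.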

    Read the article

  • Bridging the gap between developers and testers with VS 2010

    - by Etienne Tremblay
    Hey everyone, I know it’s been an eternity since I blogged, but I have so much to do that I unfortunately need to prioritize.  Vincent Grondin and I did a 7-hour presentation on the new developer and tester tools available in the VS 2010 suite.  It was a blast.  We did it in front of an audience (around 120) and it was taped.  We did it as a play and really didn’t look at the crowd at all; we were training each other on the technology. It is now available for anyone that would like to watch it at this location: http://www.devteach.com/ALM-TFS2010-Bridgingthegap.aspx What we covered in the full day event was: Migration to TFS 2010 (10h00) 1- Migration of VSS to TFS (20 min.) 2- Automating the Build (Something you can't do with VSS) (20 min.) 3- User story (Real application context for this presentation) (20 min.) 10h00 Pause Manual Tests by Dev (11h30) 4- Adding a tester to the team (Intro to MTM) (20 min.) 5- Define tests (what is a white bug) (20 min.) 6- Fix the bug and show IntelliTrace and play back the test (20 min.) 12h15 Lunch Manual testing for maintenance (13h30) 7- Implement new feature (web service) and identify bug with MTM, branch for a production fix, and also add a new build script (20 min.) 8- Fix bug in production branch, play back tests, merge the change in main branch (20 min.) Manual testing with the lab manager (14h30) 9- Intro to Lab Manager and environments (20 min.) 10- Change build script to deploy to lab and test with web service in lab environment. (20 min.) 15h15 Pause Automate UI tests with Coded UI (15h30) 11- Reducing the effort of testing the UI (20 min.) 12- Repeating testing to make sure the application is working properly (20 min.) 13- Automate Coded UI with the Lab environment (20 min.) 16h30 Conclusions As you can see, lots of stuff!! Enjoy the show and let us know how you like it. Cheers, ET Technorati Tags: VS 2010,Testing Tools,ALM,Training

    Read the article

  • qpid-cpp-client won't update through yum

    - by alexus
    Somewhere around last week I received an update notification, so I tried "yum update" and this is what I'm getting... [alexus@wcmisdlin02 ~]$ sudo yum update Loaded plugins: refresh-packagekit Setting up Update Process Resolving Dependencies --> Running transaction check ---> Package qpid-cpp-client.x86_64 0:0.10-3.el6 will be updated --> Processing Dependency: libqpidclient.so.5()(64bit) for package: matahari-service-0.4.0-5.el6.x86_64 --> Processing Dependency: libqpidclient.so.5()(64bit) for package: matahari-host-0.4.0-5.el6.x86_64 --> Processing Dependency: libqpidclient.so.5()(64bit) for package: matahari-net-0.4.0-5.el6.x86_64 --> Processing Dependency: libqpidcommon.so.5()(64bit) for package: matahari-service-0.4.0-5.el6.x86_64 --> Processing Dependency: libqpidcommon.so.5()(64bit) for package: matahari-host-0.4.0-5.el6.x86_64 --> Processing Dependency: libqpidcommon.so.5()(64bit) for package: matahari-net-0.4.0-5.el6.x86_64 ---> Package qpid-cpp-client.x86_64 0:0.14-22.el6_3 will be an update ---> Package qpid-cpp-client-ssl.x86_64 0:0.10-3.el6 will be updated ---> Package qpid-cpp-client-ssl.x86_64 0:0.14-22.el6_3 will be an update ---> Package qpid-qmf.x86_64 0:0.10-6.el6 will be updated ---> Package qpid-qmf.x86_64 0:0.14-14.el6_3 will be an update --> Finished Dependency Resolution Error: Package: matahari-net-0.4.0-5.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) Requires: libqpidcommon.so.5()(64bit) Removing: qpid-cpp-client-0.10-3.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) libqpidcommon.so.5()(64bit) Updated By: qpid-cpp-client-0.14-22.el6_3.x86_64 (sl-security) Not found Error: Package: matahari-net-0.4.0-5.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) Requires: libqpidclient.so.5()(64bit) Removing: qpid-cpp-client-0.10-3.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) libqpidclient.so.5()(64bit) Updated By: qpid-cpp-client-0.14-22.el6_3.x86_64 (sl-security) Not found Error: Package: matahari-service-0.4.0-5.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) Requires: libqpidclient.so.5()(64bit) Removing: qpid-cpp-client-0.10-3.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) libqpidclient.so.5()(64bit) Updated By: qpid-cpp-client-0.14-22.el6_3.x86_64 (sl-security) Not found Error: Package: matahari-service-0.4.0-5.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) Requires: libqpidcommon.so.5()(64bit) Removing: qpid-cpp-client-0.10-3.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) libqpidcommon.so.5()(64bit) Updated By: qpid-cpp-client-0.14-22.el6_3.x86_64 (sl-security) Not found Error: Package: matahari-host-0.4.0-5.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) Requires: libqpidcommon.so.5()(64bit) Removing: qpid-cpp-client-0.10-3.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) libqpidcommon.so.5()(64bit) Updated By: qpid-cpp-client-0.14-22.el6_3.x86_64 (sl-security) Not found Error: Package: matahari-host-0.4.0-5.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) Requires: libqpidclient.so.5()(64bit) Removing: qpid-cpp-client-0.10-3.el6.x86_64 (@anaconda-ScientificLinux-201107271550.x86_64) libqpidclient.so.5()(64bit) Updated By: qpid-cpp-client-0.14-22.el6_3.x86_64 (sl-security) Not found You could try using --skip-broken to work around the problem You could try running: rpm -Va --nofiles --nodigest [alexus@wcmisdlin02 ~]$ Any ideas?

    Read the article

  • SüdLeasing Reduces Administrative Costs by Over €1.5 Million with the SOA-Based e-Lease

    - by franziska.schneider(at)oracle.com
    According to Dr. Buchacker, the SüdLeasing project e-Lease (electronic leasing process) created a tailor-made, highly extensible "platform for the future". Together with Oracle and long-standing Oracle partner PROMATIS, the company's business processes were uniformly mapped onto the new platform and streamlined, with existing legacy systems integrated as well. Today, operational workflows are automated, optimized, and flexibly evolved on this Oracle-based service-oriented architecture (SOA). Initially, the financial services company faced the challenge of improving turnaround times, collaboration, and service across the enterprise through business process streamlining. Beyond savings on file folders, filing materials, and archiving, the front-office ("Markt") and back-office ("Marktfolge") departments in particular were to mesh better through end-to-end IT support of their workflows. In parallel, the company intended to free up additional capacity by progressively relieving employees at the three main and processing locations as well as in the 19 sales branches. Shortly after the introduction of e-Lease in 2008, administrative costs at SüdLeasing headquarters had already fallen by around €1.5 million. Link to the complete customer reference "With the Oracle products used in the project and the know-how and commitment of the consultants, Oracle and PROMATIS contributed significantly to the success of e-Lease." - Dr. Ullrich Buchacker, Director and Head of IT/Organization, SüdLeasing GmbH.

    Read the article

  • AI for a mixed Turn Based + Real Time battle system - Is something "Gambit-like" the right approach?

    - by Jason L.
    This is maybe a question that's been asked 100 times in 1,000 different ways. I apologize for that :) I'm in the process of building the AI for a game I'm working on. The game is a turn-based one, in the vein of Final Fantasy, but it also has a set of things that happen in real time (reactions). I've experimented with FSMs, HFSMs, and behavior trees. None of them felt "right" to me; all felt either too limiting or too generic/big. The idea I'm toying with now is something like a "rules engine" that could be likened to the Gambit system from Final Fantasy 12. I would have a set of predefined personalities. Each of these personalities would have a set of conditions it would check on each event (turn start, time to react, etc.). These conditions would be priority ordered, and the first one that returns true would be the action I take. These conditions can also point to a "choice" action, which is just an action that will make a choice based on some utility function. Sort of a mix of FSM/HFSM and a utility-function approach. So, a "gambit" with the personality of "Healer" may look something like this: (ON) Ally HP = 0% - Choose "Relife" spell (ON) Ally HP < 50% - Choose Heal spell (ON) Self HP < 65% - Choose Heal spell (ON) Ally Debuff - Choose Debuff Removal spell (ON) Ally Lost Buff - Choose Buff spell Likewise, a "gambit" with the personality of "Aggressor" may look like this: (ON) Foe HP < 10% - Choose Attack skill (ON) Foe any - Choose target - Choose Attack skill (ON) Self Lost Buff - Choose Buff spell (ON) Foe HP = 0% - Taunt the player What I like about this approach is that it makes sense in my head. It would also be extremely easy to build an "AI editor" with an approach like this. What I'm worried about is: would it be too limiting? Would it maybe get too complicated? Does anyone have any experience with AI in turn-based games that could maybe provide me some insight into this approach, or suggest a different approach? Many thanks in advance!!!
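    Purely as an illustration of the priority-ordered, first-match evaluation described above (a sketch in shell for brevity; the stats, rules, and action names are all hypothetical):

        #!/bin/bash
        # First-match rule evaluation for the "Healer" gambit sketched above.
        ally_hp=40   # hypothetical battle state, in percent
        self_hp=80

        # "condition:action" pairs, highest priority first
        rules=(
          "ally_hp == 0:cast Relife on ally"
          "ally_hp < 50:cast Heal on ally"
          "self_hp < 65:cast Heal on self"
        )

        for rule in "${rules[@]}"; do
          cond=${rule%%:*}
          action=${rule#*:}
          # bash arithmetic evaluation expands the state variables in the condition
          if (( cond )); then
            echo "$action"   # the first rule that fires wins
            break
          fi
        done
        # prints: cast Heal on ally

    The appeal of this shape is exactly what makes an AI editor easy to build: the whole personality is a flat, ordered data table rather than code.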

    Read the article

  • Windows Intune, Cloud Desktop management

    - by David Nudelman
    As a part of Microsoft's cloud computing strategy, the Windows Intune beta was released today. Here’s a quick overview of what customers and IT consultants can do with the cloud service component of Windows Intune: Manage PCs through a web-based console: Windows Intune provides a web-based console for IT to administer their PCs. Administrators can manage PCs from anywhere. Manage updates: Administrators can centrally manage the deployment of Microsoft updates and service packs to all PCs. Protection from malware: Windows Intune helps protect PCs from the latest threats with malware protection built on the Microsoft Malware Protection Engine that you can manage through the web-based console. Proactively monitor PCs: Receive alerts on updates and threats so that you can proactively identify and resolve problems with your PCs—before they impact end users and your business. Provide remote assistance: Resolve PC issues, regardless of where you or your users are located, with remote assistance. Track hardware and software inventory: Track hardware and software assets used in your business to efficiently manage your assets, licenses, and compliance. Set security policies: Centrally manage update, firewall, and malware protection policies, even on remote machines outside the corporate network. And here is a quick video about Windows Intune For support and questions go to: TechNet Forums for Intune Regards, David Nudelman

    Read the article

  • N64oid Brings N64 Emulation to Android Devices

    - by ETC
    Craving some Ocarina of Time adventures, Super Mario 64 antics, or some Star Fox 64 flying on your Android device? N64oid brings retro emulation of Nintendo’s popular N64 console to Android devices. N64oid is an N64 console emulator for Android devices. You’ll need a copy of the $5.99 emulator, ROMs (from the usual sources, unless you’ve got a ROM ripping setup in your basement and a stack of old cartridges), and a suitably speedy Android device. Older Android devices will find the playback choppy and subpar, but newer and speedier devices like the Nexus-One and Samsung Galaxy should have no problem handling the emulator. Like all emulators N64oid is a work in progress and emulating an entire closed-system console on a totally different set of hardware is never a perfect 1:1 emulation, but if you’re a die hard fan of classic N64 titles (check out this list of top ranked titles to inspire some nostalgia) N64oid is worth the price of a burger for sure. N64oid [Android Market via Download Squad]

    Read the article

  • Case Study: Polystar Improves Telecom Network Performance with Embedded MySQL

    - by Bertrand Matthelié
    Polystar delivers and supports systems that increase the quality, revenue and customer satisfaction of telecommunication services. Headquartered in Sweden, Polystar helps operators worldwide, including Telia, Tele2, Telekom Malaysia and T-Mobile, to monitor their network performance and improve service levels. Challenges: Deliver complete turnkey solutions to customers integrating a database that ensures high performance at scale while being very easy to use, manage and optimize. Enable the implementation of distributed architectures including one database per server while maintaining a low Total Cost of Ownership (TCO). Avoid growing database complexity as the volume of mobile data to monitor and analyze drastically increases. Solution: Evaluation of several databases and selection of MySQL based on its high performance, manageability, and low TCO. The MySQL databases implemented within the Polystar solutions handle on average 3,000 to 5,000 transactions per second. Up to 50 million records are inserted every day in each database. Typical installations include between 50 and 100 MySQL databases, up to 300 for the largest ones. Data is then periodically aggregated, with the original records being overwritten, as detailed information becomes unnecessary to operators after a few weeks. The exponential growth in mobile data traffic driven by the proliferation of smartphones and usage of social media requires ever more powerful solutions to monitor, analyze and turn network data into actionable business intelligence. With MySQL, Polystar can deliver powerful, yet easy to manage, solutions to its customers. MySQL-based Polystar solutions enable operators to monitor, manage and improve the service levels of their telecom networks in over a dozen countries from a single location. The new and innovative MySQL features constantly delivered by Oracle help ensure that Polystar will be able to meet its customers’ needs as they evolve. “MySQL has been a great embedded database choice for us. It delivers the high performance we need while remaining very easy to use, manage and tune. Power and simplicity at its best.” Mats Söderlindh, COO at Polystar.

    Read the article

  • OpenSource License Validation [closed]

    - by Macmade
    I'm basically looking for some kind of FLOSS/OpenSource license validation service. I have special needs for some projects I'd like to open-source. I know there are actually tons of different FLOSS/OpenSource licenses, each one suitable for some specific purpose, and that creating a «new» one is not usually recommended. Anyway, even if I'm not an expert in the legal domain, I've got some experience with FLOSS/OpenSource at the legal level, and it seems there's just no license covering my needs. I actually wrote the license terms I'd like to use and contacted the FSF, asking them to review that license, as it seems (at least that's written on their website) they can do such review work. No answer. I tried repeatedly, but no luck. So I'm currently looking for an alternative legal review of that specific license. I don't mind paying for such a service, as long as I can be sure the license can be recognised as a FLOSS/OpenSource license. About the license: it's basically a mix of a BSD (three-clause) with the Boost Software License. The difference is about redistribution. Source code redistribution shall retain the copyright notices. The same applies to binary redistribution (like BSD), unless it's distributed as a library (more like Boost). I hope this question is OK for programmers.stackexchange. I'm usually more active on Stack Overflow, but this just seems the right place for such a question. So thank you for your time and enlightened advice. : )

    Read the article

  • How do I make my USB Bluetooth dongle work in Ubuntu 11.04? (Can't init device hci0: Connection timed out (110)) [closed]

    - by MaikoID
    I have a USB Bluetooth dongle: root@maiko-cce-lin:~# lsusb | grep Bluetooth Bus 001 Device 007: ID 0a12:0001 Cambridge Silicon Radio, Ltd Bluetooth Dongle (HCI mode) It isn't working properly: it hardly ever works, and when it does, it stops working on the next reboot. What I've tried: It isn't soft-blocked: root@maiko-cce-lin:~# rfkill list 0: phy0: Wireless LAN Soft blocked: no Hard blocked: no 1: hci0: Bluetooth Soft blocked: no Hard blocked: no My device is recognized by hciconfig: root@maiko-cce-lin:~# hciconfig -a hci0: Type: BR/EDR Bus: USB BD Address: 00:1F:81:00:01:1C ACL MTU: 1021:4 SCO MTU: 180:1 DOWN RX bytes:330 acl:0 sco:0 events:8 errors:0 TX bytes:24 acl:0 sco:0 commands:30 errors:22 Features: 0xff 0x3e 0x09 0x76 0x80 0x01 0x00 0x80 Packet type: DM1 DM3 DM5 DH1 DH3 DH5 HV1 HV2 HV3 Link policy: Link mode: SLAVE ACCEPT But I can't bring the hci0 interface up: root@maiko-cce-lin:~# hciconfig hci up Can't init device hci0: Connection timed out (110) I don't understand why. The hcitool command doesn't show any device: root@maiko-cce-lin:~# hcitool dev Devices: I've also tried restarting my bluetooth service with the command below and running all the previous commands again, without success: root@maiko-cce-lin:~# service bluetooth restart * Stopping bluetooth [ OK ] * Starting bluetooth [ OK ] root@maiko-cce-lin:~# The dongle works if you disconnect it from USB, wait a few seconds and connect it again, so there must be a better solution (one that doesn't involve physically removing the dongle!).
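    For reference, a hedged workaround sketch (run as root, as in the session above): the unplug/replug cycle can usually be simulated from software by unbinding and rebinding the device from the USB driver via sysfs. The port ID "1-4" below is illustrative — look up the real one for the CSR dongle under /sys/bus/usb/devices first.

        # List devices currently bound to the generic USB driver; the dongle's
        # port ID (something like "1-4") appears here.
        ls /sys/bus/usb/drivers/usb/

        # Simulate a replug: detach, pause, reattach (port ID is illustrative).
        echo -n "1-4" > /sys/bus/usb/drivers/usb/unbind
        sleep 2
        echo -n "1-4" > /sys/bus/usb/drivers/usb/bind

        # Then try bringing the interface up again.
        hciconfig hci0 up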

    Read the article

  • Are there any decent open-source search engine solutions?

    - by Nazariy
    A few weeks ago my friend asked me how hard it is to launch your own search engine service with a list of websites that are supposed to be crawled from time to time. The first thing that came to my mind was Google Custom Search; however, the pricing policy is quite tricky and would drain your budget if you reach 500K queries per year. Another solution I found here was SearchBlox, which can be compared to the Google Mini service. It's quite a good solution if you're planning to cover search over a small number of websites, but for larger projects it is not very handy. I also found a few other search platforms, like Lucene, Hadoop and Xapian, which seem to be quite powerful solutions for reaching Google-level search quality, and Nutch as a web crawler. Like most open-source projects, they share the same problem: a lack of comprehensive usage guidance and examples, and it's expected that you are an expert in the subject. I'm wondering if any of you are using these solutions; which of them would you recommend, and what should I be aware of?

    Read the article
