Search Results

Search found 73851 results on 2955 pages for 'time machine'.


  • Which programming language could I use for Natural Language Processing to extract clinical words?

    - by MACEE
    I am going to do entity extraction (like Named Entity Recognition) from clinical free text (unstructured raw text such as discharge summaries), and these entities could be medical problems, medical tests, or treatments. I am going to use one of the i2b2 datasets (https://www.i2b2.org/), in case you are familiar with them. I am new to the NLP (Natural Language Processing) field and need a programming language that supports NLP tasks and also connects easily to the available libraries of machine learning algorithms such as CRF. I don't know much Java, and I have heard about Python, Perl, and Scala, but I am not sure which one would be the best option for this task.
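    Python is one reasonable fit here, since libraries such as NLTK, scikit-learn, and sklearn-crfsuite cover tokenization, feature extraction, and CRF training. Below is a minimal, hedged sketch of CRF-based entity tagging assuming the sklearn-crfsuite package; the toy sentences, BIO labels, and feature function are placeholders you would replace with your own i2b2 preprocessing.

        # Minimal CRF tagging sketch (sklearn-crfsuite assumed installed).
        import sklearn_crfsuite

        def token_features(sentence, i):
            # Tiny illustrative feature set for the token at position i.
            word = sentence[i]
            return {
                "lower": word.lower(),
                "is_title": word.istitle(),
                "suffix3": word[-3:],
                "prev": sentence[i - 1].lower() if i > 0 else "<s>",
                "next": sentence[i + 1].lower() if i < len(sentence) - 1 else "</s>",
            }

        # Toy training data in BIO form (placeholders for real i2b2 annotations).
        train_sentences = [["Patient", "denies", "chest", "pain", "."],
                           ["Started", "on", "aspirin", "daily", "."]]
        train_labels = [["O", "O", "B-problem", "I-problem", "O"],
                        ["O", "O", "B-treatment", "O", "O"]]

        X_train = [[token_features(s, i) for i in range(len(s))] for s in train_sentences]
        crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=100)
        crf.fit(X_train, train_labels)

        test = ["Continue", "metformin", "for", "diabetes", "."]
        print(crf.predict([[token_features(test, i) for i in range(len(test))]]))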

    Read the article

  • Static IP Address on Ubuntu 12.04 Virtual Machine

    - by chrisnankervis
    I've setup a VM running Ubuntu 12.04 specifically for local web development and am having some problems ensuring it has a static IP address. A static IP address is important as I'm using the IP address in my hosts file to assign a .local suffix to addresses used both in browser and to connect to the correct database on the VM. Currently, every time I connect to a new network or my VM is assigned a new IP address I need to reconfigure my whole environment which is becoming quite a pain. It also probably doesn't help that the default-lease-time on the Ubuntu VM is set to 1800 by default. At the moment I'm using VMWare Fusion and the Network Adapter is enabled and set to "Autodetect" under Bridged Networking. I've tried to set a static IP address within the dhcpd.conf using the code below: host ubuntu { hardware ethernet 00:50:56:35:0f:f1; fixed-address: 192.168.100.100; } The fixed-address that I've used is also outside the range specified in the subnet block (which in this case is 192.168.100.128 to 192.168.100.254). I've tried adding and removing the network adapter and restarting my Mac after each time to no avail. Below is an ifconfig of the VM that might be of some help: eth0 Link encap:Ethernet HWaddr 00:50:56:35:0f:f1 inet addr:192.168.0.25 Bcast:192.168.0.255 Mask:255.255.255.0 inet6 addr: fe80::250:56ff:fe35:ff1/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:1624 errors:0 dropped:0 overruns:0 frame:0 TX packets:416 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:147348 (147.3 KB) TX bytes:41756 (41.7 KB) lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host UP LOOPBACK RUNNING MTU:16436 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:0 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:0 (0.0 B) TX bytes:0 (0.0 B) Are there any specific issues with 12.04 that I'm missing? Otherwise has anyone else got any ideas? Thanks in advance.
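    One thing worth checking, offered only as a guess: ISC dhcpd syntax normally has no colon after fixed-address (i.e. fixed-address 192.168.100.100;). Alternatively, a static address can be configured inside the guest instead of via VMware's DHCP. A minimal sketch for Ubuntu 12.04's /etc/network/interfaces follows; the address, gateway, and DNS values are placeholders and would need to match whatever the bridged network actually uses.

        # /etc/network/interfaces (sketch; values are placeholders)
        auto lo
        iface lo inet loopback

        auto eth0
        iface eth0 inet static
            address 192.168.100.100
            netmask 255.255.255.0
            gateway 192.168.100.1
            dns-nameservers 192.168.100.1

    After editing, restarting networking (or simply rebooting the VM) should apply the fixed address.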

    Read the article

  • how to make a continuous machine gun sound-effect

    - by Jan
    I am trying to make an entity fire one or more machine guns. For each gun I store the time between shots (1.0 / firing rate) and the time since the last shot. I have also loaded ~10 different gun-shot sound effects. Now, for each gun I do the following:

        function update(deltatime):
            timeSinceLastShot += deltatime
            if timeSinceLastShot >= timeBetweenShots + verySmallRandomValue():
                timeSinceLastShot -= timeBetweenShots
                if gunIsFiring:
                    displayMuzzleFlash()
                    spawnBullet()
                    selectRandomSound().play()

    But now I often get a crackling noise (which I assume happens when two or more guns fire at the same time and confuse the sound device). My question is whether A) this is a common problem with a well-known solution, maybe to do with channels or something, or B) I am using a completely wrong approach to the task. I had a look at some sound assets for other games and they used complete bursts with multiple shots. I suppose I could try that, but I would like to have organic little hiccups in the gun fire (that's what the random value is for) to make the game more gritty and dirty. I am using Panda3D, but I had the exact same problem in PyGame and SDL. [edit] Thanks a lot for the answers so far! One more problem with faking it, though: how do I stop the sound? Let's say I have an effect with 5 bangs... *bang* *bang* *bang* *bang* *bang* ...and I magically manage to loop it so that there is no gap or overlap if the player fires more than 5 shots. Now, what do I do if the player stops firing halfway through the third bang? How do I know how long to keep playing the sample so that the third bang is completed and I can start playing the rumbling echo of the last shot? Of course I can look up the shot/pause timing of that sound sample and code accordingly, but it feels extremely hacky.
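    One commonly used workaround, sketched below with pygame.mixer since PyGame is mentioned (file names are placeholders): keep a seamless one-bang loop playing on a dedicated channel while the trigger is held, and on release stop the loop and play a short "tail" sample containing the final bang plus its echo, so the burst always ends cleanly no matter when the player lets go.

        # Loop-plus-tail sketch with pygame.mixer; mg_loop.wav / mg_tail.wav are placeholders.
        import pygame

        pygame.mixer.init()
        pygame.mixer.set_num_channels(16)                # headroom for several guns at once
        loop_sound = pygame.mixer.Sound("mg_loop.wav")   # loops seamlessly on itself
        tail_sound = pygame.mixer.Sound("mg_tail.wav")   # last bang plus rumbling echo
        gun_channel = pygame.mixer.Channel(0)            # one dedicated channel per gun

        firing = False

        def set_trigger(pressed):
            """Call this when the fire button changes state."""
            global firing
            if pressed and not firing:
                gun_channel.play(loop_sound, loops=-1)   # endless burst while held
                firing = True
            elif not pressed and firing:
                gun_channel.stop()                       # cut the loop...
                gun_channel.play(tail_sound)             # ...and finish with the tail
                firing = False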

    Read the article

  • Why is Android VM-based? [closed]

    - by adib
    By about 2004, it was clear that ARM was the clear winner for mobile CPUs, beating out MIPS, SH3, and DragonBall. PocketPC (Windows Mobile) applications were natively compiled (at least most of them, except for .NET Compact and its competitors). Likewise, Apple's iOS (named iPhone OS at the time) prefers natively compiled applications. So why did Android choose a virtual-machine-based system stack (the Dalvik VM)? Wouldn't it be simpler to just compile applications down to ARM code using GCJ or something? Was the decision influenced by the J2ME way of doing things, or was it just because it's "cool"? Perhaps, like most things Java, a culture that prefers multiple levels of indirection and abstraction simply added another layer of abstraction "just in case"?

    Read the article

  • Google I/O 2010 - Make your app real-time with PubSubHubbub

    Google I/O 2010 - Make your application real-time with PubSubHubbub (Social Web 201, Brett Slatkin). This session will go over how to add support for the PubSubHubbub protocol to your website. You'll learn how to turn Atom and RSS feeds into real-time streams. We'll go over how to consume real-time data streams and how to make your website reactive to what's happening on the web right now. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 55:46.
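    As a rough illustration of the protocol the session covers: a subscriber registers a callback URL with the feed's hub via a form-encoded POST, and the hub later verifies the subscription by calling the callback with a hub.challenge value that must be echoed back. This is an unofficial Python sketch; the hub, topic, and callback URLs are placeholders.

        # Minimal PubSubHubbub subscription request (URLs are placeholders).
        import requests

        resp = requests.post(
            "https://pubsubhubbub.appspot.com/",              # hub advertised by the feed
            data={
                "hub.mode": "subscribe",
                "hub.topic": "http://example.com/feed.atom",
                "hub.callback": "http://example.com/push-callback",
                "hub.verify": "async",
            },
        )
        print(resp.status_code)   # 202 means the hub accepted and will verify asynchronously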

    Read the article

  • Construct sentences from tabular data

    - by Sumeet
    I have a huge set of HTML files and have to retrieve the meaningful information from them. Most of the task is accomplished; the remaining problem is HTML tables. I have some literature on how to extract meaningful tables from HTML, but my problem is creating meaningful sentences from tabular data (or attribute-value pairs extracted from a table). Are there any NLP/machine learning techniques to do this? Here is what I expect. Suppose below is a sample table row: col_Name: Sumeet, col_year: 2011, col_winner: quiz. Can this be turned into something meaningful like "Sumeet won quiz in 2011"?
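    A simple baseline that needs no machine learning is template-based realization: map known column combinations to sentence templates and fill them from the attribute-value pairs, falling back to a generic verbalization for unknown columns. A small self-contained sketch using the example row:

        # Template-based sentence realization from attribute/value pairs.
        row = {"col_Name": "Sumeet", "col_year": "2011", "col_winner": "quiz"}

        templates = {
            ("col_Name", "col_winner", "col_year"): "{col_Name} won {col_winner} in {col_year}.",
        }

        def realize(row, templates):
            for columns, template in templates.items():
                if all(col in row for col in columns):
                    return template.format(**row)
            # Fallback: just verbalize the pairs.
            return "; ".join("{} is {}".format(k.replace("col_", ""), v) for k, v in row.items())

        print(realize(row, templates))   # Sumeet won quiz in 2011.

    Statistical NLG approaches (for example, learning templates from aligned table/text corpora) build on the same idea, but a hand-written template layer is usually the first step.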

    Read the article

  • Is it possible to make the desktop background transparent or otherwise hidden?

    - by EndangeredMassa
    I'm running Ubuntu in a virtual machine via VirtualBox. I have the seamless mode turned on, which is pretty cool. However, if I move an Ubuntu window around quickly, I can see the redraw of the ubuntu background quickly before it's hidden by VirtualBox again. This isn't a huge deal, but I'd like to fix it, if possible. I see two possible options that don't involve changing VirtualBox code: 1. Make the Ubuntu desktop transparent 2. Make the Ubuntu desktop hidden entirely Is it possible to do either? I know that Compiz Fusion has/had a feature to do this for their cube effects, but I don't think that I can run this on the VM. And, even if I could, I don't want to run those services for this one small feature.

    Read the article

  • JavaScript malware analysis

    - by begueradj
    I want to test websites for the presence of JavaScript malware. I plan to develop a Python program that sends the URL of a given website to a virtual machine, where the dynamic execution of any malicious JavaScript embedded in the website's page is monitored. My questions: Should my VM be Windows or Linux? What if the malware damages my VM: is there a hint on how to avoid that, or should I launch a new VM automatically instead? If I use a telnet client library to communicate with the VM, must I implement a server within the VM to handle my queries, or can I avoid this? I am just looking for hints and general ideas. Thank you for any help.
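    For the "what if the malware damages my VM" part, one common pattern is to treat the VM as disposable: revert to a clean snapshot before every URL. A hedged sketch driving VirtualBox from Python is below; the VM name, snapshot name, and the two guest-communication helpers are placeholders (some small agent or server inside the guest is indeed needed, whether it speaks telnet or anything else).

        # Disposable-VM sketch using VirtualBox's command-line interface.
        import subprocess
        import time

        VM = "analysis-vm"            # placeholder VM name
        CLEAN_SNAPSHOT = "clean"      # snapshot taken of the pristine guest

        def analyze(url):
            subprocess.check_call(["VBoxManage", "snapshot", VM, "restore", CLEAN_SNAPSHOT])
            subprocess.check_call(["VBoxManage", "startvm", VM, "--type", "headless"])
            try:
                send_url_to_guest(url)        # e.g. over a host-only network socket
                time.sleep(120)               # let the in-guest monitor run
                return collect_guest_results()
            finally:
                subprocess.check_call(["VBoxManage", "controlvm", VM, "poweroff"])

        def send_url_to_guest(url):
            raise NotImplementedError("hand the URL to your in-guest agent")

        def collect_guest_results():
            raise NotImplementedError("fetch the monitor's logs back to the host")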

    Read the article

  • How to display time in the top panel?

    - by Mörre
    I thought I already had the time up there in the top bar, and it may have been so in previous Ubuntu versions (don't remember, my Ubuntu laptop is just one of three computers I use). Only that I just noticed - me being someone who never wears a watch, has the cellphone turned off 95% of the time and relying on the computer to tell the time - that there is no time being displayed anywhere, and I had expected it in the top bar on the Unity desktop. I searched around but found no obvious solution, but I'm sure someone immediately knows how I can get my time (back?) into the top bar?

    Read the article

  • Programming Language Family Tree?

    - by user134353
    As a man interested in programming, I must ask if there is a cataloged hierarchy of languages. I'd like to learn to actually understand what's happening- that is to say, I don't want to use a compiler until I understand what a compiler does and how to make my own. I really do want to start from total scratch. I'm told that means "machine code"? I don't know. What I do know is that "C++" is not the start. I'm not interested in learning that until I can actually break software down to its very base and see how the pieces go together.

    Read the article

  • Why does my MacBook Pro have long ping times over Wi-Fi?

    - by randynov
    I have been having problems connecting with my Wi-Fi. It is weird, the ping times to the router (<30 feet away) seem to surge, often getting over 10 seconds before slowly coming back down. You can see the trend below. I'm on a MacBook Pro and have done the normal stuff (reset the PRAM and SMC, changed wireless channels, etc.). It happens across different routers, so I think it must be my laptop, but I don't know what it could be. The RSSI value hovers around -57, but I've seen the transmit rate flip between 0, 48 and 54. The signal strength is ~60% with 9% noise. Currently, there are 17 other wireless networks in range, but only one in the same channel. 1 - How can I figure out what's going on? 2 - How can I correct the situation? PING 192.168.1.1 (192.168.1.1): 56 data bytes 64 bytes from 192.168.1.1: icmp_seq=0 ttl=254 time=781.107 ms 64 bytes from 192.168.1.1: icmp_seq=1 ttl=254 time=681.551 ms 64 bytes from 192.168.1.1: icmp_seq=2 ttl=254 time=610.001 ms 64 bytes from 192.168.1.1: icmp_seq=3 ttl=254 time=544.915 ms 64 bytes from 192.168.1.1: icmp_seq=4 ttl=254 time=547.622 ms 64 bytes from 192.168.1.1: icmp_seq=5 ttl=254 time=468.914 ms 64 bytes from 192.168.1.1: icmp_seq=6 ttl=254 time=237.368 ms 64 bytes from 192.168.1.1: icmp_seq=7 ttl=254 time=229.902 ms 64 bytes from 192.168.1.1: icmp_seq=8 ttl=254 time=11754.151 ms 64 bytes from 192.168.1.1: icmp_seq=9 ttl=254 time=10753.943 ms 64 bytes from 192.168.1.1: icmp_seq=10 ttl=254 time=9754.428 ms 64 bytes from 192.168.1.1: icmp_seq=11 ttl=254 time=8754.199 ms 64 bytes from 192.168.1.1: icmp_seq=12 ttl=254 time=7754.138 ms 64 bytes from 192.168.1.1: icmp_seq=13 ttl=254 time=6754.159 ms 64 bytes from 192.168.1.1: icmp_seq=14 ttl=254 time=5753.991 ms 64 bytes from 192.168.1.1: icmp_seq=15 ttl=254 time=4754.068 ms 64 bytes from 192.168.1.1: icmp_seq=16 ttl=254 time=3753.930 ms 64 bytes from 192.168.1.1: icmp_seq=17 ttl=254 time=2753.768 ms 64 bytes from 192.168.1.1: icmp_seq=18 ttl=254 time=1753.866 ms 64 bytes from 192.168.1.1: icmp_seq=19 ttl=254 time=753.592 ms 64 bytes from 192.168.1.1: icmp_seq=20 ttl=254 time=517.315 ms 64 bytes from 192.168.1.1: icmp_seq=37 ttl=254 time=1.315 ms 64 bytes from 192.168.1.1: icmp_seq=38 ttl=254 time=1.035 ms 64 bytes from 192.168.1.1: icmp_seq=39 ttl=254 time=4.597 ms 64 bytes from 192.168.1.1: icmp_seq=21 ttl=254 time=18010.681 ms 64 bytes from 192.168.1.1: icmp_seq=22 ttl=254 time=17010.449 ms 64 bytes from 192.168.1.1: icmp_seq=23 ttl=254 time=16010.430 ms 64 bytes from 192.168.1.1: icmp_seq=24 ttl=254 time=15010.540 ms 64 bytes from 192.168.1.1: icmp_seq=25 ttl=254 time=14010.450 ms 64 bytes from 192.168.1.1: icmp_seq=26 ttl=254 time=13010.175 ms 64 bytes from 192.168.1.1: icmp_seq=27 ttl=254 time=12010.282 ms 64 bytes from 192.168.1.1: icmp_seq=28 ttl=254 time=11010.265 ms 64 bytes from 192.168.1.1: icmp_seq=29 ttl=254 time=10010.285 ms 64 bytes from 192.168.1.1: icmp_seq=30 ttl=254 time=9010.235 ms 64 bytes from 192.168.1.1: icmp_seq=31 ttl=254 time=8010.399 ms 64 bytes from 192.168.1.1: icmp_seq=32 ttl=254 time=7010.144 ms 64 bytes from 192.168.1.1: icmp_seq=33 ttl=254 time=6010.113 ms 64 bytes from 192.168.1.1: icmp_seq=34 ttl=254 time=5010.025 ms 64 bytes from 192.168.1.1: icmp_seq=35 ttl=254 time=4009.966 ms 64 bytes from 192.168.1.1: icmp_seq=36 ttl=254 time=3009.825 ms 64 bytes from 192.168.1.1: icmp_seq=40 ttl=254 time=16000.676 ms 64 bytes from 192.168.1.1: icmp_seq=41 ttl=254 time=15000.477 ms 64 bytes from 192.168.1.1: icmp_seq=42 ttl=254 time=14000.388 ms 64 bytes from 192.168.1.1: 
icmp_seq=43 ttl=254 time=13000.549 ms 64 bytes from 192.168.1.1: icmp_seq=44 ttl=254 time=12000.469 ms 64 bytes from 192.168.1.1: icmp_seq=45 ttl=254 time=11000.332 ms 64 bytes from 192.168.1.1: icmp_seq=46 ttl=254 time=10000.339 ms 64 bytes from 192.168.1.1: icmp_seq=47 ttl=254 time=9000.338 ms 64 bytes from 192.168.1.1: icmp_seq=48 ttl=254 time=8000.198 ms 64 bytes from 192.168.1.1: icmp_seq=49 ttl=254 time=7000.388 ms 64 bytes from 192.168.1.1: icmp_seq=50 ttl=254 time=6000.217 ms 64 bytes from 192.168.1.1: icmp_seq=51 ttl=254 time=5000.084 ms 64 bytes from 192.168.1.1: icmp_seq=52 ttl=254 time=3999.920 ms 64 bytes from 192.168.1.1: icmp_seq=53 ttl=254 time=3000.010 ms 64 bytes from 192.168.1.1: icmp_seq=54 ttl=254 time=1999.832 ms 64 bytes from 192.168.1.1: icmp_seq=55 ttl=254 time=1000.072 ms 64 bytes from 192.168.1.1: icmp_seq=58 ttl=254 time=1.125 ms 64 bytes from 192.168.1.1: icmp_seq=59 ttl=254 time=1.070 ms 64 bytes from 192.168.1.1: icmp_seq=60 ttl=254 time=2.515 ms

    Read the article

  • Setting up and using Bing Translate API Service for Machine Translation

    - by Rick Strahl
    Last week I spent quite a bit of time trying to set up the Bing Translate API service. I can honestly say this was one of the most screwed up developer experiences I've had in a long while - specifically related to the byzantine sign up process that Microsoft has in place. Not only is it nearly impossible to find decent documentation on the required signup process, some of the links in the docs are just plain wrong, and some of the account pages you need to access the actual account information once signed up are not linked anywhere from the administration UI. To make things even harder is the fact that the APIs changed a while back, with a completely new authentication scheme that's described and not directly linked documentation topic also made for a very frustrating search experience. It's a bummer that this is the case too, because the actual API itself is easy to use and works very well - fast and reasonably accurate (as accurate as you can expect machine translation to be). But the sign up process is a pain in the ass doubtlessly leaving many people giving up in frustration. In this post I'll try to hit all the points needed to set up to use the Bing Translate API in one place since such a document seems to be missing from Microsoft. Hopefully the API folks at Microsoft will get their shit together and actually provide this sort of info on their site… Signing Up The first step required is to create a Windows Azure MarketPlace account. Go to: https://datamarket.azure.com/ Sign in with your Windows Live Id If you don't have an account you will be taken to a registration page which you have to fill out. Follow the links and complete the registration. Once you're signed in you can start adding services. Click on the Data Link on the main page Select Microsoft Translator from the list This adds the Microsoft Bing Translator to your services. Pricing The page shows the pricing matrix and the free service which provides 2 megabytes for translations a month for free. Prices go up steeply from there. Pricing is determined by actual bytes of the result translations used. Max translations are 1000 characters so at minimum this means you get around 2000 translations a month for free. However most translations are probable much less so you can expect larger number of translations to go through. For testing or low volume translations this should be just fine. Once signed up there are no further instructions and you're left in limbo on the MS site. Register your Application Once you've created the Data association with Translator the next step is registering your application. To do this you need to access your developer account. Go to https://datamarket.azure.com/developer/applications/register Provide a ClientId, which is effectively the unique string identifier for your application (not your customer id!) Provide your name The client secret was auto-created and this becomes your 'password' For the redirect url provide any https url: https://microsoft.com works Give this application a description of your choice so you can identify it in the list of apps Now, once you've registered your application, keep track of the ClientId and ClientSecret - those are the two keys you need to authenticate before you can call the Translate API. Oddly the applications page is hidden from the Azure Portal UI. I couldn't find a direct link from anywhere on the site back to this page where I can examine my developer application keys. 
To find them you can go to: https://datamarket.azure.com/developer/applications You can come back here to look at your registered applications and pick up the ClientID and ClientSecret. Fun eh? But we're now ready to actually call the API and do some translating. Using the Bing Translate API The good news is that after this signup hell, using the API is pretty straightforward. To use the translation API you'll need to actually use two services: You need to call an authentication API service first, before you can call the actual translator API. These two APIs live on different domains, and the authentication API returns JSON data while the translator service returns XML. So much for consistency. Authentication The first step is authentication. The service uses oAuth authentication with a  bearer token that has to be passed to the translator API. The authentication call retrieves the oAuth token that you can then use with the translate API call. The bearer token has a short 10 minute life time, so while you can cache it for successive calls, the token can't be cached for long periods. This means for Web backend requests you typically will have to authenticate each time unless you build a more elaborate caching scheme that takes the timeout into account (perhaps using the ASP.NET Cache object). For low volume operations you can probably get away with simply calling the auth API for every translation you do. To call the Authentication API use code like this:/// /// Retrieves an oAuth authentication token to be used on the translate /// API request. The result string needs to be passed as a bearer token /// to the translate API. /// /// You can find client ID and Secret (or register a new one) at: /// https://datamarket.azure.com/developer/applications/ /// /// The client ID of your application /// The client secret or password /// public string GetBingAuthToken(string clientId = null, string clientSecret = null) { string authBaseUrl = https://datamarket.accesscontrol.windows.net/v2/OAuth2-13; if (string.IsNullOrEmpty(clientId) || string.IsNullOrEmpty(clientSecret)) { ErrorMessage = Resources.Resources.Client_Id_and_Client_Secret_must_be_provided; return null; } var postData = string.Format("grant_type=client_credentials&client_id={0}" + "&client_secret={1}" + "&scope=http://api.microsofttranslator.com", HttpUtility.UrlEncode(clientId), HttpUtility.UrlEncode(clientSecret)); // POST Auth data to the oauth API string res, token; try { var web = new WebClient(); web.Encoding = Encoding.UTF8; res = web.UploadString(authBaseUrl, postData); } catch (Exception ex) { ErrorMessage = ex.GetBaseException().Message; return null; } var ser = new JavaScriptSerializer(); var auth = ser.Deserialize<BingAuth>(res); if (auth == null) return null; token = auth.access_token; return token; } private class BingAuth { public string token_type { get; set; } public string access_token { get; set; } } This code basically takes the client id and secret and posts it at the oAuth endpoint which returns a JSON string. Here I use the JavaScript serializer to deserialize the JSON into a custom object I created just for deserialization. You can also use JSON.NET and dynamic deserialization if you are already using JSON.NET in your app in which case you don't need the extra type. In my library that houses this component I don't, so I just rely on the built in serializer. The auth method returns a long base64 encoded string which can be used as a bearer token in the translate API call. 
Translation Once you have the authentication token you can use it to pass to the translate API. The auth token is passed as an Authorization header and the value is prefixed with a 'Bearer ' prefix for the string. Here's what the simple Translate API call looks like:/// /// Uses the Bing API service to perform translation /// Bing can translate up to 1000 characters. /// /// Requires that you provide a CLientId and ClientSecret /// or set the configuration values for these two. /// /// More info on setup: /// http://www.west-wind.com/weblog/ /// /// Text to translate /// Two letter culture name /// Two letter culture name /// Pass an access token retrieved with GetBingAuthToken. /// If not passed the default keys from .config file are used if any /// public string TranslateBing(string text, string fromCulture, string toCulture, string accessToken = null) { string serviceUrl = "http://api.microsofttranslator.com/V2/Http.svc/Translate"; if (accessToken == null) { accessToken = GetBingAuthToken(); if (accessToken == null) return null; } string res; try { var web = new WebClient(); web.Headers.Add("Authorization", "Bearer " + accessToken); string ct = "text/plain"; string postData = string.Format("?text={0}&from={1}&to={2}&contentType={3}", HttpUtility.UrlEncode(text), fromCulture, toCulture, HttpUtility.UrlEncode(ct)); web.Encoding = Encoding.UTF8; res = web.DownloadString(serviceUrl + postData); } catch (Exception e) { ErrorMessage = e.GetBaseException().Message; return null; } // result is a single XML Element fragment var doc = new XmlDocument(); doc.LoadXml(res); return doc.DocumentElement.InnerText; } The first of this code deals with ensuring the auth token exists. You can either pass the token into the method manually or let the method automatically retrieve the auth code on its own. In my case I'm using this inside of a Web application and in that situation I simply need to re-authenticate every time as there's no convenient way to manage the lifetime of the auth cookie. The auth token is added as an Authorization HTTP header prefixed with 'Bearer ' and attached to the request. The text to translate, the from and to language codes and a result format are passed on the query string of this HTTP GET request against the Translate API. The translate API returns an XML string which contains a single element with the translated string. Using the Wrapper Methods It should be pretty obvious how to use these two methods but here are a couple of test methods that demonstrate the two usage scenarios:[TestMethod] public void TranslateBingWithAuthTest() { var translate = new TranslationServices(); string clientId = DbResourceConfiguration.Current.BingClientId; string clientSecret = DbResourceConfiguration.Current.BingClientSecret; string auth = translate.GetBingAuthToken(clientId, clientSecret); Assert.IsNotNull(auth); string text = translate.TranslateBing("Hello World we're back home!", "en", "de",auth); Assert.IsNotNull(text, translate.ErrorMessage); Console.WriteLine(text); } [TestMethod] public void TranslateBingIntegratedTest() { var translate = new TranslationServices(); string text = translate.TranslateBing("Hello World we're back home!","en","de"); Assert.IsNotNull(text, translate.ErrorMessage); Console.WriteLine(text); } Other API Methods The Translate API has a number of methods available and this one is the simplest one but probably also the most common one that translates a single string. 
    You can find additional methods for this API here: http://msdn.microsoft.com/en-us/library/ff512419.aspx. Soap and AJAX APIs are also available and documented on MSDN: http://msdn.microsoft.com/en-us/library/dd576287.aspx. These links will be your starting points for calling other methods in this API. Dual Interface: I've talked about my database-driven localization provider here in the past, and it's for this tool that I added the Bing localization support. Basically I have a localization administration form that allows me to translate individual strings right out of the UI, using both the Google and Bing APIs. As you can see in this example, the results from Google and Bing can vary quite a bit - in this case Google is stumped while Bing actually generated a valid translation. At other times it's the other way around - it's pretty useful to see multiple translations at the same time. Here I can choose one of the values and directly embed it into the translated text field. Lost in Translation: There you have it. As I mentioned, once you have all the bureaucratic crap out of the way, calling the APIs is fairly straightforward and reasonably fast, even if you have to call the Auth API for every call. Hopefully this post will help out a few of you trying to navigate the Microsoft bureaucracy, at least until next time Microsoft upends everything and introduces new ways to sign up again. Until then - happy translating… Related Posts: Translation method Source on Github; Translating with Google Translate without Google API Keys; Creating a data-driven ASP.NET Resource Provider. © Rick Strahl, West Wind Technologies, 2005-2013. Posted in Localization, ASP.NET, .NET.
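    For readers not on .NET, the same two calls can be made from any HTTP client. Below is an unofficial Python sketch that mirrors the endpoints and parameters shown above; the client id and secret are placeholders for your registered application's values, and the returned token is only good for about ten minutes.

        # Python sketch of the auth + translate calls described above (requests assumed).
        import requests
        import xml.etree.ElementTree as ET

        AUTH_URL = "https://datamarket.accesscontrol.windows.net/v2/OAuth2-13"
        TRANSLATE_URL = "http://api.microsofttranslator.com/V2/Http.svc/Translate"

        def get_token(client_id, client_secret):
            resp = requests.post(AUTH_URL, data={
                "grant_type": "client_credentials",
                "client_id": client_id,
                "client_secret": client_secret,
                "scope": "http://api.microsofttranslator.com",
            })
            resp.raise_for_status()
            return resp.json()["access_token"]        # bearer token, ~10 minute lifetime

        def translate(text, from_lang, to_lang, token):
            resp = requests.get(
                TRANSLATE_URL,
                params={"text": text, "from": from_lang, "to": to_lang,
                        "contentType": "text/plain"},
                headers={"Authorization": "Bearer " + token},
            )
            resp.raise_for_status()
            return ET.fromstring(resp.content).text   # response is a single XML element

        token = get_token("my-client-id", "my-client-secret")
        print(translate("Hello World we're back home!", "en", "de", token))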

    Read the article

  • Oracle Data Integration Solutions and the Oracle EXADATA Database Machine

    - by João Vilanova
    Oracle's data integration solutions provide a complete, open, and integrated solution for building, deploying, and managing real-time, data-centric architectures in operational and analytical environments. Fully integrated with and optimized for the Oracle Exadata Database Machine, they take data integration to the next level and deliver extreme performance and scalability for all enterprise data movement and transformation needs. Easy to use, open, and standards-based, Oracle's data integration solutions dramatically improve productivity, provide unparalleled efficiency, and lower the cost of ownership. You can watch a video about this subject by clicking on the link below. DIS for EXADATA Video

    Read the article

  • finite state machine used in mario like platform game

    - by juakob
    I don't understand how to use a finite state machine with the entity controlled by the player. For example, I have a Mario-like character (2D platformer) that can jump, run, walk, take damage, swim, etc., so my first thought was to use these actions as states. But what happens when you take damage while running? Or when you are jumping, taking damage, and shooting at the same time? I just want to add functionality (actions) to the player in a clean way (not using ifs for all the actions in the entity update).
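    One common way out is to keep the FSM only for states that are genuinely mutually exclusive (standing, running, jumping, swimming) and to model independent actions such as shooting or taking damage as parallel flags or timers on the entity, rather than multiplying them into combined states. A small engine-agnostic sketch:

        # Movement FSM plus orthogonal action flags (engine-agnostic sketch).
        class MovementFSM:
            TRANSITIONS = {
                "standing": {"run": "running", "jump": "jumping", "swim": "swimming"},
                "running":  {"stop": "standing", "jump": "jumping"},
                "jumping":  {"land": "standing"},
                "swimming": {"exit_water": "standing"},
            }

            def __init__(self):
                self.state = "standing"

            def handle(self, event):
                # Ignore events that are invalid in the current state.
                self.state = self.TRANSITIONS.get(self.state, {}).get(event, self.state)

        class Player:
            def __init__(self):
                self.movement = MovementFSM()
                self.shooting = False          # orthogonal to movement
                self.invulnerable_for = 0.0    # taking damage is a timer, not a state

            def take_damage(self):
                if self.invulnerable_for <= 0:
                    self.invulnerable_for = 1.0

            def update(self, dt):
                self.invulnerable_for = max(0.0, self.invulnerable_for - dt)
                # Per-state physics/animation can dispatch on self.movement.state here.

        p = Player()
        p.movement.handle("jump")
        p.shooting = True
        p.take_damage()
        print(p.movement.state, p.shooting, p.invulnerable_for)   # jumping True 1.0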

    Read the article

  • Using ext4 in VMware machine

    First of all, using a journaling filesystem like NTFS, ext4, XFS, or JFS (not to name all of them) is a very good idea and nowadays practically unthinkable not to do. Linux offers a good variety of journaling filesystems for your system. For years I have been using SGI's XFS and I am pretty confident in the stability, performance, and reliability of the system. In earlier years I had to struggle with incompatibilities between XFS and the boot loader; using an ext2-formatted /boot solved this issue. But, wow, that is ages ago! Lately, I had to set up a fresh Lucid Lynx (Ubuntu 10.04 LTS) system for a change of our internal groupware / messaging system. So I fired up a new virtual machine with an almost standard configuration in VMware Server and ran through our network-based PXE boot and installation procedure. At a certain step in this process, Ubuntu asks you about the partitioning of your hard drive(s). Honestly, I have to say that only out of curiosity I stuck with the "default" suggestion and put my faith and trust in the Ubuntu installation routine... which resulted in an ext4-based root mount point ( / ). The rest of the installation went on without further concerns or worries. Note: I really can't remember why I chose to go away from my favourite... Well, it turned out to be the wrong decision after all. OK, let's continue the story about ext4 in a VMware-based virtual machine. After some hours of installing additional packages and configuring the new system to use LDAP for general authentication and login, I had an "out-of-the-box" usable enterprise messaging system based on Zarafa 6.40 Community Edition, including a proper SSL-based web access interface and the Z-Push extension for ActiveSync with my Nokia mobile. Straightforward and pretty nice for the time spent on the setup. Having priority on other tasks, I just let the system run and didn't pay any further attention to it. Until I ran into an upgrade of "Mail for Exchange" on Symbian OS. My mobile did not bother me at all with the upgrade and everything went smoothly, but trying to re-establish the ActiveSync connection to the Zarafa messaging system resulted in a frustrating situation. So I shifted my focus back to the Linux system and was amazed to find that the root had been remounted read-only due to hard drive failures, or at least errors reported by ext4. Firing up Google only confirmed my concerns: using ext4 for VMware-based virtual machines does not look like a stable and reliable option to me. You might consider reading these external resources: "ext4 fs corruption under VMWare Server 2.01" and "Bug #389555 - ext4 filesystem corruption". Well, I learned my lesson, and ext{2|3|4}-based filesystems are not going to be used on any of my Linux systems or customer installations in the future. Addendum: I did not try this setup in other virtualization environments like VirtualBox, qemu, KVM, Xen, etc.

    Read the article

  • Virtual Machine Storage Provisioning and best practises

    If you're using Virtualization technology, then at some point you'll have run out of (or will run out of) virtual disk space, & had to provision extra storage; are you confident that you know how to do that? Sean Duffy makes sure you're doing it right, sharing his recommendations and tips in this step-by-step guide to Virtual Machine Storage provisioning for VMware. Follow this advice, and you'll be a Virtualization Veteran in no time.

    Read the article

  • Programming Language, Turing Completeness and Turing Machine

    - by Amumu
    A programming language is said to be Turing complete if it can successfully simulate a universal TM. Let's take functional programming languages as an example. In functional programming, functions have the highest priority over everything else; you can pass functions around like any primitives or objects. This is called first-class functions. In functional programming, a function does not produce side effects, i.e. output strings onto the screen or change the state of variables outside its scope. Each function has a copy of its own objects if objects are passed in from the outside, and the copied objects are returned once the function finishes its job. Each function written purely in functional style is completely independent of anything outside of it; thus, the complexity of the overall system is reduced. This is referred to as referential transparency. In functional programming, a function can keep the values of its local variables even after the function exits. This is done by the garbage collector, and the value can be reused the next time the function is called; this is called memoization. A function usually should solve only one thing; it should model only one algorithm to answer a problem. Do you think that a function in a functional language with the above properties simulates a Turing machine? Functions (= algorithms = Turing machines) can be passed around as input and returned as output; a TM also accepts and simulates other TMs. Memoization models the set of states of a Turing machine; the memoized variables can be used to determine the state of a TM (i.e. which lines to execute, what behavior it should take in a given state, ...). Also, you can use memoization to simulate your internal tape storage. In a language like C/C++, when a function exits, you lose all of its internal data (unless you store it elsewhere outside of its scope). The set of symbols is the set of all strings in a programming language, which is the higher-level, human-readable version of machine code (opcodes). The start state is the beginning of the function; with memoization, the start state can be determined by memoization or, if you want, a switch/if-else statement in an imperative programming language. But then, you can't... The final accepting state is when the function returns a value; it rejects if an exception happens. Thus, the function (= algorithm = TM) is decidable; otherwise, it's undecidable. I'm not sure about this. What do you think? Is my thinking correct on all of this? The reason I bring up functions in functional programming is because I think they are closer to the idea of a TM. What experience with other programming languages do you have that gives you a feel for the idea of a TM and the ideas of Computer Science in general? Can you specify how you think?
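    To make the "a function simulating a TM" idea concrete, here is a tiny single-tape Turing machine simulator written as one self-contained Python function; the example machine and its encoding are purely illustrative.

        # Tiny Turing machine simulator; the example machine flips every bit, then halts.
        def run_tm(rules, tape, state="start", blank="_", max_steps=10_000):
            """rules: {(state, symbol): (new_state, write_symbol, move)}, move in {-1, +1}."""
            cells = dict(enumerate(tape))
            head = 0
            for _ in range(max_steps):
                if state == "halt":
                    return "".join(cells[i] for i in sorted(cells))
                symbol = cells.get(head, blank)
                state, cells[head], move = rules[(state, symbol)]
                head += move
            raise RuntimeError("no halt within max_steps")

        flip_bits = {
            ("start", "0"): ("start", "1", +1),
            ("start", "1"): ("start", "0", +1),
            ("start", "_"): ("halt", "_", +1),
        }
        print(run_tm(flip_bits, "10110"))   # -> 01001_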

    Read the article

  • Best way to remote restart Ubuntu from Windows machine

    - by robsoft
    Background: I'm looking to put a series of Ubuntu machines into retail locations, they're being used as dumb kiosks to show a series of slides onto large LCD panel TV screens. Once installed, they won't have a keyboard or mouse connected but will have a fixed IP on the local network. Everything is configured to auto-start, no automatic updates, no power saving etc - I think we're pretty-much good to go apart from one thing. I need the retail staff to be able to restart the boxes if a problem arises. We have VNC running (now that we've turned off desktop enhancements!) so that we can remotely get into the machines if we need to, but that's not something we would allow the retail staff to do. The machines are going to be physically 'out of the way' (probably in the ceiling space) so the power button is not easily accessible!. I'd like to have some means of allowing the retail staff to restart the Ubuntu machine, from the desktop of one of their Windows terminals. I don't really want to give them some kind of raw terminal access (the command line will frighten them!) and I don't want them to use VNC (as stated above). Ideally there would be an icon on the Windows desktop, they double-click it, reply to a simple 'are you sure?' prompt, and then the Ubuntu box is told to restart. The Windows side of that won't be a problem, we can write something using Delphi, Python & Qt4, whatever - it's the Ubuntu side of it I'm stuck with. Out of sight/view, could I have a Windows program open a terminal across the network and tell Ubuntu to restart? Is this what SSH could be used for (I have never set that kind of thing up). The Windows programming side isn't really an issue, it's just that I'm a total Ubuntu noob and don't know where to start from the platform point of view. The other thing we considered is also having the machine automatically restart itself at a set time each day (obviously out of store hours!). To me, that seems a bit unnecessary (though forcing a restart once a week/month might be worthwhile). Any thoughts or suggestions? Being able to restart the box on demand across the network is my prime requirement.
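    SSH is the mechanism usually used for exactly this, and it keeps the terminal out of sight: the Windows-side program just runs one command over an authenticated connection. A hedged sketch using Python and paramiko is below; the host, user, and key path are placeholders, and the kiosk user needs a sudoers rule such as "kiosk ALL=(ALL) NOPASSWD: /sbin/reboot" so no password prompt appears.

        # Remote reboot over SSH with paramiko (pip install paramiko); values are placeholders.
        import paramiko

        def reboot_kiosk(host, user="kiosk", key_file=r"C:\keys\kiosk_id_rsa"):
            client = paramiko.SSHClient()
            client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            client.connect(host, username=user, key_filename=key_file, timeout=10)
            try:
                client.exec_command("sudo /sbin/reboot")
            finally:
                client.close()

        reboot_kiosk("192.168.1.50")

    The same one-liner could equally be issued from Delphi or anything else that can shell out to ssh/plink; the "are you sure?" confirmation stays entirely on the Windows side.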

    Read the article

  • Right-Time Retail Part 1

    - by David Dorf
    This is the first in a three-part series. Right-Time Revolution: Technology enables some amazing feats in retail. I can order flowers for my wife while flying 30,000 feet in the air. I can order my groceries in the subway and have them delivered later that day. I can even see how clothes look on me without setting foot in a store. Who knew that a TV, diamond necklace, or even a car would someday be as easy to purchase as a candy bar? Can technology make a mattress an impulse item? Wake up and your back is hurting, so you roll over and grab your iPad, and a new mattress is delivered the next day. Behind the scenes many processes are being choreographed to make the sale happen. This includes moving data between systems with the least amount of friction, which in some cases is near real-time. But real-time isn't appropriate for all the integrations. Think about what a completely real-time retailer would look like. A consumer grabs toothpaste off the shelf, and all systems are immediately notified so that the backroom clerk comes running out and pushes the consumer aside so he can replace the toothpaste on the shelf. Such a system is not only cost prohibitive, but it's also very inefficient and ineffectual. Retailers must balance the realities of people, processes, and systems to find the right speed of execution. That's what "right-time retail" means. Retailers used to sell during the day and count the money and restock at night, but global expansion and the Web have complicated that simplistic viewpoint. Our 24hr society demands not only access but also speed, which constantly pushes the boundaries of our IT systems. In the last twenty years, there have been three major technology advancements that have moved us closer to real-time systems. Networking is the first technology that drove the real-time trend. As systems became connected, it became easier to move data between them. In retail we no longer had to mail the daily business report back to corporate each day, as the dial-up modem could transfer the data. That was soon replaced with trickle-polling, when sale transactions were occasionally sent from stores to corporate throughout the day, often through VSAT. Then we got terrestrial networks like DSL and Ethernet that allowed a constant stream of data between stores and corporate. When corporate could see the sales transactions coming from stores, it could better plan for replenishment and promotions. That drove the need for speed into the supply chain and merchandising, but for many years those systems were stymied by the huge volumes of data.
Nordstrom has 150 million SKU/Store combinations when planning (RPAS); The Gap generates 110 million price changes during end-of-season (RPM); Argos does 1.78 billion calculations executed each day for replenishment planning (AIP). These areas are now being alleviated by the second technology, storage. The typical laptop disk drive runs at 5,400rpm with PCs stepping up to 7,200rpm and servers hitting 15,000rpm. But the platters can only spin so fast, so to squeeze more performance we’ve had to rely on things like disk striping. Then solid state drives (SSDs) were introduced and prices continue to drop. (Augmenting your harddrive with a SSD is the single best PC upgrade these days.) RAM continues to be expensive, but compressing data in memory has allowed more efficient use. So a few years back, Oracle decided to build a box that incorporated all these advancements to move us closer to real-time. This family of products, often categorized as engineered systems, combines the hardware and software so that they work together to provide better performance. How much better? If Exadata powered a 747, you’d go from New York to Paris in 42 minutes, and it would carry 5,000 passengers. If Exadata powered baseball, games would last only 18 minutes and Boston’s Fenway would hold 370,000 fans. The Exa-family enables processing more data in less time. So with faster networks and storage, that brings us to the third and final ingredient. If we continue to process data in traditional ways, we won’t be able to take advantage of the faster networks and storage. Enter what Harvard calls “The Sexiest Job of the 21st Century” – the data scientist. New technologies like the Hadoop-powered Oracle Big Data Appliance, Oracle Advanced Analytics, and Oracle Endeca Information Discovery change the way in which we organize data. These technologies allow us to extract actionable information from raw data at incredible speeds, often ad-hoc. So the foundation to support the real-time enterprise exists, but how does a retailer begin to take advantage? The most visible way is through real-time marketing, but I’ll save that for part 3 and instead begin with improved integrations for the assets you already have in part 2.

    Read the article

  • Remote Desktop from a ubuntu 13.04 to an Ubuntu 13.04 machine so the user on the second machine can see my movments

    - by user163169
    I would like to remote desktop/VPN from an Ubuntu 13.04 computer (a) to an Ubuntu 13.04 computer (b) so the user(s) on the second machine can see my movements. I would like something a lot like TeamViewer or Join.me, but these machines do not have Internet access; they are attached to a local network. I can VPN to them, but the users cannot see what I am doing, and I need them to be able to see my movements.

    Read the article

  • How to transfer Windows Vista disk image to new machine

    - by Mike Hobbs
    I'm trying to upgrade a user's machine to some better hardware. I know of Easy Transfer, but I'd rather not have to reinstall all the programs that are already present. (Some of which are no longer available, anyway). Instead, I'm trying to transfer the entire disk image from one machine to the other, but I ran into issues. If I copy the partition image over to the new machine using Clonezilla, I get errors on boot saying that I need to insert the Vista install disk and run repair. I do that, but it then says that it is unable to repair whatever it is that's broken. Next, I tried to sysprep the old machine before creating the image, but sysprep fails saying that it encountered some sort of system error. Should it be possible to sysprep any arbitrary machine, or does it only work on a relatively clean install? Could it be a missing driver that is tripping me up? The new machine is a fairly stock desktop that shouldn't need any special drivers beyond what's already present in standard Vista. Are there any foolproof methods for doing this sort of thing?

    Read the article

  • TimeZone Issue during DayLight Saving

    - by Viren
    I have just been bitten by the daylight saving change. As I understand it, EST starts at 01:00:00 on 3rd November 2013. Now, whenever I set my time to 3rd November 2013 00:58:xx (some seconds) and run date, it gives me the valid time zone, i.e. EDT. But even after the time passes 01:00:00 and I query the date library again, I still see the time zone as EDT and not EST; have a look at the screenshot. You can clearly see the time zone shown as EDT even when it should be EST. Does anyone have a clue about this? Update: there is one other finding: if I restart my machine, I see this. More update: before restart / after restart.
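    For reference, the expected behaviour across the 2013-11-03 fall-back can be reproduced in Python (3.9+, zoneinfo), assuming the machine's zone is America/New_York; a process that cached the UTC offset when it started would keep reporting EDT, which would be consistent with a restart fixing it.

        # Expected zone abbreviation across the 2013-11-03 US DST change (zoneinfo, Python 3.9+).
        from datetime import datetime, timedelta
        from zoneinfo import ZoneInfo

        tz = ZoneInfo("America/New_York")
        before = datetime(2013, 11, 3, 0, 58, tzinfo=tz)
        after = before + timedelta(hours=2)        # wall clock 02:58, past the change

        print(before.strftime("%H:%M %Z %z"))      # 00:58 EDT -0400
        print(after.strftime("%H:%M %Z %z"))       # 02:58 EST -0500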

    Read the article
