Search Results

Search found 19365 results on 775 pages for 'machine vision'.


  • Copy/Paste (Clipboard like) functionality from a VNC desktop

    - by goldenmean
    I use TightVNC and RealVNC to access remote Linux (CentOS, Ubuntu) desktops by running a vncserver on the remote machine. When I connect to those servers from my Windows host, can I use copy/paste with the VNC client desktop window? That is, copy some text in a terminal open inside the VNC client desktop window and paste it into a text file on my local Windows machine, and vice versa? I checked the TightVNC options but did not see anything. Can it be done by a) running vncserver on the remote machine with some options, or b) running some configuration on the remote machine to enable this? How could I get it done? Also, is this kind of copy/paste functionality possible in a Microsoft Windows Remote Desktop connection/Terminal Services session?
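
    One approach worth trying (a hedged sketch, not something from the post): with an Xvnc-style server such as TigerVNC or RealVNC's vnc4server, clipboard transfer is usually handled by a small helper started inside the remote session, for example:

        # inside the remote VNC session, e.g. at the end of ~/.vnc/xstartup
        vncconfig -nowin &      # syncs the X selections with the viewer clipboard (Xvnc/TigerVNC/RealVNC)
        # or, alternatively, keep the CLIPBOARD and PRIMARY selections in sync with
        autocutsel -fork

    For Remote Desktop, clipboard sharing is a client-side option (in mstsc, under Local Resources, tick Clipboard).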

    Read the article

  • No endpoint listening at.........

    - by Michael Stephenson
    I was having some very frustrating behaviour on our build server, and while I found a number of articles online with similar error messages, none of them helped me. I thought I would just explain this here in case it helps me or anyone else in future. The error message we were getting is: There was no endpoint listening at http://localhostStubs.ExternalApplication/SampleService.svc that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details.

    Our scenario is as follows: we have a solution where a WCF service application hosting the WCF routing service is listening to the Windows Azure Service Bus Relay. We have an acceptance test project in the solution which sends a message to the service bus, which is then received by the WCF routing service and routed to SampleService.svc, hosted in another IIS application on the same box. A response is flowed back through to the test. In the tests there are 5 scenarios simulating a successful message and various error conditions. On my developer machine it was working absolutely fine every time, and a clean build on my developer machine worked fine. On the build server, however, one or more of the tests would fail each time with the above error message. There didn't seem to be any pattern to which test would fail. The solution was building on a Windows 2008 R2 machine with IIS 7 and AppFabric Server installed, with auto-start configured for the IIS application which would be listening to the service bus.

    After lots of searching online and looking at logs, it turned out to have a simple solution: just restart the WAS service (Windows Process Activation Service) and the services it advises you to restart with it. Hope this helps someone else.
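
    For reference, a hedged sketch of what that restart looks like from an elevated command prompt (these are the standard Windows service names, not something quoted in the original post); stopping WAS takes W3SVC down with it, and starting W3SVC brings WAS back up as a dependency:

        net stop was /y
        net start w3svc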

    Read the article

  • SQL CLR not properly enabling

    - by dnolan
    We have a SQL Server running SQL 2005 Workgroup 64-bit (9.0.4273) on Windows Server 2003 64-bit. We have run sp_configure and reconfigured the server, which indicates that the CLR is now enabled:

        exec sp_configure 'clr enabled', '1'
        go
        reconfigure
        go

    However, when trying to call CREATE ASSEMBLY the server completely dies on us and we have to do a full reboot of the machine. A little more diagnostic information: even though clr enabled is set to 1 and we have rebooted the full server, running select * from sys.dm_clr_properties returns

        directory
        version
        state       locked CLR version with mscoree

    which is what it says when the CLR is not enabled on another machine. On a correctly enabled machine (after a reboot) the same query reads

        directory   C:\Windows\Microsoft.NET\Framework64\v2.0.50727\
        version     v2.0.50727
        state       CLR is initialized

    Read the article

  • Configure static IP with port number which will point to multiple projects on different ports

    - by Yogesh Kadam
    I am developing a project on a LAMP stack using the Symfony framework. I have one static IP like 99.99.99.99:8000 which points to my Linux server machine; the static IP already carries port number 8000. This Linux server machine has multiple projects hosted on it, and we access each project on the LAN with a different port number, like abc:81, pqr:82, xyz:83. Is it possible to access each project on the same Linux machine through this one static IP? If yes, please let me know how to configure and call each project using this IP address.
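
    One common way to do this (an assumption on my part, it is not described in the post) is to run a reverse proxy on the port the static IP exposes and route by path. A minimal Apache sketch, with the paths and backend ports as placeholders:

        # requires mod_proxy and mod_proxy_http: a2enmod proxy proxy_http
        <VirtualHost *:8000>
            ProxyPreserveHost On
            ProxyPass        /abc/ http://localhost:81/
            ProxyPassReverse /abc/ http://localhost:81/
            ProxyPass        /pqr/ http://localhost:82/
            ProxyPassReverse /pqr/ http://localhost:82/
        </VirtualHost>

    Each project would then be reachable as http://99.99.99.99:8000/abc/, http://99.99.99.99:8000/pqr/ and so on.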

    Read the article

  • Expired Windows Server 2008 trial remote activation

    - by Garry Harthill
    I have a remote server with an expired license. I can't currently add the new license because of a licensing server error, so I need to reboot the machine. My question is: will I be able to remote desktop onto the machine and apply the new license, or do I have to be physically at the server? I know an activation wizard usually appears and doesn't let you go any further, but does this same wizard appear over Remote Desktop? I don't really want to reboot the machine only for it to not come up again.
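
    Not from the post, but one way to sidestep the activation wizard entirely is to apply the key from an elevated command prompt inside the RDP session, assuming the built-in slmgr script on Server 2008:

        rem the key below is a placeholder; /ipk installs the product key, /ato activates online
        cscript //nologo %windir%\system32\slmgr.vbs /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
        cscript //nologo %windir%\system32\slmgr.vbs /ato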

    Read the article

  • How to change the screen resolution in VNC viewer for Ubuntu 12.04 without a monitor?

    - by user325320
    I have Ubuntu 12.04 installed on a machine and I always use it remotely through VNC. When I have a monitor connected to this machine, I can change the resolution of my VNC viewer with the following line:

        $ vnc4server --geometry 1440x900

    This worked for me, but since I always use this machine remotely I unplugged the monitor and rebooted, and the above command line no longer works. Then I tried xrandr:

        SZ:    Pixels           Physical            Refresh
        *0     1024 x 768       ( 260mm x 195mm )   *60
        Current rotation - normal
        Current reflection - none
        Rotations possible - normal
        Reflections possible - none

    There is only one mode available, so I tried to add a new one:

        $ cvt 1440 900
        # 1440x900 59.89 Hz (CVT 1.30MA) hsync: 55.93 kHz; pclk: 106.50 MHz
        Modeline "1440x900_60.00"  106.50  1440 1528 1672 1904  900 903 909 934 -hsync +vsync
        $ xrandr --newmode "1440x900_60.00"  106.50  1440 1528 1672 1904  900 903 909 934 -hsync +vsync
        $ xrandr --addmode S2 "1440x900_60.00"

    Then I checked with xrandr again and could not see the new mode added. I tried to execute the following command and got an error saying my RandR is too old:

        $ xrandr --output S2 --mode 1440x900_60.00
        xrandr: Server RandR version before 1.2

    But this does not make sense to me: if I plug the monitor back in and run the xrandr command, it works again! It seems that Ubuntu must be connected to a real monitor before I can change the resolution in my VNC viewer. Can anyone help?

    UPDATE: Finally I solved this problem by changing to tightvncserver:

        $ tightvncserver -geometry 1440x900

    works for me. Thanks to everyone who answered my question.

    Read the article

  • Receiving Event ID: 10107, Hyper-V -VMMS

    - by Stargaten
    We are using physical disks on two of the guest operating systems. Is this a known issue? Do we need to have DPM 2010? "One or more physical disks are attached to virtual machine 'Myserver'. Back up programs that use the Hyper-V VSS writer cannot back up volumes that are attached to virtual machines as physical disks. To avoid potential data loss, use another method to back up the data on the physical disks. If you restore the data on this virtual machine, make sure to check the data of the physical disk for integrity. (Virtual machine ID 8EF3C0CB-967D-4D67-B4D8-7B782C7AC07C)"

    Read the article

  • PXE boot very slow when PXE server is virtualbox

    - by sqrtsben
    As I read in questions here and on the Internet, PXE and Virtualbox don't seem to like each other too much. My problem is the following: I have a virtualized machine which hosts the DHCP and PXE server for 10 native clients. They are rebooted roughly every 10 mins and on each reboot, they need to boot a small linux (the initrd is ~4MB). Before, I had a native machine running and booting via PXE was very fast. Now, looking at the output of nload, I only get 500kbit/s whenever one machine is booting. The machines are connected via a GBit switch, so that can't be it. Also, when testing the connection speed to the outside, I have the full bandwidth available. Is VBox just unable to deal with large amounts of UDP packets? Can anyone point me in the right direction here?
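
    One thing that might be worth checking (purely an assumption, not something from the post) is the emulated NIC VirtualBox presents to the PXE/TFTP VM; switching it to the paravirtualised virtio adapter sometimes helps with packet-heavy UDP workloads:

        # run on the VirtualBox host while the VM is powered off; "pxe-server" is a placeholder VM name
        VBoxManage modifyvm "pxe-server" --nictype1 virtio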

    Read the article

  • How can I make it difficult to install a new operating system on a certain computer?

    - by D W
    I want to host a website on a desktop computer running Ubuntu with a Windows virtual machine. I will give away the computer in exchange for a number of months of remote web hosting. I want to add some kind of lock (hardware or otherwise) so that the end users will have difficulty just reinstalling Windows and using the machine as they want, in contradiction to the contract. Ideally, I'd want the machine to die if reinstallation of the OS is attempted. It doesn't have to be completely insurmountable, but it has to be difficult enough to prevent casual reinstallation. Perhaps on bootup the system can check whether certain files exist on the computer and refuse to boot if they do not. I don't know if this is possible, but maybe BIOS is password protected, and searches for files before boot up. The files it looks for could be date sensitive, i.e. require remote replacement on a schedule.

    Read the article

  • Snippets between desktop and laptop

    - by Jamie F
    The Situation: At work, I have a nice beefy desktop running Windows Server 2008 R2 (SharePoint dev machine). My handy ThinkPad is right next to it. Every once in a while I'd like to cut and paste or share something (usually text) between the machines: for example, I might be headed out and I'd like to send the URL I'm reading from the desktop to the laptop. Of course I can create a share or use the admin shares and create files to get stuff back and forth, but that seems heavyweight for what I'm thinking of. I'm thinking more along the lines of sending myself an IM. How do you get little things from machine to machine? Keep a shared folder pinned to the taskbar? Send an email to yourself? Bookmark sync? While I'm at it, I'm looking for a decent multiple clipboard handler: maybe these two functions are combined in some nice little utility? I suspect I'm missing something simple here... Thanks... Jamie F.

    Read the article

  • Sharing an Apache configuration between testing vs. production

    - by Kevin Reid
    I have a personal web site with a slightly nontrivial Apache configuration. I test changes on my personal machine before uploading them to the server. The path to the files on disk and the root URL of the site are of course different between the test and production conditions, and they occur many places in the configuration (especially <Directory blocks for special locations which have scripts or no directory listing or ...). What is the best way to share the common elements of the configuration, to make sure that my production environment matches my test environment as closely as possible? What I've thought of is to use SetEnv to store the paths for the current machine in environment variables, then Include a common configuration file with ${} everywhere there's something machine specific. Any hazards of this method?
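
    A hedged sketch of that idea, assuming Apache 2.4, where Define (rather than SetEnv, which only affects the environment handed to CGI scripts at request time) is what drives ${...} interpolation in the configuration itself; the file names and paths are placeholders:

        # machine-specific file, e.g. conf/local-test.conf on the laptop vs conf/local-prod.conf on the server
        Define SITE_ROOT "/home/kevin/www/site"
        Define SITE_URL  "/site"

        # shared file included from both machines
        Include conf/site-common.conf

        # inside conf/site-common.conf, something like:
        #   <Directory "${SITE_ROOT}/cgi-bin">
        #       Options +ExecCGI
        #   </Directory>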

    Read the article

  • rdesktop over ssh

    - by Dan
    In Ubuntu, I'm trying to log into my friend's windows machine using rdesktop. First, I can log into his outward facing linux box using ssh. Then from there I can log into his linux host machine using ssh. This host machine is running Windows XP inside virtualbox. Is there a way for me to tunnel rdesktop through these two ssh connections (may just need the first connection to the outward facing linux box just to get inside the network, depending on how virtualbox's network connection is set up). Thanks
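
    A hedged sketch of one way to chain the tunnels (host names, user names and the VM's IP are placeholders, and it assumes the inner Linux host can reach the XP VM's RDP port, e.g. via a bridged or host-only interface):

        # step 1, on your Ubuntu machine: local port 3390 -> port 3390 on the outward-facing box
        ssh -L 3390:localhost:3390 user@outward-box

        # step 2, inside that session on the outward box: forward on to RDP (3389) on the XP VM via the inner host
        ssh -L 3390:XP_VM_IP:3389 user@inner-host -N

        # step 3, back on your Ubuntu machine, in another terminal
        rdesktop localhost:3390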

    Read the article

  • RDP and New Accounts

    - by leeand00
    I created a new user account on the domain and added it to the Remote Desktop Users group. I could log in just fine locally, but when I logged in remotely I was basically told that I could not log in from there using that user. I could log in just fine as the administrator or anybody else other than that new account. So I researched it a bit more, checked what the setting looked like on the local machine, and changed it to "Allow connections only from computers running Remote Desktop with Network Level Authentication (NLA)". When I tried this down at my office I connected with RDP just fine from another computer. But lo and behold, when I got home and simply tried to connect to the machine, I got an error message. There has to be some kind of in-between setting, or an additional setting on the user, that allows me to connect directly via Remote Desktop over the VPN. At the moment I can connect by connecting to another computer on the network and then RDPing from there into my machine, but this is not ideal.

    Read the article

  • OSX 10.8 Corrupted User Account Using Launchctl

    - by Scott
    I used the following command: launchctl unload -w /System/Library/LaunchAgents/com.apple.notificationcenterui.plist in an attempt to disable Notification Center. I'm not sure that I got all of the commands right, and I appear to have corrupted the account that I executed it from: I get a grey screen when I try to log in to that account. Fortunately I have another account on the machine with admin privileges, so I can still use the machine. I would however like to restore the account to a working condition, preferably without having to resort to a complete system restore from my Time Machine backup. Is there a way of diagnosing the current status of this LaunchAgent and returning it to its original state?
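
    A hedged starting point (standard launchctl usage rather than anything confirmed for this situation): from a terminal session as the affected user, check whether launchd still lists the agent and then re-enable it with the inverse of the original command, since load -w clears the disabled flag that unload -w set:

        launchctl list | grep -i notificationcenter
        launchctl load -w /System/Library/LaunchAgents/com.apple.notificationcenterui.plist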

    Read the article

  • Ubuntu VM Guest - Samba Service Not Accessible from VM Host via Hostname

    - by phalacee
    I have a Windows 7 workstation with an Ubuntu 10.10 VM running in VirtualBox 3.2.12 r68302. I recently updated Samba and winbind, and since the update I am unable to access the machine via its hostname (\\mystique) from the VM host. I can access it by the "Host-only" IP (\\192.168.56.101) and the DHCP-assigned IP address (\\10.1.1.20), and I can connect to the web server on the machine via its hostname (http://mystique/). As stated, accessing this machine via its hostname worked fine prior to the update, but has since stopped working. I have added the hostname to smb.conf as the netbios name, to no avail. My smb.conf [global] section looks like this:

        [global]
           workgroup = NETWORK
           netbios name = Mystique
           server string = %h server (Samba, Ubuntu)
           dns proxy = no
           log file = /var/log/samba/log.%m
           max log size = 1000
           syslog = 0
           panic action = /usr/share/samba/panic-action %d
           encrypt passwords = true
           passdb backend = tdbsam
           obey pam restrictions = yes
           unix password sync = yes
           passwd program = /usr/bin/passwd %u
           passwd chat = *Enter\snew\s*\spassword:* %n\n *Retype\snew\s*\spassword:* %n\n *password\supdated\ssuccessfully* .
           pam password change = yes
           map to guest = bad user
           usershare allow guests = yes
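
    Since NetBIOS name resolution is the piece that apparently broke, a couple of standard Samba diagnostics may help narrow it down (a hedged suggestion, not from the post):

        # on the Ubuntu guest: check the effective config and that nmbd is actually running
        testparm -s
        pgrep -l nmbd

        # from the Windows 7 host: query the NetBIOS name directly
        nbtstat -a mystique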

    Read the article

  • Windows Azure Evolution - Web Sites (aka Antares) Part 1

    - by Shaun
    This is the 3rd post of my Windows Azure Evolution series, focusing on the new features and enhancements that arrived along with the Windows Azure Platform Upgrade June 2012, announced at the MEET Windows Azure event on 7th June. In the first post I introduced the new preview developer portal and how it works for the existing features such as cloud services, storage and SQL databases. In the second one I talked about the Windows Azure .NET SDK 1.7 on the latest Visual Studio 2012 RC on Windows 8. From this one I will begin to introduce some new features. Now let's have a look at the first of them, Windows Azure Web Sites.

    Overview

    Windows Azure Web Sites (WAWS), also known as Antares, is a new feature still in preview stage in this upgrade. It allows people to quickly and easily deploy websites to a highly scalable cloud environment, using the languages and open source apps of their choice, and deploying through FTP, Git or TFS. It can also be integrated easily with Windows Azure services like SQL Database, Caching, CDN and Storage. After reading its introduction we may have a question: since we can deploy a website from both a cloud service web role and a web site, what's the difference between them? So, let's have a quick comparison.

                              CLOUD SERVICE                                   WEB SITE
        OS                    Windows Server                                  Windows Server
        Virtualization        Windows Azure Virtual Machine                   Windows Azure Virtual Machine
        Host                  IIS                                             IIS
        Platform              ASP.NET WebForm, ASP.NET MVC, WCF               ASP.NET WebForm, ASP.NET MVC, PHP
        Language              C#, VB.NET                                      C#, VB.NET, PHP
        Database              SQL Database                                    SQL Database, MySQL
        Architecture          Multi-layered, background worker,               Simple website with backend database
                              message queuing, etc.
        VS Project            Windows Azure Cloud Service                     ASP.NET Web Form, ASP.NET MVC, etc.
        Out-of-box Gallery    (none)                                          Drupal, DotNetNuke, WordPress, etc.
        Deployment            Package upload, Visual Studio publish           FTP, Git, TFS, WebMatrix
        Compute Mode          Dedicated VM                                    Shared across VMs, dedicated VM
        Scale                 Scale up, scale out                             Scale up, scale out

    As you can see, there are many differences between the cloud service and the web site, but the main point is that the cloud service focuses on complex, multi-tier web applications. For example, if you want to build a website with a frontend layer, middle business layer and data access layer, with some background worker processes connected through a message queue, then you should use a cloud service, since it provides full control of your code and application. But if you just want to build a personal blog or a business portal, then you can use a web site. Since web sites have many galleries, you can create them without any coding or configuration. David Pallmann has an awesome figure explaining the differences between the cloud service, web site and virtual machine.

    Create a Personal Blog in Web Site from Gallery

    As I mentioned above, one of the big features in WAWS is building a website from an existing gallery, which means we don't need to code or configure anything. What we need to do is open the Windows Azure developer portal, click the NEW button, and select WEB SITE and FROM GALLERY. In the pop-up window there are many websites we can choose to use. For example, for a personal blog there are Orchard CMS and WordPress; for CMS there are DotNetNuke, Drupal 7 and mojoPortal. Let's select WordPress and click the next button. The next step is to configure the web site. We will need to specify the DNS name and select the subscription and region. Since WordPress uses MySQL as its backend database, we also need to create a MySQL database as well.
    Windows Azure Web Sites uses ClearDB to host the MySQL databases; you cannot create a MySQL database directly from the SQL Databases section. Finally, since we chose to create a new MySQL database, we need to specify the database name and region in the last step, and accept ClearDB's terms as well. Then the Windows Azure platform will download the WordPress code, deploy the MySQL database and the website, and it will be ready to use. Select the website and click the BROWSE button, and the WordPress administration page will be shown. After configuring WordPress, here is my personal blog on the cloud. It took me no more than 10 minutes to establish without any coding.

    Monitor, Configure, Scale and Linked Resources

    Let's click into the website I had just created in the portal and have a look at what we can do. In the website details page there are five sections.

    - Dashboard: the overall information about this website, such as the basic usage status, public URL, compute mode, FTP address, subscription, and links where we can specify the deployment credentials, TFS and Git publish settings, etc.

    - Monitor: status information such as CPU usage, memory usage, errors, etc. We can add more metrics by clicking the ADD METRICS button at the bottom as well.

    - Configure: here we can set the configuration of our website, such as the .NET and PHP runtime versions, diagnostics settings, application settings and the IIS default documents.

    - Scale: this is something interesting. In WAWS there are two compute modes, also called web site modes. One is "shared", which means our website will be shared with other web sites in a group of Windows Azure virtual machines. Each web site has its own process (w3wp.exe) with some sandbox technology to isolate it from the others. When we scale out our web site in shared mode, we actually increase the worker process count. Hence in shared mode we cannot specify the virtual machine size, since the machines are shared across all web sites. This is a little different from the scaling model of the cloud service (hosted service web role and worker role). The other mode is called "dedicated", which means our web site will use the whole Windows Azure virtual machine. This is the same hosting behavior as a cloud service web role: a web role is deployed on the virtual machines we specified and all of them are used only by us. In web sites dedicated mode it's the same. In this mode, when we scale out our web site we use more virtual machines, and each of them will only host our own website. And we can specify the virtual machine size in this mode. In the developer portal we can select which mode we are using from the Scale section. In shared mode we can only specify the instance count, but in dedicated mode we can specify the instance size as well as the instance count.

    - Linked Resources: the MySQL database created along with our WordPress web site is a linked resource. We can add more linked resources in this section.

    Pricing

    For the web site itself, since this feature is in the preview period, if you are using shared mode you get up to 10 web sites for free. But if you are using dedicated mode, the price is that of the virtual machines you are using. For example, if you are using dedicated mode and configured two medium-size virtual machines, you will pay $230.40 per month. If there is a SQL Database linked to your web site, it will be charged separately based on the Pay-As-You-Go price.
    For example, a 1 GB web edition database costs $9.99 per month. The bandwidth will be charged as well; for example, 10 GB of outbound data transfer costs $1.20 per month. For more information about pricing, please have a look at the Windows Azure pricing page.

    Summary

    Windows Azure Web Sites gives us an easier and quicker way to create, develop and deploy websites to the Windows Azure platform. Compared with the cloud service web role, WAWS has many out-of-box galleries we can use directly. So if you just want to build a blog, CMS or business portal, you don't need to learn ASP.NET, you don't need to learn how to configure DotNetNuke, and you don't need to learn how to prepare PHP and MySQL. By using the WAWS gallery you can establish a website within 10 minutes without a single line of code. But in some cases we do need to write code ourselves: we may need to tweak the layout of our pages, or we may have a traditional ASP.NET or PHP web application which needs to be migrated to the cloud. Besides the gallery, WAWS also provides many features to download and upload code. It also integrates with version control services such as TFS and Git, and it provides deployment through FTP and Web Deploy. In the next post I will demonstrate how to use WebMatrix to download and modify the website, and how to use TFS and Git to deploy automatically once our code changes are committed.

    Hope this helps, Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • How to prevent recursive windows when connecting to vncserver on localhost

    - by blog.adaptivesoftware.biz
    I have a VNC server (vino) configured on my Ubuntu 8.10 box. I would like to connect to this server from a VNC client running on this same machine (the reason for doing this strange thing is mentioned below). Understandably, when I connect to a vncserver on the same box, my VNC client shows recursive windows. Is there a way I can connect to the vncserver on the same machine and not have the recursive windows problem? Perhaps if I could start the vncserver on one display and the client on another display, would it work? How can I do something like this? Note - reason for running the VNC client and server on the same machine: when I start our Java Swing unit test suite, a bunch of Swing UIs are created and destroyed as the tests run. These windows fly into the foreground, making it impossible to work while the test suite is running. I am hoping to start the test suite inside a VNC session so that I can continue working while the tests run.
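
    A hedged sketch of the second-display idea; it swaps vino for a standalone Xvnc-style server such as vnc4server or tightvncserver, and "ant test" stands in for whatever actually launches the suite:

        # start a virtual desktop on display :1; it never touches the physical screen, so no recursion
        vncserver :1 -geometry 1280x1024

        # run the Swing test suite against that display
        DISPLAY=:1 ant test

        # optionally watch it from the real desktop
        vncviewer localhost:1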

    Read the article

  • Which browser does my computer use to open a Web page? [on hold]

    - by msh210
    I know little about networking and the Internet, but, from what I understand, it works, very approximately, as follows: I, sitting at the computer example.com, send a message saying, roughly, "get http://s.tk" to my ISP, which passes the message along, eventually to the machine at s.tk. The s.tk machine gets "example.com has sent 'get http://s.tk'", so it sends some file to its ISP, which passes the file along, eventually back to the machine at example.com. When the file gets back to example.com, my computer, how does my computer know what to do with it? I'm sure the headers (or something else) indicate it's a Web page rather than, say, a Usenet post; that's not my question. My question is: how does it know whether to display the Web page in my open Opera window, or my open Firefox window, or my other open Firefox window, or, heck, to open a new browser instance?

    Read the article

  • How to prevent Ubuntu from combining networks on 2 NIC server?

    - by SolarPower
    I've got an Ubuntu Server 10.10 box with 2 network interfaces, with a cable plugged into each going to switches on completely different networks with different routers. One network is the 10.1.10.X network with a separate gateway/router; the server has an IP of 10.1.10.50 with the gateway IP of 10.1.10.1. The other interface is on 10.2.10.X, IP 10.2.10.50, gateway 10.2.10.1. All my Mac machines are on the 10.2.10.X network, and all servers are on 10.1.10.X. The ONLY connection between the two is this machine. From a Mac in my office, I CANNOT ping any computer on the 10.1.10.X network except the Ubuntu machine I'm talking about. However, under the Shared section in Finder, I can see every server on the other network listed. That makes me believe that somehow this Ubuntu machine is letting certain requests span both networks, which is a security problem. Hope this is enough info.
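
    Two hedged things to check (assumptions on my part, not from the post): whether the box is actually routing packets between the two interfaces, and whether it is merely advertising the other network's hosts via mDNS/NetBIOS, which would explain the Finder listings without real connectivity:

        # is kernel IP forwarding enabled? 1 means the box will route between the two NICs
        sysctl net.ipv4.ip_forward

        # turn it off for now; add net.ipv4.ip_forward=0 to /etc/sysctl.conf to make it persistent
        sudo sysctl -w net.ipv4.ip_forward=0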

    Read the article

  • Set Windows 7 Default Login to a Non Domain Account

    - by Joe Taylor
    We have 12 laptop PCs that we have upgraded from Windows XP to Windows 7. The laptops are used by staff on away days. They log on to a local account on the machine, say User1, with no password. On the Windows XP login screen there was a drop-down menu allowing them to log on to the local machine. However, in Windows 7 there is no such box, and it is confusing staff. Windows 7 tries to log into the domain by default; it doesn't seem to remember where the user last logged in. Is there a way to set Windows 7 to log on to the local machine by default instead of the domain? I do not want the staff to have to type, for example, stafflaptop1\User1 when they log on.
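
    Two hedged options, neither of which comes from the post: staff can always type .\User1, since the leading dot is built-in shorthand for "this machine", and the default logon domain can reportedly be pre-set through the Winlogon registry key, for example:

        rem run once on each laptop; %COMPUTERNAME% makes the local machine the default logon domain
        reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultDomainName /t REG_SZ /d %COMPUTERNAME% /f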

    Read the article

  • Upgrade MySQL on Plesk on Windows

    - by Cyril Gupta
    I just got a nasty surprise when I installed a website in Unicode Hindi (an Indian language) on a server: all freshly entered Unicode data is turning into question marks on the server. On my dev machine it works perfectly. I found that I have MySQL version 5.0.45 on the server (installed by default by Plesk, I guess), while on my dev machine I have version 5.1.33. I believe the problem could be due to the version difference; the new version of MySQL apparently has better support for Unicode than the older one. I want to upgrade MySQL on my Windows Server machine with Plesk installed on it. I am reluctant to just install the new version using the MySQL installer, because Plesk maintains some custom settings for MySQL and I am afraid the new version could change those settings and break my DB. Can anyone tell me whether I have to do anything special to install MySQL on Plesk on Windows, or can I just use the new version's installer?
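
    Before upgrading, it may be worth ruling out a character-set mismatch, which causes question marks far more often than the server version does (a hedged suggestion, not from the post); the connection, database and table character sets should all be utf8:

        mysql -u USER -p -e "SHOW VARIABLES LIKE 'character_set%'; SHOW VARIABLES LIKE 'collation%';"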

    Read the article

  • How can I connect a Windows 8 PC to a Samba domain

    - by Paul
    I am using Samba 3 and want to join my Windows 8 PC to the Samba domain. Windows 8 cannot join out of the box, so I added the following registry entries under HKLM\SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters:

        DWORD DomainCompatibilityMode = 1
        DWORD DNSNameResolutionRequired = 0

    Now it talks to the Samba server OK, however I get an error, and I notice that the machine account created on the Samba server does not match the PC's name:

        win-8jq3fg1n74e$:x:30003:30003:Machine:/var/lib/nobody:/bin/false

    It is like it is using an internal name. The following is the error in smb.log:

        [2012/10/21 14:26:16.099520, 0] passdb/pdb_interface.c:348(pdb_default_create_user)
          _samr_create_user: Running the command `/usr/sbin/useradd -c Machine -d /var/lib/nobody -s /bin/false win-8jq3fg1n74e$' gave 9
        [2012/10/21 14:26:28.143224, 0] lib/util_sock.c:474(read_fd_with_timeout)
        [2012/10/21 14:26:28.143420, 0] lib/util_sock.c:1441(get_peer_addr_internal)
          getpeername failed. Error was Transport endpoint is not connected
          read_fd_with_timeout: client 0.0.0.0 read error = Connection reset by peer.

    Read the article

  • How to get Windows 7 to automatically connect to an ad-hoc network?

    - by George Edison
    I have two machines - one is running Ubuntu 12.04 64-bit and the other is running Windows 7 Starter Edition (32-bit). The Ubuntu machine is connected to the Internet via the eth0 interface. That machine also has a wireless network interface (wlan0) that is currently functioning as an ad-hoc network. I can connect to the ad-hoc network just fine with the Windows machine but each time I wish to do so, I must manually initiate the connection and enter the password. Is there some way to instruct Windows to automatically connect to this network (an option I have for standard wireless networks but not ad-hoc networks)?
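
    Windows 7 does not expose an auto-connect option for ad-hoc profiles, but a hedged workaround is to trigger the connection with netsh from a Startup shortcut or scheduled task (the profile name is a placeholder; the profile must already have been saved once with its security key):

        netsh wlan connect name="MyAdHocNetwork"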

    Read the article

  • Microsoft's new technical computing initiative

    - by Randy Walker
    I made a mental note from earlier in the year: Microsoft literally buys computers by the truckload. From what I understand, it's a typical practice amongst large software vendors. You plug a few wires in, you test it, and you instantly have mega tera tera flops (don't hold me to that number). Microsoft has been trying to plug away at their cloud services (named Azure), which, for the layman, means Microsoft runs your software on their computers, and as demand increases you can allocate more computing power on the fly.

    With this in mind, it doesn't surprise me that I was recently sent an executive email concerning Microsoft's new technical computing initiative. I find it to be a great marketing idea with actual substance behind their real work. From the programmer/academic perspective, in college we dreamed about this type of processing power. This has decades of computer science theory behind it.

    A copy of the email received (note that I almost deleted this email, thinking it was spam due to its length):

    We don't often think about how complex life really is. Take the relatively simple task of commuting to and from work: it is, in fact, a complicated interplay of variables such as weather, train delays, accidents, traffic patterns, road construction, etc. You can, however, take steps to shorten your commute - using a good, predictive understanding of a few of these variables. In fact, you probably are already taking these inputs and instinctively building a predictive model that you act on daily to get to your destination more quickly.

    Now, when we apply the same method to very complex tasks, this modeling approach becomes much more challenging. Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland. To make sense of issues like these, researchers, engineers and analysts create computer models of the almost infinite number of possible interactions in complex systems. But they need increasingly more sophisticated computer models to better understand how the world behaves and to make fact-based predictions about the future. And to do this, it requires a tremendous amount of computing power to process and examine the massive data deluge from cameras, digital sensors and precision instruments of all kinds. This is the key to creating more accurate and realistic models that expose the hidden meaning of data, which gives us the kind of insight we need to solve a myriad of challenges.

    We have made great strides in our ability to build these kinds of computer models, and yet they are still too difficult, expensive and time consuming to manage. Today, even the most complicated data-rich simulations cannot fully capture all of the intricacies and dependencies of the systems they are trying to model. That is why, across the scientific and engineering world, it is so hard to say with any certainty when or where the next volcano will erupt and what flight patterns it might affect, or to more accurately predict something like a global flu pandemic. So far, we just cannot collect, correlate and compute enough data to create an accurate forecast of the real world. But this is about to change. Innovations in technology are transforming our ability to measure, monitor and model how the world behaves.
The implication for scientific research is profound, and it will transform the way we tackle global challenges like health care and climate change. It will also have a huge impact on engineering and business, delivering breakthroughs that could lead to the creation of new products, new businesses and even new industries. Because you are a subscriber to executive e-mails from Microsoft, I want you to be the first to know about a new effort focused specifically on empowering millions of the world's smartest problem solvers. Today, I am happy to introduce Microsoft's Technical Computing initiative. Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science at www.modelingtheworld.com to discuss trends, challenges and shared opportunities. New advances provide the foundation for tools and applications that will make technical computing more affordable and accessible where mathematical and computational principles are applied to solve practical problems. One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run, will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems. Our technical computing initiative reflects the best of Microsoft's heritage. Ever since Bill Gates articulated the then far-fetched vision of "a computer on every desktop" in the early 1980's, Microsoft has been at the forefront of expanding the power and reach of computing to benefit the world. As someone who worked closely with Bill for many years at Microsoft, I am happy to share with you that the passion behind that vision is fully alive at Microsoft and is carried out in the creation of our new Technical Computing group. Enabling more people to make better predictions We have seen the impact of making greater computing power more available firsthand through our investments in high performance computing (HPC) over the past five years. Scientists, engineers and analysts in organizations of all sizes and sectors are finding that using distributed computational power creates societal impact, fuels scientific breakthroughs and delivers competitive advantages. For example, we have seen remarkable results from some of our current customers: Malaria strikes 300,000 to 500,000 people around the world each year. To help in the effort to eradicate malaria worldwide, scientists at Intellectual Ventures use software that simulates how the disease spreads and would respond to prevention and control methods, such as vaccines and the use of bed nets. Technical computing allows researchers to model more detailed parameters for more accurate results and receive those results in less than an hour, rather than waiting a full day. Aerospace engineering firm, a.i. solutions, Inc., needed a more powerful computing platform to keep up with the increasingly complex computational needs of its customers: NASA, the Department of Defense and other government agencies planning space flights. 
To meet that need, it adopted technical computing. Now, a.i. solutions can produce detailed predictions and analysis of the flight dynamics of a given spacecraft, from optimal launch times and orbit determination to attitude control and navigation, up to eight times faster. This enables them to avoid mistakes in any areas that can cause a space mission to fail and potentially result in the loss of life and millions of dollars. Western & Southern Financial Group faced the challenge of running ever larger and more complex actuarial models as its number of policyholders and products grew and regulatory requirements changed. The company chose an actuarial solution that runs on technical computing technology. The solution is easy for the company's IT staff to manage and adjust to meet business needs. The new solution helps the company reduce modeling time by up to 99 percent - letting the team fine-tune its models for more accurate product pricing and financial projections. Our Technical Computing direction Collaborating closely with partners across industry and academia, we must now extend the reach of technical computing even further to help predictive modelers and data explorers make faster, more accurate predictions. As we build the Technical Computing initiative, we will invest in three core areas: Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high- performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing. This platform will help ensure processing resources are available whenever they are needed-reliably, consistently and quickly. Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and trouble shoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today's modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop... to the cluster... to the cloud. Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology. Thinking bigger There is so much left to be discovered and so many questions yet to be answered in the fascinating world around us. We believe the technical computing community will show us that we have not seen anything yet. Imagine just some of the breakthroughs this community could make possible: Better predictions to help improve the understanding of pandemics, contagion and global health trends. 
Climate change models that predict environmental, economic and human impact, accessible in real-time during key discussions and debates. More accurate prediction of natural disasters and their impact to develop more effective emergency response plans. With an ambitious charter in hand, this new team is ready to build on our progress to-date and execute Microsoft's technical computing vision over the months and years ahead. We will steadily invest in the right technologies, tools and talent, and work to bring together the technical computing community. I invite you to visit www.modelingtheworld.com today. We welcome your ideas and feedback. I look forward to making this journey with you and others who want to answer the world's biggest questions, discover solutions to problems that seem impossible and uncover a host of new opportunities to change the world we live in for the better. Bob

    Read the article
