Search Results

Search found 15712 results on 629 pages for 'location href'.


  • Is there a navigation app for iPad which re-calculate the route?

    - by earlyadopter
    iPad 3G successfully shows me my current location, but Google Maps does not recalculate the route if I deviate from the one it initially suggested. Normal car navigators recalculate on the fly. The CoPilot Live HD app I see in the App Store has very bad feedback. Do you know any others that are better, please? I need it with maps for the continental U.S., and it must be able to recalculate based on my real current location. I'd be OK even if it doesn't do that automatically; I'd just tap a button.

    Read the article

  • Windows 7: How to boot up in normal mode after improper shut down?

    - by Level1Coder
    I work in two different locations and whenever there is a power outage at one of them, Windows 7 detects that the system was improperly shut down. Once the power is back, the PC powers on and Windows 7 enters REPAIR/SAFE mode, where only someone physically in front of the PC can control it (networking is all disabled in this mode). Before it enters REPAIR/SAFE mode there is an option for a NORMAL boot, but the catch is that REPAIR/SAFE mode is selected by default with a 30-second timer. Once it automatically enters REPAIR/SAFE mode and nobody is at the other location, I have no way to remote control it anymore, and I have to drive over to the other location, reboot it, and select boot into NORMAL mode. Where can I change this setting so that Windows 7 always boots into NORMAL mode no matter how many times it is improperly shut down?
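
    One commonly cited approach (a sketch only, not taken from this post and not verified against this exact setup) is to tell the Windows boot loader to ignore the failed-boot flag, so the machine proceeds with a normal boot instead of offering the recovery menu. From an elevated command prompt:

        rem Sketch: skip the failure/recovery menu after an improper shutdown
        bcdedit /set {default} bootstatuspolicy ignoreallfailures
        rem Optionally disable the recovery environment entry as well
        bcdedit /set {default} recoveryenabled No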

    Read the article

  • Setting up Django application on lighttpd behind apache reverse proxy

    - by ml256
    I have a Django app at http://some_other_example.com (it will be behind a firewall) running on a lighttpd server with FastCGI. I need to make it available under http://example.com/myapp. It works fine except for redirects: when I log in from http://example.com/myapp/login it redirects me to http://example.com instead of http://example.com/myapp. When logging in from http://some_other_example.com/login it is OK. My configuration, apache2.conf at example.com:

        ProxyPass /myapp http://some_other_example.com
        ProxyPassReverse /myapp http://some_other_example.com
        ProxyHTMLURLMap http://some_other_example.com /myapp
        <Location /myapp>
            SetOutputFilter proxy-html
            ProxyHTMLExtended On
            ProxyHTMLURLMap / /myapp/
        </Location>

    In settings.py I added USE_X_FORWARDED_HOST = True, but it didn't help.
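
    One direction worth sketching (an assumption on my part, not the poster's confirmed fix): Django can be told that it lives under a URL prefix, so the redirects it generates include /myapp. In settings.py, something along these lines:

        # settings.py sketch: tell Django it is served under /myapp so that
        # redirect URLs built by the framework include the prefix.
        FORCE_SCRIPT_NAME = '/myapp'
        USE_X_FORWARDED_HOST = True
        # If the post-login target is hard-coded, it needs the prefix too:
        LOGIN_REDIRECT_URL = '/myapp/'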

    Read the article

  • Preventing apps to access info from wifi device?

    - by heaosax
    Browsers like Chrome and Firefox can use my wifi device to get information about the surrounding APs and pinpoint my physical location using Google Location Services. I know these browsers always ask for permission to do this, and that these features can also be turned off. But I was wondering if there is a better way to prevent ANY application from accessing this information from my wifi device. I don't like anyone on the internet knowing where I live, and I am worried some software could do the same as these browsers but without asking for permission. I am using Ubuntu 10.04.
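
    For the browser side only, a minimal sketch (assuming Firefox; this does nothing for other applications that can read wifi scan results themselves) is to hard-disable geolocation via a user.js file in the Firefox profile:

        // user.js sketch: disable Firefox geolocation lookups entirely
        user_pref("geo.enabled", false);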

    Read the article

  • re-direct SSL pages using header statement based on port

    - by bob's your brother
    I found this in the header.php file of an e-commerce site. Is this better done in a .htaccess file? Also, what would happen to any POST parameters that get caught in the header statement?

        // flip between secure and non-secure pages
        $uri = $_SERVER['REQUEST_URI'];
        // move to secure SSL pages if required
        if (substr($uri,1,12) == "registration") {
            if ($_SERVER['SERVER_PORT'] != 443) {
                header("HTTP/1.1 301 Moved Permanently");
                header("Location: https://".$_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI']);
                exit();
            }
        }
        // otherwise use regular non-SSL pages
        else {
            if ($_SERVER['SERVER_PORT'] == 443) {
                header("HTTP/1.1 301 Moved Permanently");
                header("Location: http://".$_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI']);
                exit();
            }
        }
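
    Two hedged notes, not part of the original post: a 301 (or 302) response makes virtually all browsers re-issue the request as a GET, so any POST body caught by this code is dropped (307/308 are the status codes that preserve the method and body). The same split can also be expressed with mod_rewrite; a sketch, assuming the rules live in the site's .htaccess:

        RewriteEngine On
        # Force HTTPS for the registration pages
        RewriteCond %{HTTPS} off
        RewriteRule ^registration https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
        # Force plain HTTP everywhere else
        RewriteCond %{HTTPS} on
        RewriteRule !^registration http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]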

    Read the article

  • I have to manually enter the pwd when I login at school and when I come back home with my laptop (DELL E6410)

    - by Bong Paduano
    I didn't have to do this during my first three months at school. My laptop used to connect to my home WiFi and school WiFi automatically, then after the third month something changed and ever since then I literally have to enter each WiFi password manually, in each location, every time I try to connect to the respective WLAN. I tried adding each WiFi network in the "Manage wireless networks" section of the Network and Sharing Center, but every time I log in at one of the locations the created network disappears, and this has happened more than once. I tried restoring the OS but to no avail. Can someone please assist me with this issue? I'm not sure if it has been addressed on this website, but I haven't seen any similar question here. I appreciate any help anyone may be able to share.

    Read the article

  • Two VPN connections from the same IP address

    - by Tayles
    I have set up a server running Windows Server 2008 which two remote users can dial into using a VPN connection. It works fine unless they are both in the same location, in which case only one of them can connect. I understand this is because the PPTP protocol cannot cope with two VPN connections from the same IP address. Is this correct? If so, what can I do about it? Please note that the remote location in question is a serviced office, so we're not in a position to change or play around with their router. Thanks!

    Read the article

  • 2 identical PCs - can I swap a single hard drive between and expect Windows 7/XP to work?

    - by rgvcorley
    I currently work in 2 different locations, traveling between the two every few weeks or so. I have screens, keyboard, mouse, etc. in both locations, so I just pick up my tower case when I want to move to the other location. However, to make moving easier I was thinking of buying 2 tower cases with hot-swappable drive bays on the front and installing identical hardware in each one. This would allow me to pull the drives out, take them with me, and plug them into the PC at the other location. Would Windows 7 complain? I'm not fussed about buying licenses for both PCs, but would I have any problems with drivers due to the different serial numbers of the components?

    Read the article

  • Recently used contacts getting corrupted Outlook 2010

    - by advis12
    We are using Microsoft's Identity Integration Feature Pack to sync contacts between two separate domains at two separate branches of our company. Recently we have run into an issue with a few of the contacts being pulled/synced from this remote location. When people in our organization send email to certain contacts that are synced from the remote location, message delivery fails. After looking into the issue we see that the failure is the result of the email being sent to some obscure address rather than the correct one. For example, the email address gets changed to IMCEAEX_O=NT5_ou=44C5AE1BBEE0ED499F607407769311BB_cn=2F31FA0CA187314FA8D6A99BEFC52C36@domain.com rather than [email protected] Clearing the recently used contact list seems to solve the issue, but it does not solve the underlying problem. Because this issue is starting to happen to several different people, I would like to address the underlying cause. Does anyone have any input on how to start troubleshooting this? Also, we are using Exchange 2010. Thank you for taking the time to look at this.
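
    One workaround often described for stale cached addresses (a sketch, assuming access to the Exchange Management Shell; the mailbox name is hypothetical and the legacy DN below is only illustrative, rebuilt from the IMCEAEX string in the question by turning the underscores back into slashes) is to add the old legacy address as an X500 proxy address on the target mailbox, so cached autocomplete entries keep resolving:

        # Sketch: add the stale legacyExchangeDN as an X500 address on the mailbox
        Set-Mailbox -Identity "john.doe" -EmailAddresses @{
            Add = "X500:/O=NT5/ou=44C5AE1BBEE0ED499F607407769311BB/cn=2F31FA0CA187314FA8D6A99BEFC52C36"
        }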

    Read the article

  • Video streaming over multi display units

    - by ramdaz
    We have to share video across around 4-8 terminals at a public facility, where we need to display live video from within the facility, display messages (advertisements), and also play (non-live) videos, all controlled centrally from another location. We can do the central handling over the Internet, over ssh. What we want to do is connect cameras to a computer and use that computer to display over multiple display units. We need to do live titling if possible. Once the live local telecast, which usually takes about an hour or two a day, is finished, we would like to play other videos locally off the PC server. Preferably everything should run off Linux, since budgets are very constrained. Addendum: it's not over a WAN, it's over a local area. I'd prefer not to use LAN; we would rather use coaxial cable if possible. The reason is that if it's LAN, I need some kind of networking device, at least a thin client.

    Read the article

  • Apache not serving pages stored in Subversion repository

    - by Stephen
    I've set up Apache and Subversion on an old PC, but Apache is not serving pages correctly. When I enter the address of my test site, http://HOME_IP_ADDRESS/test/index.html, I just get a File Not Found error and the following output in the error log: File does not exist: /var/www/html/svn/repos/test. But I know the file exists; when I enter the following URL into the browser, http://HOME_IP_ADDRESS/repos/test/index.html, I just get a listing of the HTML. In my Apache config file I have the DocumentRoot set as follows: DocumentRoot "/var/www/html/svn/repos", so I'm not sure what is going on. I have SVN installed and I think it may have something to do with this. Edit: I changed the DocumentRoot location, which helped, as pages in the new location are served correctly, so the problem is with serving pages from the repository.
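
    A sketch of the more usual layout (an assumption, since the full config isn't shown): a Subversion repository on disk stores content in its own database format, so it can't be used as a DocumentRoot for ordinary pages. Point DocumentRoot at a normal directory and expose the repository through mod_dav_svn at its own URL:

        # httpd.conf sketch (paths assumed): static site and SVN repository served separately
        DocumentRoot "/var/www/html"

        <Location /repos>
            DAV svn
            SVNPath /var/www/html/svn/repos
        </Location>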

    Read the article

  • Share iTunes Library from Nas across several clients. How in Windows

    - by Mych
    I found this article http://gigaom.com/apple/one-itunes-library-on-multiple-computers/ which describes sharing a single library with multiple clients. Unfortunately the article is for Macs and I use Windows. The article mentions 3 jobs that need to be completed: A) Pointing all clients to the location of the library. This is understandable and I can replicate it with Windows clients. B) Universal library set-up. This mentions holding the Option key and double-clicking the iTunes icon, so you can create the index at a specific location. C) Pointing clients to the index. Again this mentions Option-double-click. What is the Windows equivalent of Option-double-clicking iTunes?

    Read the article

  • Windows XP: How to boot up in normal mode after improper shut down?

    - by Nick
    I work in two different locations and whenever there is a power outage at one of them, Windows XP detects that the system was improperly shut down. Once the power is back, the PC powers on and Windows XP enters REPAIR/SAFE mode, where only someone physically in front of the PC can control it (networking is all disabled in this mode). Before it enters REPAIR/SAFE mode there is an option for a NORMAL boot, but the catch is that REPAIR/SAFE mode is selected by default with a 30-second timer. Once it automatically enters REPAIR/SAFE mode and nobody is at the other location, I have no way to remote control it anymore, and I have to drive over to the other location, reboot it, and select boot into NORMAL mode. Where can I change this setting so that Windows XP always boots into NORMAL mode no matter how many times it is improperly shut down?

    Read the article

  • How to lookup a value in a table with multiple criteria

    - by php-b-grader
    I have a data sheet with multiple values in multiple columns. I have a qty and a current price which, when multiplied out, gives me the current revenue (CurRev). I want to use this lookup table to give me the new revenue (NewRev) from the new price, but I can't figure out how to do multiple IFs in a lookup. What I want is to build a new column that checks the Product, Tier and Location/State, gives me the new price from the lookup table (above), and then multiplies that by the qty. e.g.

        Product,  Tier,  Location, Qty, CurRev,   NewRev
        Product1, Tier1, VIC,      2,   $1000.00, $6000 (2 x $3000)
        Product2, Tier3, NSW,      1,   $100.00,  $200 (1 x $200)
        Product1, Tier3, SA,       5,   $250.00,  $750 (5 x $150)
        Product3, Tier1, ACT,      5,   $100.00,  $500 (5 x $100)
        Product2, Tier3, QLD,      2,   $150.00,  $240 (2 x $240)

    Worst case, if I just get the new rate I can create another column.
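
    A sketch of one way to do the multi-criteria lookup (the column letters and the sheet name "Lookup" are assumptions, not taken from the workbook): if each Product/Tier/Location combination appears exactly once in the lookup table, SUMIFS can return its new price, which is then multiplied by the quantity in D2:

        =SUMIFS(Lookup!$D:$D, Lookup!$A:$A, A2, Lookup!$B:$B, B2, Lookup!$C:$C, C2) * D2

    Here Lookup!D:D holds the new price, Lookup!A:C hold Product, Tier and Location, and A2:D2 is the data row being priced.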

    Read the article

  • Nginx common configuration that I might have missed

    - by ApPeL
    I recently moved from Apache mod_wsgi to Nginx, and I have seen a major improvement in speed and lower memory usage, and I am generally very happy with it. I am not a server expert, so please be gentle. I am wondering if there are any small configuration items that I might have missed that will cause me issues in the long run... Please see my nginx.conf file:

        user nginx nginx;
        worker_processes 4;
        error_log /var/log/nginx/error_log info;

        events {
            worker_connections 1024;
            use epoll;
        }

        http {
            include /etc/nginx/mime.types;
            default_type application/octet-stream;
            log_format main '$remote_addr - $remote_user [$time_local] '
                            '"$request" $status $bytes_sent '
                            '"$http_referer" "$http_user_agent" '
                            '"$gzip_ratio"';
            client_header_timeout 10m;
            client_body_timeout 10m;
            send_timeout 10m;
            connection_pool_size 256;
            client_header_buffer_size 1k;
            large_client_header_buffers 4 2k;
            request_pool_size 4k;
            gzip on;
            gzip_min_length 1100;
            gzip_buffers 4 8k;
            gzip_types text/plain;
            output_buffers 1 32k;
            postpone_output 1460;
            sendfile on;
            tcp_nopush on;
            tcp_nodelay on;
            keepalive_timeout 75 20;
            ignore_invalid_headers on;
            index index.html;

            server {
                listen 80;
                server_name localhost;

                location /media/ {
                    root /www/django_test1/myapp; # Notice this is the /media folder that we create above
                }

                location /mediaadmin/ {
                    alias /opt/python2.6/lib/python2.6/site-packages/django/contrib/admin/media/; # Notice this is the /media folder that we create above
                }

                location / {
                    # host and port to fastcgi server
                    fastcgi_pass 127.0.0.1:8080;
                    fastcgi_param SERVER_ADDR $server_addr;
                    fastcgi_param SERVER_PORT $server_port;
                    fastcgi_param SERVER_NAME $server_name;
                    fastcgi_param SERVER_PROTOCOL $server_protocol;
                    fastcgi_param PATH_INFO $fastcgi_script_name;
                    fastcgi_param REQUEST_METHOD $request_method;
                    fastcgi_param QUERY_STRING $query_string;
                    fastcgi_param CONTENT_TYPE $content_type;
                    fastcgi_param CONTENT_LENGTH $content_length;
                    fastcgi_pass_header Authorization;
                    fastcgi_intercept_errors off;
                    client_max_body_size 100M;
                }

                access_log /var/log/nginx/localhost.access_log main;
                error_log /var/log/nginx/localhost.error_log;
            }
        }

    Read the article

  • nginx clean url router/rewrites

    - by Janko
    I'm having difficulties with a relatively simple rewrite rule / router in my nginx config. All I want to do is: if the requested dir or file 'host/my/request/path[/[index.php]]' does not exist, rewrite to 'host/my/request/path.php'. The current rewrite works for: host, host/, host/my/request/path. But it won't work for: host/my/request/path/. Here is the rewrite part of the config:

        location = /(.*)/ {
            rewrite ^(.*)$ $1 permanent;
        }

        location / {
            try_files $uri/ $uri $uri.php;
        }

    The error log reports: Access forbidden by rule, request: "GET /my/request/path/ HTTP/1.0". Hm, is there a better way to solve this or to get rid of the trailing slash? Edit, the rules more elaborately:

        host[/]           > host/index.php
        host/index[/]     > host/index.php
        host/my/path[/]   > if /path/index.php exists: host/my/path/index.php, else host/my/path.php
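
    A hedged observation and sketch (not from the original thread): an exact-match location ("location =") takes a literal URI, not a regular expression, so "location = /(.*)/" never matches anything and the trailing-slash rewrite never runs. A server-level rewrite covers that case, and the try_files line can stay as it is:

        # strip a trailing slash so /my/request/path/ becomes /my/request/path
        rewrite ^/(.+)/$ /$1 permanent;

        location / {
            try_files $uri/ $uri $uri.php;
        }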

    Read the article

  • Change windows default path or last used path for saving files? Is that possible?

    - by Mark
    I'm trying to change the default save path to a path I specify. For example, if the file is a picture (.jpg, .gif, etc.) and I save an image using Google Chrome, the path can be anything from the default to the last-used folder, so I want it to always be a folder I specify, for example C:\SaveHerePics. I don't know how you can change it; maybe the Registry? A DLL? I would like to create a VB.NET program with a textbox where you choose the new location, for example C:\SaveHerePics, and it should not depend on this program being open: even if you close the program, the default location for pictures saved via any program or source should be C:\SaveHerePics. Thanks a lot for your help!

    Read the article

  • How to setup dns to redirect app.example.com to another ip?

    - by AZ.
    I have a site www.example.com running at a hosting company. Now I want to create a separate web app on my VPS and make it accessible via app.example.com. How can I set the DNS so that app.example.com points to my VPS' IP address? A CNAME or an A record? Also, if I want to run a mail server on my VPS too, how do I set up the DNS? EDIT: Existing site: www.example.com. Location: a hosting company that I don't control; it's running PHP with nginx I guess (or Apache). New site (that I'm working on): app.example.com. Location: my VPS; it has an IP address and is running Node.js. It can run alongside nginx but currently it does not. I want the existing website to continue working (as customers visit www.example.com) and I want customers to visit app.example.com for some new features. The two websites are NOT on the same server and do not use the same IP address.
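
    A minimal sketch of the records involved (203.0.113.10 is a placeholder, not the real VPS address): an A record is the usual choice when the target is an IP address, since a CNAME can only point at another name; mail needs an MX record plus an A record for the host it names:

        ; zone file sketch for example.com
        app    IN  A      203.0.113.10        ; web app on the VPS
        mail   IN  A      203.0.113.10        ; mail host on the same VPS
        @      IN  MX 10  mail.example.com.   ; deliver mail for example.com to that host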

    Read the article

  • Encryption container for multiple people

    - by Adam M.
    I was just wondering if anyone may have come across a product that allows container-based encryption to be used by multiple people in a Windows Server setup. I wanted to see if there might be something like TrueCrypt that could handle being accessed by two accounts. I'm looking for a product whose properties would allow only a handful of users access to the contents of the location, but still allow the files to be backed up by a normal backup system. That way, if a file had to be restored, the container could be redirected to another location for one of the users to get access to it. This would allow access to be restricted beyond NTFS and file share permissions.

    Read the article

  • How to stop Nginx sending static file requests to the CakePHP app controller when running Cake in a subdirectory?

    - by robotmay
    I'm trying to run a CakePHP app from within a subfolder on Nginx, but the static files are not being found and are instead being passed to the app controller. Here's my current config:

        location /uniquetv {
            index index.php index.html;
            if (-f $request_filename) {
                break;
            }
            if (!-e $request_filename) {
                rewrite ^/uniquetv(.+)$ /uniquetv/webroot/$1 last;
                break;
            }
        }

        location /uniquetv/webroot {
            index index.php;
            if (!-e $request_filename) {
                rewrite ^/uniquetv/webroot/(.+)$ /uniquetv/webroot/index.php?url=$1 last;
                break;
            }
        }

    Any ideas? :)

    Read the article

  • How to disable multiple form submit (POST) in IIS

    - by user1209640
    We had a major SharePoint outage a few months back because a user wedged their keyboard in such a way that the Enter key was pressed indefinitely. The user was on a customized people-search page, and hundreds of POSTs by the same user were submitted asynchronously, which overloaded the server. Because I work in a large organization, I am looking for a more global way to prevent this from happening. Is there a way in IIS to prevent multiple web form submissions by the same user within a short period of time? I am aware we can write JavaScript to disable the button after it is clicked, but we are hoping to prevent this issue on other pages where a similar possibility may exist. Update: Looking at the source code, it appears the JavaScript performs a document.location = url whenever keycode 13 (Enter) is pressed. Again, we can write JS to prevent this in this particular location, but we also want to guard against this kind of issue more generally, preferably at the IIS level.
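
    One IIS-level direction worth noting (a sketch only; the thresholds are invented, and the feature assumed here is the Dynamic IP Restrictions module, built into IIS 8 and available as an add-on for IIS 7.x): it can throttle a client by concurrent requests or by request rate in web.config:

        <configuration>
          <system.webServer>
            <security>
              <dynamicIpSecurity>
                <!-- block a client that holds too many simultaneous requests open -->
                <denyByConcurrentRequests enabled="true" maxConcurrentRequests="20" />
                <!-- block a client that sends too many requests in a short window -->
                <denyByRequestRate enabled="true" maxRequests="50" requestIntervalInMilliseconds="5000" />
              </dynamicIpSecurity>
            </security>
          </system.webServer>
        </configuration>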

    Read the article

  • Unable to use strong encryption

    - by user224299
    So I am exploring Apache and trying to create a simple example: the default page and a directory "secure". I want everyone to be able to access the server, but when someone accesses the "secure" directory, I want the connection to use strong encryption. I am using Apache 2.4. However this is not working and I don't know why! I have done it just like in the Apache tutorial:

        LoadModule ssl_module /usr/lib/apache2/modules/mod_ssl.so

        <VirtualHost *:443>
            SSLEngine on
            SSLCertificateFile /home/vitorpereira/Desktop/cert.cer
            SSLCertificateKeyFile /home/vitorpereira/Desktop/key.key
        </VirtualHost>

        SSLCipherSuite ALL:!ADH:RC4+RSA:+HIGH:+MEDIUM:+LOW:+SSLv2:+EXP:+eNULL

        <Location /var/www/html/secure>
            SSLCipherSuite HIGH:!aNULL:!MD5
        </Location>

    But this does not work :/ And I can access the secure folder with http, but when I use https it says not found!
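
    Two hedged observations plus a sketch (the DocumentRoot below is an assumption): <Location> matches URL paths, not filesystem paths, so <Location /var/www/html/secure> never applies to requests for /secure, and the :443 virtual host usually also wants an explicit DocumentRoot. Something along these lines:

        <VirtualHost *:443>
            SSLEngine on
            SSLCertificateFile /home/vitorpereira/Desktop/cert.cer
            SSLCertificateKeyFile /home/vitorpereira/Desktop/key.key
            DocumentRoot /var/www/html

            # URL path, not filesystem path; require SSL and a strong cipher set here
            <Location /secure>
                SSLRequireSSL
                SSLCipherSuite HIGH:!aNULL:!MD5
            </Location>
        </VirtualHost>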

    Read the article

  • PowerShell Script to Deploy Multiple VM on Azure in Parallel #azure #powershell

    - by Marco Russo (SQLBI)
    This blog is usually dedicated to Business Intelligence and SQL Server, but I couldn't easily find simple PowerShell scripts on the web to help me deploy a number of virtual machines on Azure that I use for testing and development. Since I need to deploy, start, stop and remove many virtual machines created from a common image I built (you know, Tabular is not part of the standard images provided by Microsoft…), I wanted to minimize the time required to execute every operation from my Windows Azure PowerShell console (though I suggest using Windows PowerShell ISE), and I also wanted to fire the commands in parallel as soon as possible, without losing the output in the console. In order to execute multiple commands in parallel, I used the Start-Job cmdlet, and with Get-Job and Receive-Job I wait for job completion and display the messages generated during background command execution. This technique reduces execution time when I have to deploy, start, stop or remove virtual machines. Please note that a few operations on Azure acquire an exclusive lock and cannot really be executed in parallel, but only one part of their execution time is subject to this lock; thus you obtain a better response time in these scenarios too (this is the case when provisioning a new VM). Finally, when you remove the VMs you still have the disks containing the virtual machines to remove. This cannot be done right after the VM removal, because you have to wait until the removal operation has completed on Azure. So I wrote a script that you run a few minutes after removing the VMs to delete the disks (and VHDs) no longer related to a VM. I just check that the disks were associated with the original image name used to provision the VMs (so I don't remove other disks deployed by other batches that I might want to preserve). These examples are specific to my scenario; if you need a more complex configuration you have to change and adapt the code. But if your need is to create multiple instances of the same VM running in a workgroup, these scripts should be good enough. I prepared the following PowerShell scripts:

    ProvisionVMs: provisions many VMs in parallel starting from the same image; it creates one service for each VM.
    RemoveVMs: removes all the VMs in parallel; it also removes the service created for each VM.
    StartVMs: starts all the VMs in parallel.
    StopVMs: stops all the VMs in parallel.
    RemoveOrphanDisks: removes all the disks no longer used by any VM. Run this script a few minutes after the RemoveVMs script.
    ProvisionVMs

        # Name of subscription
        $SubscriptionName = "Copy the SubscriptionName property you get from Get-AzureSubscription"

        # Name of storage account (where VMs will be deployed)
        $StorageAccount = "Copy the Label property you get from Get-AzureStorageAccount"

        function ProvisionVM( [string]$VmName ) {
            Start-Job -ArgumentList $VmName {
                param($VmName)
                $Location = "Copy the Location property you get from Get-AzureStorageAccount"
                $InstanceSize = "A5"         # You can use any other instance, such as Large, A6, and so on
                $AdminUsername = "UserName"  # Write the name of the administrator account in the new VM
                $Password = "Password"       # Write the password of the administrator account in the new VM
                $Image = "Copy the ImageName property you get from Get-AzureVMImage"
                # You can list your own images using the following command:
                # Get-AzureVMImage | Where-Object {$_.PublisherName -eq "User" }

                New-AzureVMConfig -Name $VmName -ImageName $Image -InstanceSize $InstanceSize |
                    Add-AzureProvisioningConfig -Windows -Password $Password -AdminUsername $AdminUsername |
                    New-AzureVM -Location $Location -ServiceName "$VmName" -Verbose
            }
        }

        # Set the proper storage - you might remove this line if you have only one storage in the subscription
        Set-AzureSubscription -SubscriptionName $SubscriptionName -CurrentStorageAccount $StorageAccount

        # Select the subscription - this line is fundamental if you have access to multiple subscriptions
        # You might remove this line if you have only one subscription
        Select-AzureSubscription -SubscriptionName $SubscriptionName

        # Every line in the following list provisions one VM using the name specified in the argument
        # You can change the number of lines - use a unique name for every VM - don't reuse names
        # already used in other VMs already deployed
        ProvisionVM "test10"
        ProvisionVM "test11"
        ProvisionVM "test12"
        ProvisionVM "test13"
        ProvisionVM "test14"
        ProvisionVM "test15"
        ProvisionVM "test16"
        ProvisionVM "test17"
        ProvisionVM "test18"
        ProvisionVM "test19"
        ProvisionVM "test20"

        # Wait for all to complete
        While (Get-Job -State "Running") {
            Get-Job -State "Completed" | Receive-Job
            Start-Sleep 1
        }

        # Display output from all jobs
        Get-Job | Receive-Job

        # Cleanup of jobs
        Remove-Job *

        # Displays batch completed
        echo "Provisioning VM Completed"

    RemoveVMs

        # Name of subscription
        $SubscriptionName = "Copy the SubscriptionName property you get from Get-AzureSubscription"

        function RemoveVM( [string]$VmName ) {
            Start-Job -ArgumentList $VmName {
                param($VmName)
                Remove-AzureService -ServiceName $VmName -Force -Verbose
            }
        }

        # Select the subscription - this line is fundamental if you have access to multiple subscriptions
        # You might remove this line if you have only one subscription
        Select-AzureSubscription -SubscriptionName $SubscriptionName

        # Every line in the following list removes one VM using the name specified in the argument
        # You can change the number of lines - use a unique name for every VM - don't reuse names
        # already used in other VMs already deployed
        RemoveVM "test10"
        RemoveVM "test11"
        RemoveVM "test12"
        RemoveVM "test13"
        RemoveVM "test14"
        RemoveVM "test15"
        RemoveVM "test16"
        RemoveVM "test17"
        RemoveVM "test18"
        RemoveVM "test19"
        RemoveVM "test20"

        # Wait for all to complete
        While (Get-Job -State "Running") {
            Get-Job -State "Completed" | Receive-Job
            Start-Sleep 1
        }

        # Display output from all jobs
        Get-Job | Receive-Job

        # Cleanup
        Remove-Job *

        # Displays batch completed
        echo "Remove VM Completed"

    StartVMs

        # Name of subscription
        $SubscriptionName = "Copy the SubscriptionName property you get from Get-AzureSubscription"

        function StartVM( [string]$VmName ) {
            Start-Job -ArgumentList $VmName {
                param($VmName)
                Start-AzureVM -Name $VmName -ServiceName $VmName -Verbose
            }
        }

        # Select the subscription - this line is fundamental if you have access to multiple subscriptions
        # You might remove this line if you have only one subscription
        Select-AzureSubscription -SubscriptionName $SubscriptionName

        # Every line in the following list starts one VM using the name specified in the argument
        # You can change the number of lines - use a unique name for every VM - don't reuse names
        # already used in other VMs already deployed
        StartVM "test10"
        StartVM "test11"
        StartVM "test12"
        StartVM "test13"
        StartVM "test14"
        StartVM "test15"
        StartVM "test16"
        StartVM "test17"
        StartVM "test18"
        StartVM "test19"
        StartVM "test20"

        # Wait for all to complete
        While (Get-Job -State "Running") {
            Get-Job -State "Completed" | Receive-Job
            Start-Sleep 1
        }

        # Display output from all jobs
        Get-Job | Receive-Job

        # Cleanup
        Remove-Job *

        # Displays batch completed
        echo "Start VM Completed"

    StopVMs

        # Name of subscription
        $SubscriptionName = "Copy the SubscriptionName property you get from Get-AzureSubscription"

        function StopVM( [string]$VmName ) {
            Start-Job -ArgumentList $VmName {
                param($VmName)
                Stop-AzureVM -Name $VmName -ServiceName $VmName -Verbose -Force
            }
        }

        # Select the subscription - this line is fundamental if you have access to multiple subscriptions
        # You might remove this line if you have only one subscription
        Select-AzureSubscription -SubscriptionName $SubscriptionName

        # Every line in the following list stops one VM using the name specified in the argument
        # You can change the number of lines - use a unique name for every VM - don't reuse names
        # already used in other VMs already deployed
        StopVM "test10"
        StopVM "test11"
        StopVM "test12"
        StopVM "test13"
        StopVM "test14"
        StopVM "test15"
        StopVM "test16"
        StopVM "test17"
        StopVM "test18"
        StopVM "test19"
        StopVM "test20"

        # Wait for all to complete
        While (Get-Job -State "Running") {
            Get-Job -State "Completed" | Receive-Job
            Start-Sleep 1
        }

        # Display output from all jobs
        Get-Job | Receive-Job

        # Cleanup
        Remove-Job *

        # Displays batch completed
        echo "Stop VM Completed"

    RemoveOrphanDisks

        # Name of the image used to provision the VMs
        $ImageName = "Copy the ImageName property you get from Get-AzureVMImage"
        # You can list your own images using the following command:
        # Get-AzureVMImage | Where-Object {$_.PublisherName -eq "User" }

        # Remove all orphan disks coming from the image specified in $ImageName
        Get-AzureDisk |
            Where-Object {$_.AttachedTo -eq $null -and $_.SourceImageName -eq $ImageName} |
            Remove-AzureDisk -DeleteVHD -Verbose

    Read the article
