Search Results

Search found 1826 results on 74 pages for 'powershell smo'.

Page 31/74 | < Previous Page | 27 28 29 30 31 32 33 34 35 36 37 38  | Next Page >

  • AZURE - Stairway To Heaven

    - by Waclaw Chrabaszcz
    Originally posted on: http://geekswithblogs.net/Wchrabaszcz/archive/2014/08/02/azure---stairway-to-heaven.aspx

    Before you start reading, please start playing this song. OK boys and girls, time to get familiar with clouds. Time to become a meteorologist. To be honest I don't know how to start. Is the cloud better or worse than on-campus resources? Hmm... it is just different. I think that for successful adoption in the cloud world, IT dinosaurs need to forget some "Private Cloud" virtualization bad habits and learn a new way of thinking. Take a look:
    - I don't need any tapes or CDs (the physical kingdom of Windows XP and 2000).
    - I don't need any locally stored MP3s (CD virtualization :-).
    - I can just stream music to your computer no matter whether my on-site infrastructure is powered on.
    Why not do exactly the same with a web server, SQL, or a Windows server rented for a while? Let's go to the other side of the mirror.
    First, register yourself for the free one-month trial; as a happy MSDN subscriber you've got a monthly budget to spend. In addition, the default spending limit protects you against losing real money if your toys consume too much traffic and space: http://azure.microsoft.com/en-us/pricing/free-trial/
    Once your account is ready, forget the web portal - we are PowerShell knights: http://go.microsoft.com/?linkid=9811175&clcid=0x409

        # Authenticate yourself in Azure
        Add-AzureAccount
        # Download your settings file (you only need to do this once)
        Get-AzurePublishSettingsFile
        # Import it into your PowerShell module
        Import-AzurePublishSettingsFile "C:\Azure\[filename].publishsettings"
        # Validation
        Get-AzureAccount
        Get-AzureSubscription
        # Where are the Azure datacenters?
        Get-AzureLocation
        # You will need it
        Update-Help
        # A storage account is tied to a physical location; there are two datacenters on each continent, so try the one nearest to you.
        # All your VMs will store their VHD files on your storage account.
        # Your storage account name must be globally unique, so I assume that words like "account" or "server" are already taken.
        New-AzureStorageAccount -StorageAccountName "[YOUR_STORAGE_ACCOUNT]" -Label "AzureTwo" -Location "West Europe"
        Get-AzureStorageAccount
        # It looks like you are ready to deploy your first VM. What templates can we use?
        Get-AzureVMImage | Select ImageName
        # What a mess - let's choose Server 2012
        $ImageName = (Get-AzureVMImage)[74].ImageName
        $cloudSvcName = '[YOUR_STORAGE_ACCOUNT]'
        $AdminUsername = "[YOUR-ADMIN]"
        $adminPassword = '[YOUR_PA$$W0RD]'
        $MediaLocation = "West Europe"
        $vmnameDC = 'DC01'
        # Burn baby burn !!!
        $vmDC01 = New-AzureVMConfig -Name $vmnameDC -InstanceSize "Small" -ImageName $ImageName `
            | Add-AzureProvisioningConfig -Windows -Password $adminPassword -AdminUsername $AdminUsername `
            | New-AzureVM -ServiceName $cloudSvcName
        # Ice, ice baby ...
        Get-AzureVM
        Get-AzureRemoteDesktopFile -ServiceName "[YOUR_STORAGE_ACCOUNT]" -Name "DC01" -LocalPath "c:\AZURE\DC01.rdp"

    As you can see it is not just a New-VM: you need to associate your VM with an AzureVMConfig (it sets your template), an AzureProvisioningConfig (it sets your customizations), and a storage account. In future releases you'll need to put this machine in a specific subnet, attach a HDD and much more. On a second reading I noticed that I am using the same name for the STORAGE and SERVICE accounts - please be aware of this if you need to split these values.
    Conclusions:
    - Pipes rule!
    - At the beginning it is hard to change your mind and accept that it is easier to remove and recreate a VM than to move it to a different subnet.
    - By default everything is firewalled, with limited access to DNS, but NATed outside on custom ports. It is good to check these translations on the web portal from time to time.
    - If you remove your VMs, the hard drives remain in storage and MS will charge you for them: Remove-AzureVM -DeleteVHD
    For me Azure is a lot of fun; once again I can be a newbie and learn on every page. For me Azure offers real freedom in the deployment of VMs without arguing with NetAdmins, WinAdmins, DBAs, PMs and other Change Managers. Unfortunately sooner or later they will come to my haven and change it into …

    Read the article

  • How to Automate your Database Documentation

    - by Jonathan Hickford
    In my previous post, “Automating Deployments with SQL Compare command line”, I looked at how teams can automate the deployment and post-deployment validation of SQL Server databases using the command line versions of Red Gate tools. In this post I’m looking at another use for the command line tools, namely using them to generate up-to-date documentation with every database change. There are many reasons why up-to-date documentation is valuable: for example, when somebody new has to work on or administer a database for the first time, or when a new database comes into service. Having database documentation reduces the risk of making incorrect decisions when making changes. Documentation is also very useful to business intelligence analysts when writing reports, for example in SSRS. There are a couple of great examples on this site discussing why up-to-date documentation is valuable: Database Documentation – Lands of Trolls: Why and How? and Database Documentation Using SQL Doc. The short answer is that it can save you time and reduce risk when you need it most! SQL Doc is a fast, simple tool that automatically generates database documentation. It can create documents as HTML, Word or PDF files. The documentation contains information about object definitions and dependencies, along with any other information you want to associate with each object. The SQL Doc GUI, which is included in Red Gate’s SQL Developer Bundle and SQL Toolbelt, allows you to add additional notes to objects and customise which objects are shown in the docs. These settings can be saved as a .sqldoc project file. The SQL Doc command line can use this project file to automatically update the documentation every time the database is changed, ensuring that the documentation is always up to date. The simplest way to keep documentation up to date is probably to use a scheduled task to run a script every day. However, if you have a source controlled database, or are using a Continuous Integration (CI) server or a build server, it may make more sense to use that instead. If you’re using SQL Source Control or SSDT Database Projects to help version control your database, you can automatically update the documentation after each change is made to the source control repository that contains your database. To get this automation in place, you can use the functionality of a Continuous Integration (CI) server, which can trigger commands to run when a source control repository has changed. A CI server will also capture and save the documentation that is created as an artifact, so you can always find the exact documentation for a specific version of the database. This forms an always up-to-date data dictionary. If you don’t already have a CI server in place there are several you can use, such as the free open source Jenkins or the free starter editions of TeamCity. I won’t cover setting these up in this article, but there is information about using CI servers for automating database tasks on the Red Gate Database Delivery webpage. You may be interested in Red Gate’s SQL CI utility (part of the SQL Automation Pack), which is an easy way to update a database with the latest changes from source control. The PowerShell example below shows how to create the documentation from a database. That database might be your integration database or a shared development database that is always up to date with the latest changes.
        $serverName = "server\instance"
        $databaseName = "databaseName"   # If you want to document multiple databases use a comma separated list
        $userName = "username"
        $password = "password"
        # Path to SQLDoc.exe
        $SQLDocPath = "C:\Program Files (x86)\Red Gate\SQL Doc 3\SQLDoc.exe"
        $arguments = @(
            "/server:$($serverName)",
            "/database:$($databaseName)",
            "/username:$($userName)",
            "/password:$($password)",
            "/filetype:html",
            "/outputfolder:.",
            # "/project:$args[0]",   # If you already have a .sqldoc project file you can pass it as an argument to this script. Values in the project will be overridden with any options set on the command line
            "/name:$databaseName Report",
            "/copyrightauthor:$([Environment]::UserName)"
        )
        write-host $arguments
        & $SQLDocPath $arguments

    There are several options you can set on the command line to vary how your documentation is created. For example, you can document multiple databases or exclude certain types of objects. In the example above, we set the name of the report to match the database name and use the current Windows user as the documentation author. For more examples of how you can customise the report from the command line, please see the SQL Doc command line documentation. If you already have a .sqldoc project file, or wish to further customise the report by including or excluding specific objects, you can use this project on the command line. Any settings you specify on the command line will override the defaults in the project. For details of what you can customise in the project please see the SQL Doc project documentation. In the example above, the line to use a project is commented out, but you can uncomment this line and then pass a path to a .sqldoc project file as an argument to this script.
    Conclusion
    Keeping documentation about your databases up to date is very easy to set up using SQL Doc and PowerShell. By using a CI server to run this process you can trigger the documentation to be regenerated on every change to a source controlled database, and keep historic documentation available. If you are considering more advanced database automation, e.g. database unit testing, change script generation, deploying to large numbers of targets and backup/verification, please email me at [email protected] for further script samples or if you have any questions.
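    If you go the scheduled-task route mentioned earlier, a minimal sketch of the wiring might look like the following. This is an assumption-heavy illustration, not part of the article: the script path, task name and start time are placeholders, and the ScheduledTasks cmdlets require Windows 8 / Server 2012 or later.

        # Run the documentation script (path is a placeholder) every night at 2am
        $action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-File C:\scripts\Update-SqlDoc.ps1"
        $trigger = New-ScheduledTaskTrigger -Daily -At 2am
        Register-ScheduledTask -TaskName "Update SQL Doc" -Action $action -Trigger $trigger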

    Read the article

  • PowerShell – Recycle All IIS App Pools

    - by Lance Robinson
    With a little help from Shay Levy’s post on Stack Overflow and the MSDN documentation, I added this handy function to my profile to automatically recycle all IIS app pools.

        function Recycle-AppPools {
            param(
                [string] $server = "3bhs001",
                [int] $mode = 1      # ManagedPipelineModes: 0 = integrated, 1 = classic
            )
            $iis = [adsi]"IIS://$server/W3SVC/AppPools"
            $iis.psbase.children | % {
                $pool = [adsi]($_.psbase.path)
                # AppPoolStates: 1 = starting, 2 = started, 3 = stopping, 4 = stopped
                if ($pool.AppPoolState -eq 2 -and $pool.ManagedPipelineMode -eq $mode) {
                    $pool.psbase.invoke("recycle")
                }
            }
        }
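    On newer systems with the WebAdministration module available, a shorter route is possible. This is a hedged alternative sketch, not the author's approach; it assumes the IIS: provider is present and recycles every started pool regardless of pipeline mode.

        Import-Module WebAdministration
        Get-ChildItem IIS:\AppPools | Where-Object { $_.State -eq "Started" } | ForEach-Object {
            Restart-WebAppPool $_.Name   # Restart-WebAppPool recycles the pool
        }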

    Read the article

  • Powershell: If statements dependent on installed exchange role

    - by marc dekeyser
    Something I need to keep for future use:

        $hostname = hostname

        If (Get-ExchangeServer $hostname | where {$_.IsClientAccessServer -eq $true}) {
        } else {
        }

        If (Get-ExchangeServer $hostname | where {$_.IsHubTransportServer -eq $true}) {
        } else {
        }

        If (Get-ExchangeServer $hostname | where {$_.IsMailboxServer -eq $true}) {
        } else {
        }

        If (Get-ExchangeServer $hostname | where {$_.IsUnifiedMessagingServer -eq $true}) {
        } else {
        }

        If (Get-ExchangeServer $hostname | where {$_.IsEdgeServer -eq $true}) {
        } else {
        }
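    The same idea can be expressed with a single Get-ExchangeServer call instead of five. A hedged sketch, with the Write-Host lines standing in for whatever you put in the empty branches:

        $server = Get-ExchangeServer $env:COMPUTERNAME
        if ($server.IsClientAccessServer)     { Write-Host "Client Access role installed" }
        if ($server.IsHubTransportServer)     { Write-Host "Hub Transport role installed" }
        if ($server.IsMailboxServer)          { Write-Host "Mailbox role installed" }
        if ($server.IsUnifiedMessagingServer) { Write-Host "Unified Messaging role installed" }
        if ($server.IsEdgeServer)             { Write-Host "Edge role installed" }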

    Read the article

  • T-SQL Tuesday #15 : Running T-SQL workloads remotely on multiple servers

    - by AaronBertrand
    This month's installment of T-SQL Tuesday is hosted by Pat Wright ( blog | twitter ). Pat says: "So the topic I have chosen for this month is Automation! It can be Automation with T-SQL or with Powershell or a mix of both. Give us your best tips/tricks and ideas for making our lives easier through Automation." In a recent project, we've had a need to run concurrent workloads on as many as 100 instances of SQL Server in a test environment. A goal, obviously, is to accomplish this without having to...(read more)

    Read the article

  • Powershell: Get-Framework-Versions.

    - by marc dekeyser
    This function will use the Test-Key function posted earlier. It will check which .NET frameworks are installed (currently only checking for .NET 4.0) but can be easily adapted and/or expanded.

        function Get-Framework-Versions() {
            $installedFrameworks = @()
            if (Test-Key "HKLM:\Software\Microsoft\NET Framework Setup\NDP\v4\Client" "Install") { $installedFrameworks += "4.0c" }
            if (Test-Key "HKLM:\Software\Microsoft\NET Framework Setup\NDP\v4\Full" "Install")   { $installedFrameworks += "4.0" }
            return $installedFrameworks
        }
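    The Test-Key helper is not included in this excerpt. A minimal sketch of what such a function typically looks like, with parameter names assumed for illustration:

        function Test-Key([string]$path, [string]$key) {
            # The registry key itself must exist...
            if (!(Test-Path $path)) { return $false }
            # ...and so must the named value under it
            if ((Get-ItemProperty $path).$key -eq $null) { return $false }
            return $true
        }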

    Read the article

  • T-SQL Tuesday #15 : Running T-SQL workloads remotely on multiple servers

    - by AaronBertrand
    This month's installment of T-SQL Tuesday is hosted by Pat Wright ( blog | twitter ). Pat says: "So the topic I have chosen for this month is Automation! It can be Automation with T-SQL or with Powershell or a mix of both. Give us your best tips/tricks and ideas for making our lives easier through Automation." In a project we are working on, we've had a need to run concurrent workloads on as many as 100 instances of SQL Server in a test environment. A goal, obviously, is to accomplish this without...(read more)

    Read the article

  • Check If a SQL Server Database Is In Pseudo-Simple Recovery Model Using Windows PowerShell

    Check whether databases that report the FULL recovery model are actually in a state known as pseudo-simple, where the database still behaves as if it were in the SIMPLE recovery model until a full database backup is taken.
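    As a rough illustration of the check, here is a minimal hedged sketch using SMO. The instance name is a placeholder, and treating "no full backup ever taken" as the pseudo-simple signal is a simplification of what the tip covers:

        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
        $server = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost"
        foreach ($db in $server.Databases) {
            # FULL recovery with no full backup yet behaves like SIMPLE (pseudo-simple)
            if ($db.RecoveryModel -eq "Full" -and $db.LastBackupDate -eq [datetime]::MinValue) {
                "$($db.Name) is in pseudo-simple recovery"
            }
        }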

    Read the article

  • Create a Monitoring Server for SQL Server with PowerShell

    At some point you are going to need a notification system for a range of events that occur in your servers. Laerte Junior shows how you can even set up temporary or permanent alerts for any WMI events to give you a system that fits your server environment perfectly.
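    To give a flavour of the kind of alert the article describes, here is a minimal hedged sketch of a WMI event subscription; the query, polling interval, identifier and message are all assumptions for illustration:

        # Fire an action whenever a new process starts (polling every 5 seconds)
        Register-WmiEvent -Query "SELECT * FROM __InstanceCreationEvent WITHIN 5 WHERE TargetInstance ISA 'Win32_Process'" `
            -SourceIdentifier "NewProcessAlert" `
            -Action { Write-Host "New process: $($Event.SourceEventArgs.NewEvent.TargetInstance.Name)" }

        # Remove the subscription when you no longer need it
        Unregister-Event -SourceIdentifier "NewProcessAlert"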

    Read the article

  • How is Pash licensed?

    - by Jay Bazuzi
    Pash is an open source reimplementation of Windows PowerShell. It was released in 2008, and has been idle since then. I would like to take up the mantle. It's not clear what the license is. There is no LICENSE file or license details in the code. The only reference to a license I can find is on this page: http://sourceforge.net/projects/pash/ where it says: "License: BSD License, GNU General Public License (GPL)" But I'm not sure if I can take that as authoritative. I have tried to contact the author but he has not responded. I would hate to proceed with this project and later discover that I am violating a license, and have the project crippled as a result.

    Read the article

  • Check SQL Server Virtual Log Files Using PowerShell

    In a previous tip, Monitor Your SQL Server Virtual Log Files with Policy Based Management, we saw how we can use Policy Based Management to monitor the number of virtual log files (VLFs) in our SQL Server databases. However, even with that, most of the solutions I see online involve creating temporary tables and/or using cursors to get the total number of VLFs in a transaction log file. Is there a much easier solution?
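    One lighter-weight possibility, offered here only as a hedged sketch with the instance name assumed, is to let SMO run DBCC LOGINFO and simply count the rows it returns:

        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
        $server = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost\SQL2012"
        foreach ($db in $server.Databases) {
            # Each row returned by DBCC LOGINFO represents one virtual log file
            $vlfCount = $db.ExecuteWithResults("DBCC LOGINFO").Tables[0].Rows.Count
            "{0}: {1} VLFs" -f $db.Name, $vlfCount
        }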

    Read the article

  • Converting a PowerShell Script into a Module Part 2

    In this article the author explains how the PSModuleInfo object for a module can be retrieved. Further, he shows how code can be injected into the module to manipulate the state of a module without having to reload it. He also explains how to directly set some metadata elements, like the module description, and some other PSModuleInfo object features.
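    For context, a hedged sketch of the two ideas the summary mentions; the module name and the script-scoped variable are assumptions. Get-Module hands back the PSModuleInfo object, and invoking a script block against it with the call operator runs that code inside the module's session state without reloading the module:

        # Retrieve the PSModuleInfo object for an already-imported module
        $m = Get-Module -Name MyModule

        # Run a script block inside the module's session state, e.g. to inspect or
        # change a private script-scoped variable
        & $m { $script:internalCounter }
        & $m { $script:internalCounter = 42 }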

    Read the article

  • Using SQL Alchemy and pyodbc with IronPython 2.6.1

    - by beargle
    I'm using IronPython and the clr module to retrieve SQL Server information via SMO. I'd like to retrieve/store this data in a SQL Server database using SQL Alchemy, but am having some trouble loading the pyodbc module. Here's the setup:
    - IronPython 2.6.1 (installed at D:\Program Files\IronPython)
    - CPython 2.6.5 (installed at D:\Python26)
    - SQL Alchemy 0.6.1 (installed at D:\Python26\Lib\site-packages\sqlalchemy)
    - pyodbc 2.1.7 (installed at D:\Python26\Lib\site-packages)
    I have these entries in the IronPython site.py to import CPython standard and third-party libraries:

        # Add CPython standard libs and DLLs
        import sys
        sys.path.append(r"D:\Python26\Lib")
        sys.path.append(r"D:\Python26\DLLs")
        sys.path.append(r"D:\Python26\lib-tk")
        sys.path.append(r"D:\Python26")
        # Add CPython third-party libs
        sys.path.append(r"D:\Python26\Lib\site-packages")
        # sqlite3
        sys.path.append(r"D:\Python26\Lib\sqlite3")
        # Add SQL Server SMO
        sys.path.append(r"D:\Program Files\Microsoft SQL Server\100\SDK\Assemblies")
        import clr
        clr.AddReferenceToFile('Microsoft.SqlServer.Smo.dll')
        clr.AddReferenceToFile('Microsoft.SqlServer.SqlEnum.dll')
        clr.AddReferenceToFile('Microsoft.SqlServer.ConnectionInfo.dll')

    SQL Alchemy imports OK in IronPython, but I receive this error message when trying to connect to SQL Server:

        IronPython 2.6.1 (2.6.10920.0) on .NET 2.0.50727.3607
        Type "help", "copyright", "credits" or "license" for more information.
        >>> import sqlalchemy
        >>> e = sqlalchemy.MetaData("mssql://")
        Traceback (most recent call last):
          File "<stdin>", line 1, in <module>
          File "D:\Python26\Lib\site-packages\sqlalchemy\schema.py", line 1780, in __init__
          File "D:\Python26\Lib\site-packages\sqlalchemy\schema.py", line 1828, in _bind_to
          File "D:\Python26\Lib\site-packages\sqlalchemy\engine\__init__.py", line 241, in create_engine
          File "D:\Python26\Lib\site-packages\sqlalchemy\engine\strategies.py", line 60, in create
          File "D:\Python26\Lib\site-packages\sqlalchemy\connectors\pyodbc.py", line 29, in dbapi
        ImportError: No module named pyodbc

    This code works just fine in CPython, but it looks like the pyodbc module isn't accessible from IronPython. Any suggestions? I realize that this may not be the best way to approach the problem, so I'm open to tackling this a different way. Just wanted to get some experience with using SQL Alchemy and pyodbc.

    Read the article

  • How do you increase the number of processes in parallel with Powershell 3?

    - by Mark Shay
    I am trying to run 20 processes in parallel. I changed the session configuration as below, but am having no luck. I am getting only up to 5 parallel processes per session.

        $wo = New-PSWorkflowExecutionOption -MaxSessionsPerWorkflow 50 -MaxDisconnectedSessions 200 -MaxSessionsPerRemoteNode 50 -MaxActivityProcesses 50
        Register-PSSessionConfiguration -Name ITWorkflows -SessionTypeOption $wo -Force
        Get-PSSessionConfiguration ITWorkflows | Format-List -Property *

    Is there a switch parameter to increase the number of processes? This is what I am running:

        Workflow MyWorkflow1 {
            Parallel {
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 2 and 2975416"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 2975417 and 5950831"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 5950832 and 8926246"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 8926247 and 11901661"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 11901662 and 14877076"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 14877077 and 17852491"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 17852492 and 20827906"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 20827907 and 23803321"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 23803322 and 26778736"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 26778737 and 29754151"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 29754152 and 32729566"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 32729567 and 35704981"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 35704982 and 38680396"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 38680397 and 432472144"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 432472145 and 435447559"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 435447560 and 438422974"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 864944289 and 867919703"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 867919704 and 870895118"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 870895119 and 1291465602"}
                InlineScript { import-module \\PS_Scripts\bulkins.ps1; BulkIns "where OrderId between 1291465603 and 1717986945"}
            }
        }
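    This is not how the workflow engine schedules activities, but one hedged alternative for genuinely independent work like this is plain background jobs, each of which spawns its own powershell.exe. The sketch below reuses the module path and BulkIns call from the question; the $ranges variable and the truncated list are my own illustration:

        $ranges = @("where OrderId between 2 and 2975416",
                    "where OrderId between 2975417 and 5950831")   # ...and so on for the remaining ranges
        $jobs = foreach ($r in $ranges) {
            Start-Job -ScriptBlock {
                param($filter)
                Import-Module \\PS_Scripts\bulkins.ps1
                BulkIns $filter
            } -ArgumentList $r
        }
        $jobs | Wait-Job | Receive-Job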

    Read the article

  • dos batch assign returned values from a command into a variable (from powershell)

    - by nokheat
    I am referring to this question: ASSIGN win XP dos commandline output to variable http://stackoverflow.com/questions/537404/assign-win-xp-dos-commandline-output-to-variable
    I am trying to use it on a PowerShell code segment, so I typed

        powershell date (get-date).AddDays(-1) -format yyyyMMdd

    and confirmed it returns something like 20100601. But then when I tried

        for /f "tokens=*" %a in ('powershell date get-date -format yyyyMMdd ') do set var=%a

    it failed to work as expected. How can I transfer the date to a variable?

    Read the article

  • MSBuild / PowerShell: Copy SQL Server 2012 database to SQL Azure via BACPAC (for Continuous Integration)

    - by giveme5minutes
    I'm creating a continuous integration MSBuild script which copies a database in on-premise SQL Server 2012 to SQL Azure. Easy, right?
    Methods
    After a fair bit of research I've come across the following methods:
    - Use PowerShell to access the DAC library directly, then use the MSBuild PowerShell extension to wrap the script. This would require installing PowerShell 3 and working out how to make the MSBuild PowerShell extension work with it, as apparently MS moved the DAC API to a different namespace in the latest version of the library. PowerShell would give direct access to the API, but may require quite a bit of boilerplate.
    - Use the sample DAC Framework Client Side Tools, which requires compiling them myself, as the downloads available from Codeplex only include the Hosted version. It would also require fixing them to use DAC 3.0 classes, as they appear to currently use an earlier version of DAC. I could then call these tools from an <Exec Command="" /> in the MSBuild script. Less boilerplate, and if I hit any bumps in the road I can just make changes to the source.
    Processes
    Using whichever method, the process could be either:
    - Export from on-premise SQL Server 2012 to local BACPAC
    - Upload BACPAC to blob storage
    - Import BACPAC to SQL Azure via Hosted DAC
    Or:
    - Export from on-premise SQL Server 2012 to local BACPAC
    - Import BACPAC to SQL Azure via Client DAC
    Question
    All of the above seems to be quite a lot of effort for something that seems to be a standard feature... so before I start reinventing the wheel and documenting the results for all to see, is there something really obvious that I've missed here? Is there a pre-written script that MS has released that I have not yet uncovered? There's a command in the GUI of SQL Server Management Studio 2012 that does EXACTLY what I'm trying to do (right click on the local database, click "Tasks", click "Deploy Database to SQL Azure"). Surely if it's a few clicks in the GUI it must be a single command on the command line somewhere??
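    For what it's worth, the export step of either process can be scripted against the DAC library directly. This is a hedged sketch only; the assembly path, connection string, database name and output file are all assumptions for illustration:

        # Load the DAC library (SQL Server 2012 default install path assumed)
        Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

        # Export the on-premise database to a local BACPAC
        $svc = New-Object Microsoft.SqlServer.Dac.DacServices "Server=localhost;Integrated Security=True"
        $svc.ExportBacpac("C:\temp\MyDb.bacpac", "MyDb")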

    Read the article

  • How do I write a powershell script that gets the file with the most recent last write time from a folder?

    - by Shoko
    The subject line says it all. I'd also like to do this using pipes. I figured that I could use Get-ChildItem, Measure-Object and Where-Object, but Measure-Object doesn't like dates. Should I have a script block which loops through each item returned from Get-ChildItem and does a comparison to see if it's the most recent? I thought that there should be a handy PS cmdlet for that.
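    A hedged one-liner along the lines the question suggests (the folder path is an assumption) uses Sort-Object rather than Measure-Object:

        Get-ChildItem C:\some\folder |
            Where-Object { -not $_.PSIsContainer } |      # skip subfolders
            Sort-Object LastWriteTime -Descending |
            Select-Object -First 1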

    Read the article

  • Any SMO Library for .NET?

    - by gtas
    Does anyone know of a library for SQL Server database backup and restore for .NET? This is needed to avoid writing a new one. There are so many people out there sharing their libraries, but I found it strange that I couldn't turn up anything related by googling.
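    SMO itself already covers backup and restore, and the same Microsoft.SqlServer.Management.Smo classes are usable from .NET code. A minimal hedged sketch in PowerShell, with server, database and file names assumed:

        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
        [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null

        $server = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost"
        $backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
        $backup.Action   = "Database"                    # full database backup
        $backup.Database = "MyDb"
        $backup.Devices.AddDevice("C:\backups\MyDb.bak", "File")
        $backup.SqlBackup($server)                       # restore works the same way via the Restore class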

    Read the article

  • Any SMO C# Library?

    - by gtas
    Does anyone know of a library for SQL Server database backup and restore for C#? This is needed to avoid writing a new one. There are so many people out there sharing their libraries, but I found it strange that I couldn't turn up anything related by googling. Thank you.

    Read the article

  • Any way to specify a non-positional parameter in a powershell script?

    - by Julian Birch
    I've got the following at the start of a script:

        Param(
            [string]$command,
            [string]$version = "1.1.0"
        )

    This is fine, only I need $version to not be a positional parameter, so that if you type

        .\script.ps1 run argument

    then $args should contain argument and $version should be 1.1.0. Is this even possible? I know I can do it with a C# cmdlet, but it would be massively more convenient if I could deliver this as a single script.
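    One hedged sketch of how this can be approached in PowerShell 3.0 or later: PositionalBinding = $false makes every parameter named-only unless it is given an explicit Position, and a remaining-arguments parameter catches anything else. The $rest name and the output line are my own illustration:

        [CmdletBinding(PositionalBinding = $false)]
        Param(
            [Parameter(Position = 0)]
            [string]$command,

            [string]$version = "1.1.0",

            [Parameter(ValueFromRemainingArguments = $true)]
            [string[]]$rest
        )
        "command=$command version=$version rest=$rest"

    Calling .\script.ps1 run argument then binds run to $command, leaves $version at 1.1.0, and puts argument into $rest.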

    Read the article

  • get-wmiobject sql join in powershell - trying to find physical memory vs. virtual memory of remote servers

    - by Willy
        get-wmiobject -query "Select TotalPhysicalMemory from Win32_LogicalMemoryConfiguration" -computer COMPUTERNAME > output.csv
        get-wmiobject -query "Select TotalPageFileSpace from Win32_LogicalMemoryConfiguration" -computer COMPUTERNAME > output.csv

    I am trying to complete this script with an output such as:

        Computer    Physical Memory    Virtual Memory
        server1     4096mb             8000mb
        server2     2048mb             4000mb
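    A hedged sketch of one way to get there with a single query and calculated properties; the computer names are placeholders, and note that Win32_LogicalMemoryConfiguration reports kilobytes (and is deprecated on newer Windows versions), so the values are converted to MB here:

        Get-WmiObject -Query "Select TotalPhysicalMemory, TotalPageFileSpace from Win32_LogicalMemoryConfiguration" -ComputerName server1, server2 |
            Select-Object @{n='Computer';        e={$_.__SERVER}},
                          @{n='Physical Memory'; e={"$([int]($_.TotalPhysicalMemory / 1024))mb"}},
                          @{n='Virtual Memory';  e={"$([int]($_.TotalPageFileSpace / 1024))mb"}} |
            Export-Csv output.csv -NoTypeInformation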

    Read the article
