Search Results

Search found 1826 results on 74 pages for 'powershell smo'.


  • Exchange Online - granting admin rights on a multi-domain account

    - by user1571299
    I'm the admin of a handful of domains on Office 365. The thing is, some of my clients would like to manage their mailboxes themselves. So I started looking into it, and the closest I got was this page: http://community.office365.com/en-us/forums/158/p/20912/98083.aspx I created a role group with a write scope according to that post. I also assigned the Reset Password, Recipient Creation, Mail Recipients and Distribution Groups roles. But unfortunately that just doesn't work: the user in question is still unable to manage anything. Any suggestions?
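
    In case it helps to compare against a known-good baseline, here is a minimal sketch of the remote PowerShell setup for a scoped role group. The scope filter, group name and member address are hypothetical placeholders; the role names are the standard Exchange Online ones:

        New-ManagementScope -Name "Client1 Scope" `
            -RecipientRestrictionFilter "CustomAttribute1 -eq 'client1'"
        New-RoleGroup -Name "Client1 Admins" `
            -Roles "Reset Password", "Mail Recipient Creation", "Mail Recipients", "Distribution Groups" `
            -CustomRecipientWriteScope "Client1 Scope" `
            -Members "admin@client1.com"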

    Read the article

  • Converting Visio (.vsd) files to pdf automatically [migrated]

    - by Aseques
    I am trying to create a scheduled task to convert all my .vsd files to PDF so all of our devices can read them (Linux, Mac, smartphones, etc.), and I would prefer not paying for something that can be done with Visio + PDFCreator. The OpenOffice approach doesn't work with .vsd files since it's not a supported format (Method/tools for batch-converting Microsoft Word files into PDF?). What I currently have is this:

        'C:\Program Files\Microsoft Office\Visio11\VISIO.EXE' /pt "Z:\Archive\Files.vsd",-PPDFCREATORPRINTER /nologo

    That is able to open the document automatically and prepare it for printing; the only missing part is that it requires me to confirm the printing dialog. There's some information here: http://support.microsoft.com/kb/314392 but it doesn't explain non-interactive printing.
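
    If upgrading is an option, Visio 2007 and later can export PDF directly over COM, which avoids the print dialog entirely. A sketch under that assumption (ExportAsFixedFormat does not exist in the Visio 2003 shown in the path above):

        $visio = New-Object -ComObject Visio.Application
        $visio.Visible = $false
        Get-ChildItem "Z:\Archive" -Filter *.vsd | ForEach-Object {
            $doc = $visio.Documents.Open($_.FullName)
            $pdf = [System.IO.Path]::ChangeExtension($_.FullName, "pdf")
            # 1 = visFixedFormatPDF, 1 = visDocExIntentPrint, 0 = visPrintAll
            $doc.ExportAsFixedFormat(1, $pdf, 1, 0)
            $doc.Close()
        }
        $visio.Quit()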

    Read the article

  • Comparing two specific properties of a CSV using Compare-Object isn't giving the expected results

    - by MDMarra
    I have a list of users from two separate domains. These lists are in CSV format, and I only care about the SAMAccountName field. The code that I'm working with is currently:

        $domain1 = Import-Csv C:\Scripts\Temp\domain1.xxx.org.csv | Select-Object SAMAccountName
        $domain2 = Import-Csv C:\Scripts\Temp\domain2.xxx.org.csv | Select-Object SAMAccountName
        Compare-Object ($domain1) ($domain2)

    This is returning only a handful of results (which aren't accurate), in this format:

        @{samaccountname=SomeUser}

    Obviously, Compare-Object isn't evaluating the objects as strings. How do I make this work?
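
    Select-Object leaves each row as an object with a SAMAccountName property, and Compare-Object has no way to know which property matters. Two likely fixes, sketched against the same files:

        # Either tell Compare-Object which property to compare...
        Compare-Object $domain1 $domain2 -Property SAMAccountName

        # ...or unwrap the values into plain strings up front
        $domain1 = Import-Csv C:\Scripts\Temp\domain1.xxx.org.csv |
            Select-Object -ExpandProperty SAMAccountName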

    Read the article

  • Client-side script to upload attachments to a SharePoint 2007 list

    - by Clone of Anton Makrushin
    Hello. I have no good script-writing experience. I have a list created on MOSS 2007, with about 1000 items and attachments enabled, and I need to attach a file (*.jpg) from a local folder to each list item. I don't have administrator privileges on the MOSS server, only contributor rights. Here is my script:

        $web = New-Object System.Net.WebClient
        $web.Credentials = [System.Net.CredentialCache]::DefaultCredentials
        $web.Headers.Add("user-agent", "PowerShell Script")
        $web.UploadFile('http://ruglbsrvsps/IT/Lists/Test1/', 'C:\temp\Attachments\14\Img1.jpg')

    Test1 is the target list; Item1, Item2 and Item3 are list items, without attachments, created manually. When I run the script, it returns a byte array and does not upload the file to the list item. Can you fix my script or advise a better solution for my task (attach a bulk of files to MOSS list items, with only contributor rights on the target SharePoint 2007 list)? Thank you.
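
    WebClient.UploadFile just POSTs the raw file at a URL, which isn't how SharePoint accepts attachments. Something along these lines, using the Lists.asmx web service, should stay within contributor rights (a sketch: the list name, item ID and file path are taken from the question; New-WebServiceProxy assumes PowerShell 2.0):

        $svc = New-WebServiceProxy -Uri 'http://ruglbsrvsps/IT/_vti_bin/Lists.asmx' -UseDefaultCredential
        $bytes = [System.IO.File]::ReadAllBytes('C:\temp\Attachments\14\Img1.jpg')
        # AddAttachment(listName, listItemID, fileName, attachment bytes)
        $svc.AddAttachment('Test1', '1', 'Img1.jpg', $bytes)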

    Read the article

  • Get-WMIObject fails when run AsJob

    - by codepoke
    It's a simple question, but it's stumped me.

        $cred = Get-Credential
        $jobs = @()
        $jobs += Get-WmiObject `
            -Authentication 6 `
            -ComputerName 'serverName' `
            -Query 'Select * From IISWebServerSetting' `
            -Namespace 'root/microsoftiisv2' `
            -EnableAllPrivileges `
            -Credential $cred `
            -Impersonation 4 `
            -AsJob
        $joblist = Wait-Job -Job $jobs -Timeout 60
        foreach ($job in $jobs) {
            if ($job.State -eq "Completed") {
                $app = Receive-Job -Job $job
                $app
            } else {
                ("Job not completed: " + $job.Name + "@" + $job.State + ". Reason:" + $job.ChildJobs[0].JobStateInfo.Reason)
                Remove-Job -Job $job -Force
            }
        }

    The query succeeds when run directly and fails when run -AsJob:

        Reason: System.UnauthorizedAccessException: Access is denied.

    I've jiggered with -Impersonation, -Credentials, -Authority, and -EnableAllPrivileges to no useful effect. It appears I'm overlooking something fundamental. Why is my PowerShell prompt allowed to connect to the remote server, but my child process denied?
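
    One workaround sometimes suggested for job-context credential problems is to wrap the whole call in Start-Job, so the supplied credential applies to the job's own process rather than to the WMI background job (a sketch, not a confirmed fix for this particular error):

        $job = Start-Job -Credential $cred -ScriptBlock {
            Get-WmiObject -Authentication PacketPrivacy -Impersonation Impersonate `
                -ComputerName 'serverName' -Namespace 'root/microsoftiisv2' `
                -Query 'Select * From IISWebServerSetting'
        }
        Wait-Job $job -Timeout 60 | Receive-Job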

    Read the article

  • How to set permissions on MSMQ Cluster queues?

    - by JorgeSandoval
    I've got a cluster with functioning private MSMQ 3.0 queues. I'm trying to programmatically set the permissions, but can't seem to connect to the queues via System.Messaging. The code below works just fine with local queues (using the .\ nomenclature for the local queue). How do I programmatically set the permissions on the clustered queues? PowerShell code executed from the active node:

        function set-msmqpermission ([string] $queuepath, [string] $account, [string] $accessright) {
            if (!([System.Messaging.MessageQueue]::Exists($queuepath))) {
                throw "$queuepath could not be found."
            }
            $q = New-Object System.Messaging.MessageQueue($queuepath)
            $q.SetPermissions($account,
                [System.Messaging.MessageQueueAccessRights]::$accessright,
                [System.Messaging.AccessControlEntryType]::Set)
        }
        set-msmqpermission "clusternetworkname\private$\qa1ack" "UserAccount" "FullControl"

    This fails with:

        Exception calling "SetPermissions" with "3" argument(s): "Invalid queue path name."
        At line:30 char:19
        + $q.SetPermissions <<<< ($account,[System.Messaging.MessageQueueAccessRights]::$accessright,
            + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
            + FullyQualifiedErrorId : DotNetMethodException
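
    System.Messaging resolves queue paths against the local computer name, which on a cluster node is the node name rather than the network name in the path. A commonly cited workaround (an assumption here, not verified against MSMQ 3.0 specifically) is to point the process at the clustered MSMQ resource first via the _CLUSTER_NETWORK_NAME_ environment variable:

        $env:_CLUSTER_NETWORK_NAME_ = 'clusternetworkname'
        set-msmqpermission "clusternetworkname\private$\qa1ack" "UserAccount" "FullControl"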

    Read the article

  • UI-Automation cmdlet not finding the control

    - by sc_ray
    Hi, I am trying to test a WPF application using the UI Automation framework that MSFT provides. A few PowerShell scripts were written to invoke the cmdlets we created for manipulating the application's visual controls. There is a dropdown within the application that has an entry 'DropDownEntry'. In my cmdlet, I am trying to do something as follows:

        AutomationElement getItem = DropDown.FindFirst(TreeScope.Descendants,
            new AndCondition(
                new PropertyCondition(AutomationElement.ControlTypeProperty, ControlType.ListItem),
                new PropertyCondition(AutomationElement.NameProperty, "DropDownEntry", PropertyConditionFlags.IgnoreCase)));

    This snippet returns null upon execution, which essentially means it was unable to find my dropdown entry. I checked the name of my control and the values, and everything seems to be in order, so I'm not sure why this is happening. Any help would be much appreciated. Thanks
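
    One possibility worth ruling out: collapsed combo-box items often don't exist in the UI Automation tree until the dropdown has been expanded, in which case FindFirst legitimately returns null. A sketch in the same style as the snippet above, assuming the control supports ExpandCollapsePattern:

        var expand = (ExpandCollapsePattern)DropDown.GetCurrentPattern(ExpandCollapsePattern.Pattern);
        expand.Expand();
        // ...then retry the FindFirst call above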

    Read the article

  • How can I copy files with timestamps between 2 times and preserve the directory structure

    - by brushwood
    I want to copy files that have timestamps between the time the script begins to run and an hour previous, so I am basically trying to emulate robocopy, but with minage and maxage going down to the exact time rather than days. So far I have this in PowerShell:

        $now = Get-Date
        $previousHour = $now.AddHours(-1)
        "Copying files from $previousHour to $now"
        function DoCopy ($source, $destination) {
            $fileList = gci $source -Recurse
            foreach ($file in $fileList) {
                if ($file.LastWriteTime -lt $now -and $file.LastWriteTime -gt $previousHour) {
                    # Do the copy
                }
            }
        }
        DoCopy "C:\test" "C:\test2"

    But if I do the copy like that, it copies all the files directly into the destination folder rather than into the subfolders.
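
    The destination path can be rebuilt from each file's path relative to the source root, which keeps the tree intact. A sketch filling in the "# Do the copy" part, under the same assumptions as the question's DoCopy:

        function DoCopy ($source, $destination) {
            $sourceRoot = (Resolve-Path $source).Path
            Get-ChildItem $source -Recurse | Where-Object { !$_.PSIsContainer } | ForEach-Object {
                if ($_.LastWriteTime -lt $now -and $_.LastWriteTime -gt $previousHour) {
                    # rebuild the file's relative path under the destination
                    $target = Join-Path $destination $_.FullName.Substring($sourceRoot.Length)
                    $targetDir = Split-Path $target -Parent
                    if (!(Test-Path $targetDir)) { New-Item $targetDir -ItemType Directory | Out-Null }
                    Copy-Item $_.FullName $target
                }
            }
        }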

    Read the article

  • IIS 6.0: what is the WMI equivalent of "Convert Folder To Application"?

    - by Richard Berg
    Basically, imagine you need to automate these steps:

        1. Open Internet Information Services.
        2. Find the web site, find the folder, right-click the folder and select Properties.
        3. Click the Directory tab.
        4. Under Application settings, click the Create button. The application is now created.
        5. Click OK to finish.

    My end goal is PowerShell, but answers that use JScript or VB.NET or whatever are fine too. Bonus points if the same code works against an IIS7 application running in 6.0 Compatibility Mode.
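
    The UI's Create button corresponds to the AppCreate/AppCreate2 method on the folder's object in the IIS WMI provider. A sketch of what that might look like (the site ID and folder name here are placeholders):

        $dir = Get-WmiObject -Namespace 'root/MicrosoftIISv2' -Class IIsWebDirectory `
            -Filter "Name = 'W3SVC/1/ROOT/myfolder'"
        # 2 = pooled out-of-process, the usual IIS 6 application mode
        $dir.AppCreate2(2)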

    Read the article

  • How do I recycle an IIS App pool with Powershell?

    - by Ralph Willgoss
    Reference implementation of a PowerShell script to recycle app pools, in response to Rick's post: http://www.west-wind.com/weblog/posts/2012/Oct/02/A-tiny-Utility-to-recycle-an-IIS-Application-Pool

        #    File: RecycleAppPool.ps1
        #    Author: Ralph Willgoss
        #    Date: 2nd Oct 2012
        #    Reference:
        #    http://stackoverflow.com/questions/198623/how-do-i-recycle-an-iis-apppool-with-powershell
        #
        #    Alternative is to create a Process and run the inbuilt vbs:
        #    C:\WINDOWS\system32\iisapp.vbs => "IIsApp /a DefaultAppPool /r"
        #
        #    Windows 2003 & IIS6: C:\WINDOWS\system32>cscript.exe iisapp.vbs /a StaticDataAppPool /r
        #    Windows 2008 IIS7: [tbd]
        # =============================================================================
        #    Initialise
        # =============================================================================
        param ( )
        # =============================================================================
        #    Main
        # =============================================================================
        Write-Output ""
        Write-Output "Starting Recycling App Pool"
        Write-Output ""

        $appPoolName = "StaticDataAppPool" # $args[0]
        $appPool = Get-WmiObject -Namespace "root\MicrosoftIISv2" -Class "IIsApplicationPool" |
            Where-Object { $_.Name -eq "W3SVC/APPPOOLS/$appPoolName" }
        $appPool.Recycle()

        Write-Output ""
        Write-Output "Finished Recycling App Pool"
        Write-Output ""
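
    For the IIS7 case the header leaves as [tbd], the WebAdministration module that ships with Windows 2008 R2 provides a one-liner (an aside, not part of the original script):

        Import-Module WebAdministration
        Restart-WebAppPool "StaticDataAppPool"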

    Read the article

  • Exporting info from PS script to csv

    - by George
    Hi, this is a PowerShell/AD/Exchange question. I'm running a script against a number of users to check some of their attributes, but I'm having trouble getting the results output to CSV. The script runs well and does exactly what I need it to do, and the on-screen output is fine; I'm just having trouble getting it to export directly to CSV. The input is a comma-separated txt file of usernames (e.g. "username1,username2,username3"). I've experimented with creating custom PS objects, adding to them and then exporting those, but it's not working. Any suggestions gratefully received. Thanks, George

        $array = Get-Content $InputPath
        # split the comma-delimited string into an array
        $arrayb = $array.Split(",")
        foreach ($User in $arrayb) {
            # find group membership
            Write-Host "AD group membership for $User"
            Get-QADMemberOf $User
            # get mailbox info
            Write-Host "Mailbox info for $User"
            Get-Mailbox $User | select ServerName, Database, EmailAddresses, PrimarySmtpAddress, WindowsEmailAddress
            # get profile details
            Write-Host "Home drive info for $User"
            Get-QADUser $User | select HomeDirectory, HomeDrive
            # add space between users
            Write-Host ""
            Write-Host "******************************************************"
        }
        Write-Host "End Script"
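
    The usual catch is that Write-Host text never goes down the pipeline, so there is nothing for Export-Csv to pick up; the loop has to emit one object per user instead. A sketch along those lines (the property names are illustrative):

        $results = foreach ($User in $arrayb) {
            $mbx = Get-Mailbox $User
            $qad = Get-QADUser $User
            New-Object PSObject -Property @{
                User          = $User
                Database      = $mbx.Database
                PrimarySmtp   = $mbx.PrimarySmtpAddress
                HomeDirectory = $qad.HomeDirectory
                # multi-valued attributes need flattening for CSV
                Groups        = (Get-QADMemberOf $User | ForEach-Object { $_.Name }) -join ';'
            }
        }
        $results | Export-Csv C:\temp\UserInfo.csv -NoTypeInformation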

    Read the article

  • run windows command from bash with output to standard out?

    - by Wayne
    Folks, I'm using git tools such as git bisect run, which need to call a command to build and test my project. My command to do this is nant, which is a Windows program, or a build.cmd script which calls nant. It's easy to get bash to call the nant build. The hard part is getting the standard output written to a file. I even installed Windows PowerShell to try running the command from bash. Again, it works, but the standard output file says "permission denied" when I try to read it while the build is going on.
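
    If the aim is just a readable log, plain redirection on the bash side may be enough; tee writes the file as the build runs while keeping the output visible (a sketch, assuming build.cmd is on the PATH):

        cmd //c build.cmd 2>&1 | tee build.log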

    Read the article

  • Programmatically access document properties

    - by ngm
    Is there a way in which I can programmatically access the document properties of a Word 2007 document? I am open to using any language for this, but ideally it would be via a PowerShell script. My overall aim is to traverse the documents somewhere on a filesystem, parse some document properties from these documents, and then collate all of those properties back together into a new Word document. (I essentially want to automatically create a document which is a list of all documents beneath a certain folder of the filesystem; this list would contain things such as the Title, Abstract and Author document properties, the CreateDate field, etc., for each document.)
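
    For the .docx format specifically, the core properties live inside the package and can be read without automating Word, via System.IO.Packaging from WindowsBase. A sketch (the folder is a placeholder, and custom properties such as an Abstract would need a different route):

        Add-Type -AssemblyName WindowsBase
        Get-ChildItem C:\docs -Recurse -Filter *.docx | ForEach-Object {
            $pkg = [System.IO.Packaging.Package]::Open($_.FullName, 'Open', 'Read')
            # core properties include Title, Creator, Subject, Created, ...
            New-Object PSObject -Property @{
                File    = $_.FullName
                Title   = $pkg.PackageProperties.Title
                Author  = $pkg.PackageProperties.Creator
                Created = $pkg.PackageProperties.Created
            }
            $pkg.Close()
        }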

    Read the article

  • Which tool / technology: System management for databases and dependent services

    - by Filburt
    A follow-up on this system management question: since I probably will not get much feedback on Server Fault, I'll give it a try here. My main concern is to reflect the dependencies between the databases, services and tasks/jobs I'll have to manage. Besides considering PowerShell, I even thought about using MSBuild, because it would allow for modeling dependencies and reusing configuration targets. In other words: what technology should I use to develop a flexible solution that will allow me to stop services A, B and C on machine D in the right order, and disable task E on machine F, when taking down database X?

    Read the article

  • Can anyone explain why the 1st example gets different results than the following 2

    - by klumsy
        $b = (2,3)

        $myarray1 = @(,$b,$b)
        $myarray1[0].length   # this will be 1
        $myarray1[1].length

        $myarray2 = @( ,$b
                       ,$b )
        $myarray2[0].length   # this will be 2
        $myarray2[1].length

        $myarray3 = @(,$b
                      ,$b )
        $myarray3[0].length   # this will be 2
        $myarray3[1].length

    UPDATE: I think on #powershell IRC we have worked it out. Here is another example that demonstrates the danger of breaking with the comma on the following line, rather than the top line, when listing multiple items in an array over multiple lines.

        $b = (1..20)
        $a = @( $b, $b ,$b,
                $b, $b ,$b)
        for($i=0;$i -lt $a.length;$i++) { $a[$i].length }
        "--------"
        $a = @( $b, $b ,$b
               ,$b, $b ,$b)
        for($i=0;$i -lt $a.length;$i++) { $a[$i].length }

    produces

        20 20 20 20 20 20
        --------
        20 20 20 1 20 20

    I'm curious how people will explain this. I think I understand it now, but would have trouble explaining it in a concise, understandable fashion, though the above example goes somewhat towards that goal.
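
    The kernel of the explanation (as worked out above) is the unary comma operator: a new line beginning with ,$b is a separate statement whose value is a one-element array wrapping $b, whereas a trailing comma continues the previous statement. A minimal demonstration:

        $b = 1..20
        $b.Length      # 20
        (,$b).Length   # 1 - unary comma wraps $b in a one-element array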

    Read the article

  • Path with no slash after drive letter and colon - what does it point to?

    - by ya23
    I have mistyped a path and instead of c:\foo.txt wrote c:foo.txt. I expected it to either fail or to resolve to c:\foo.txt, but instead it seems to be resolved to foo.txt in the current user's home folder. PowerShell returns:

        PS C:\> [System.IO.Path]::GetFullPath("c:\foo.txt")
        c:\foo.txt
        PS C:\> [System.IO.Path]::GetFullPath("c:foo.txt")
        C:\Users\Administrator\foo.txt
        PS C:\> [System.IO.Path]::GetFullPath("g:foo.txt")
        G:\foo.txt

    Running explorer.exe from the command line and passing it any of the above results in C:\Users\Administrator\Documents being opened. I haven't found any documentation of this and I'm utterly confused; please explain the behaviour.
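
    What's going on, as far as standard Windows behaviour explains it: a drive letter with no slash is a drive-relative path, and Windows keeps a current directory per drive, so c:foo.txt means foo.txt relative to the current directory on C: (which in the examples above happens to be the Administrator profile). A quick way to see it from .NET's side:

        [Environment]::CurrentDirectory = 'C:\Windows'
        [System.IO.Path]::GetFullPath('c:foo.txt')   # C:\Windows\foo.txt
        [System.IO.Path]::GetFullPath('g:foo.txt')   # G:\foo.txt - no current dir recorded for G:, so the root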

    Read the article

  • Using HttpClient with the RightScale API

    - by Ameer Deen
    I'm trying to use the WCF REST Starter Kit with RightScale's login API, which seems fairly simple to use. Edit: here's a blog entry I wrote on using PowerShell to consume the API. Edit: created a generic .NET wrapper for the RightScale API - NRightAPI. It's exactly as simple as it looks while using curl. In order to obtain a login cookie, all I need to do is:

        curl -v -c rightcookie -u username:password "https://my.rightscale.com/api/acct/accountid/login?api_version=1.0"

    and I receive the following cookie:

        HTTP/1.1 204 No Content
        Date: Fri, 25 Dec 2009 12:29:24 GMT
        Server: Mongrel 1.1.3
        Status: 204 No Content
        X-Runtime: 0.06121
        Content-Type: text/html; charset=utf-8
        Content-Length: 0
        Cache-Control: no-cache
        Added cookie _session_id="488a8d9493579b9473fbcfb94b3a7b8e5e3" for domain my.rightscale.com, path /, expire 0
        Set-Cookie: _session_id=488a8d9493579b9473fbcfb94b3a7b8e5e3; path=/; secure
        Vary: Accept-Encoding

    However, when I use the following C# code:

        HttpClient http = new HttpClient("https://my.rightscale.com/api/accountid/login?api_version=1.0");
        http.TransportSettings.UseDefaultCredentials = false;
        http.TransportSettings.MaximumAutomaticRedirections = 0;
        http.TransportSettings.Credentials = new NetworkCredential("username", "password");
        Console.WriteLine(http.Get().Content.ReadAsString());

    instead of an HTTP 204, I get a redirect:

        You are being <a href="https://my.rightscale.com/dashboard">redirected</a>

    How do I get the WCF REST Starter Kit working with the RightScale API?
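
    One plausible difference: curl -u sends the Authorization header preemptively, while NetworkCredential normally waits for a 401 challenge that this endpoint may never issue, so the request arrives anonymous and gets bounced to the dashboard. A PowerShell sketch of the preemptive-auth workaround, mirroring the curl call (untested against the live API):

        $wc = New-Object System.Net.WebClient
        $auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("username:password"))
        $wc.Headers.Add("Authorization", "Basic $auth")
        $wc.DownloadString("https://my.rightscale.com/api/acct/accountid/login?api_version=1.0")
        $wc.ResponseHeaders["Set-Cookie"]   # should hold the _session_id cookie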

    Read the article

  • In WMI, can I use a join (or something similar) to acquire the IisWebServer object for a site, given

    - by Precipitous
    Given a server name and a physical path, I'd like to be able to hunt down the IISWebServer object and ApplicationPool. A website URL is also acceptable input. Our technologies are IIS 6, WMI, and access via C# or PowerShell 2. I'm certain this would be easier with IIS 7 and its managed API; we don't have that yet. Here's what I can do: get a list of IIS virtual directories from IISWebVirtualDirSetting and filter (offline) for the matching physical path.

        $theVirtualDir = gwmi -Namespace "root/MicrosoftIISv2" `
            -ComputerName $servername -Authentication PacketPrivacy `
            -Class "IISWebVirtualDirSetting" `
            | Where-Object {$_.Path -like $deployLocation}

    From the virtual directory object, I can get a name (like W3SVC/40565456/root). Given this name, I can get to other goodies, such as the IIS web server object.

        gwmi -Namespace "root/MicrosoftIISv2" `
            -ComputerName $servername `
            -Authentication PacketPrivacy `
            -Query "SELECT * FROM IisWebServer WHERE Name='W3SVC/40589473'"

    The questions, restated: 1) This is a query language. Can I join or subquery so that one WMI query statement gets web servers based on IISWebVirtualDir.Path? How? 2) In solving 1, you'll have to explain how to query on the Path property. Why is this an invalid query?

        "SELECT * FROM IISWebVirtualDirSetting WHERE Path='D:\sites\globaldominator'"
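
    On question 2 at least, WQL uses the backslash as an escape character, so literal backslashes in a string comparison must be doubled (this much is standard WQL; whether a true join is possible is another matter, ASSOCIATORS OF queries being WQL's usual substitute for joins):

        gwmi -Namespace "root/MicrosoftIISv2" -ComputerName $servername `
            -Authentication PacketPrivacy `
            -Query "SELECT * FROM IISWebVirtualDirSetting WHERE Path = 'D:\\sites\\globaldominator'"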

    Read the article

  • Can Win32_NetworkAdapterConfiguration.EnableStatic() be used to set more than one IP address?

    - by Andrew J. Brehm
    I ran into this problem in a Visual Basic program that uses WMI, but could confirm it in PowerShell. Apparently the EnableStatic() method can only be used to set one IP address, despite taking two parameters, IP address(es) and subnet mask(s), that are arrays. I.e.:

        $a = get-wmiobject win32_networkadapterconfiguration -computername myserver

    This gets me an array of all network adapters on "myserver". After selecting a specific one ($a = $a[14] in this case), I can run $a.EnableStatic(), which has this signature:

        System.Management.ManagementBaseObject EnableStatic(System.String[] IPAddress, System.String[] SubnetMask)

    I thought this implied that I could set several IP addresses like this:

        $ips = "192.168.1.42","192.168.1.43"
        $a.EnableStatic($ips, "255.255.255.0")

    But this call fails. However, this call works:

        $a.EnableStatic($ips[0], "255.255.255.0")

    It looks to me as if EnableStatic() really takes two strings rather than two arrays of strings as parameters. In Visual Basic it's more complicated and arrays must be passed, but the method appears to take into account only the first element of each array. Am I confused again, or is there some logic here?
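
    One thing worth trying before concluding the arrays are ignored: the two arrays may need the same number of elements, one subnet mask per IP address (an assumption based on how the parameters pair up, not something confirmed for every adapter or driver):

        $ips   = "192.168.1.42", "192.168.1.43"
        $masks = "255.255.255.0", "255.255.255.0"   # one mask per address
        $a.EnableStatic($ips, $masks)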

    Read the article

  • In need of a Smarter Environmental Package Configuration

    - by Jeremy Liberman
    I am trying to set up a package template in SSIS, following the Wrox Programmer to Programmer book SQL Server 2008 Integration Services: Problem - Design - Solution. I'm really liking this book, even though it covers 2008 and we're using SQL Server 2005. I've got a working package template that uses an indirect XML package configuration to identify what environment (local developer, dev, QA, production, etc.) the package is being run in. That locates the SQL Server package configuration for the environment. That set-up is great and all, except for the environment variable at the very front of it all. My team would prefer it if the package could use the same environment resource locator as all our other applications and tools, so we don't have two environment markers with essentially the same information in them. Normally we look up a registry key in HKEY_LOCAL_MACHINE, but the Registry package configuration type only lets you look under HKEY_CURRENT_USER. My first thought was to write a new package configuration type class that extends the Registry type; after all, we'd had such luck writing our own custom log provider. SSIS is super extendable, right? But there doesn't seem to be a way to write your own package configuration types. Is there still some way I can configure my SSIS SQL Server package configuration from an HKLM registry key connection string? If this is not possible, what other workarounds are available? My idea is to write a PowerShell script that will create/modify the environment variable that the package will use, by fetching the connection string from the registry. That way there are still two markers, but at least the second one is automatically maintained. Is this kind of workaround necessary? Thank you for your time.
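
    The PowerShell workaround described at the end might look something like this (the registry path, value name and variable name are hypothetical placeholders for wherever the environment marker actually lives):

        $key  = 'HKLM:\SOFTWARE\MyCompany\Environment'                  # hypothetical location
        $conn = (Get-ItemProperty -Path $key).ConfigConnectionString    # hypothetical value name
        # machine-level, so the SSIS runtime and agent jobs can see it
        [Environment]::SetEnvironmentVariable('SSIS_CONFIG', $conn, 'Machine')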

    Read the article

  • How to update all the SSIS packages' Connection Managers in a BIDS project with PowerShell

    - by Luca Zavarella
    During the development of a BI solution, we all know that 80% of the time is spent during the ETL (Extract, Transform, Load) phase. If you use the BI stack provided by Microsoft SQL Server, this step is accomplished by developing n Integration Services (SSIS) packages. In general, the number of packages made in the ETL phase for a non-trivial BI solution is quite significant. An SSIS package, therefore, extracts data from a source, "hammers" :) the data and then transfers it to a specific destination. Very often it happens that the connection to the source data is the same for all packages. Using Integration Services, this results in having the same Connection Manager (perhaps with the same name) for all packages: the source data of my BI solution comes from a Helper database (HLP), so for each package that imports this data, I have the HLP Connection Manager (the use of a Shared Data Source is not recommended, because the connection string is wired in, and you therefore have to open the SSIS project and use the proper wizard to change it...). In order to change the HLP connection string at runtime, we could use a Package Configuration, or we could run our packages with DTLoggedExec by Davide Mauri (a must-have if you are developing with SQL Server 2005/2008). But my need was to change all the HLP connections in all packages within the SSIS Visual Studio project, because I had to version them through Team Foundation Server (TFS). A good scribe with a lot of patience would have changed all the connections by hand, double-clicking the HLP Connection Manager of each package and then changing the referenced server/database. Not being endowed with such virtues :) I took just a little time to write a small script in PowerShell, exploiting the fact that an SSIS package (a .dtsx file) is nothing but an XML file, and can therefore be changed quite easily. I'm not a guru of PowerShell, but I managed more or less to put together the following lines of code:

        $LeftDelimiterString = "Initial Catalog="
        $RightDelimiterString = ";Provider="
        $ToBeReplacedString = "AstarteToBeReplaced"
        $ReplacingString = "AstarteReplacing"
        $MainFolder = "C:\MySSISPackagesFolder"

        $files = get-childitem "$MainFolder" *.dtsx `
            | Where-Object {!($_.PSIsContainer)}

        foreach ($file in $files) {
            (Get-Content $file.FullName) `
                | % {$_ -replace "($LeftDelimiterString)($ToBeReplacedString)($RightDelimiterString)", "`$1$ReplacingString`$3"} `
                | Set-Content $file.FullName;
        }

    The script above just opens every SSIS package (.dtsx) in the supplied folder, goes in search of the following text:

        Initial Catalog=AstarteToBeReplaced;Provider=

    and replaces it with this:

        Initial Catalog=AstarteReplacing;Provider=

    I won't go into the details of each cmdlet used; I leave the reader to look those up. Alternatively, you can use the specific object model exposed in some .NET assemblies provided by Integration Services, or you can use the Pacman utility. Enjoy! :)

    P.S. Using TFS as the versioning system, before running the script I checked out the packages and, after the script executed successfully, I checked them in.

    Read the article

  • Replacement for Vern Buerg's list.com in 64 bit Windows 7

    - by Kevin
    I would like to find a replacement for list.com, specifically the ability to accept piped input. For example:

        p4 sync -n | list

    which accepts the output of the Perforce command and displays the results in the viewer/editor for manipulation or saving. I know that I could send the output to a file and then open the file in the viewer/editor, but I use it for temporary results. List.com doesn't work on 64-bit Windows 7.
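
    Not a full replacement, but since the rest of this page is PowerShell-flavoured: Out-GridView (included with PowerShell 2.0) gives a sortable, filterable viewer for piped output, covering the viewing half of what list.com did:

        p4 sync -n | Out-GridView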

    Read the article

  • how to suppress output in R

    - by Derek
    Hi, I would like to suppress output in R when I run my R script from the command prompt. I tried numerous options, including --slave and --vanilla, but those options only lessen the amount of text that is output. I also tried to pipe the output to NUL, but that didn't help. Thanks a lot, Derek
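
    Two things that may be worth trying, hedged since the exact invocation isn't shown in the question: Rscript prints no startup banner and does not echo commands, so it emits only what the script explicitly prints, and inside a script sink() can redirect output away entirely:

        Rscript --vanilla myscript.R

    or, at the top of the script itself, sink("NUL") to send subsequent output to the null device on Windows.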

    Read the article

  • Calling via adb in PowerShell

    - by Imran Nasir
    As you may know, the command for calling via adb is:

        .\adb.exe shell am start -a android.intent.action.CALL tel:"656565"

    This works well, but when I use the textbox value, it takes a garbage value:

        .\adb.exe shell am start -a android.intent.action.CALL tel:$textbox1.Text

    I have tried this also, but it failed:

        $button21_Click={
            #TODO: Place custom script here
            $textbox1.Clear
            .\adb.exe shell am start -a android.intent.action.CALL tel:$textbox1.Text
        }

    Please help
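
    In an expandable string or bare argument, PowerShell expands only the variable $textbox1 and treats ".Text" as literal text, which would explain the garbage value; property access needs the $() subexpression operator. A sketch of the fix (note also that $textbox1.Clear without parentheses does not call the method):

        $button21_Click = {
            .\adb.exe shell am start -a android.intent.action.CALL "tel:$($textbox1.Text)"
            $textbox1.Clear()
        }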

    Read the article
