Search Results

Search found 25534 results on 1022 pages for 'write powershell'.


  • Powershell and some simple string manipulation

    - by Pat
    I need some help building a PowerShell script for some basic string manipulation. I know just enough PowerShell to get in trouble, but can't figure out the syntax or coding to make this work. I have a text file that looks like this:

        Here is your list of servers:
        server1
        server2.domain.local
        server3
        Total number of servers: 3

    I need to take that text file and drop the first and last lines (always the first and last), then take every remaining line and basically turn it into a CSV file. The final output should be a text file that looks like this:

        server1,server2.domain.local,server3

    Any suggestions on where to start? Thanks!
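    A minimal sketch of one way to do this, assuming the input file is servers.txt and the output should be servers.csv (both names are placeholders):

        # read all lines, drop the first and last, join the rest with commas
        $lines   = Get-Content .\servers.txt
        $servers = $lines[1..($lines.Count - 2)]
        ($servers -join ',') | Set-Content .\servers.csv

    The index range 1..($lines.Count - 2) skips element 0 (the header line) and the final element (the total line), which matches the "always first and last" assumption above.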

    Read the article

  • TeamCity's build agent unable to find PowerShell

    - by cincura.net
    I have TeamCity's build agent installed on Windows 2008 R2 SP1 Core. The server has PowerShell 2.0 installed (double-checked; I actually downloaded the TeamCity installation from PowerShell). Looking at some build configurations, I see they are incompatible with this agent because powershell_x86/powershell_x64 is required. I tried deleting the build agent's directories to force an upgrade, but no luck. Interestingly, if I add the powershell_x86 and powershell_x86_Path variables (and the 64-bit equivalents) to the config file manually, everything runs fine. Is there anything I can do to have the build agent find PowerShell automatically? What/where is it looking for it? Maybe the 'Core' edition is the problem.

    Read the article

  • Social Media Aggregator, Global Update via Powershell

    - by deanjmiller
    Does anyone know of a way to interface with a social media aggregator using PowerShell? For instance, I would like to update my global status on Digsby using PowerShell; Digsby would then fan the message out to Facebook, MySpace, Twitter, etc. I am open to using any social media aggregator that can do this: Digsby, Seesmic, Ping.fm, TweetDeck, and so on. If any of these programs have a COM interface or something like it, I'm sure whoever implements this first will see a large gain in users.

    Read the article

  • PHP Put list from Powershell into Array

    - by Mezzan
    Code:

        $exchangesnapin = "Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010";
        $output = shell_exec('powershell '.$exchangesnapin.';"get-mailboxdatabase" 2>&1');
        echo( '<pre>' );
        echo( $output );
        echo( '</pre>' );

    Result:

        Name                          Server        Recovery  ReplicationType
        ----                          ------        --------  ---------------
        Mailbox Database 0651932265   EGCVMADTEST   False     None
        Mailbox Database 0651932266   EGCVMADTEST   False     None

    I tried echo( $output[1] ); and the result was only the letter 'N'. I believe it's taking the Name column but one character at a time: $output[1] is 'N', $output[2] is 'a'. Is there any way I can get the mailbox list into an array?
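    The character-at-a-time behaviour happens because shell_exec() returns the whole output as a single string, so indexing it yields characters rather than lines. One hedged workaround is to have PowerShell emit just the database names, one per line, and then split that string on newlines on the PHP side (for example with explode("\n", $output) plus trim, or by using exec(), whose second argument is filled with an array of output lines). The PowerShell half of that might look like this:

        # emit only the database names, one per line, so the caller can split on newlines
        Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010
        Get-MailboxDatabase | Select-Object -ExpandProperty Name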

    Read the article

  • [PowerShell] Sql Server SMO connection timeout not working

    - by Uros Calakovic
    I have the following PowerShell code:

        function Get-SmoConnection
        {
            param ([string] $serverName = "", [int] $connectionTimeout = 0)

            if($serverName.Length -eq 0) {
                $serverConnection = New-Object Microsoft.SqlServer.Management.Common.ServerConnection
            }
            else {
                $serverConnection = New-Object Microsoft.SqlServer.Management.Common.ServerConnection($serverName)
            }

            if($connectionTimeout -ne 0) {
                $serverConnection.ConnectTimeout = $connectionTimeout
            }

            try {
                $serverConnection.Connect()
                $serverConnection
            }
            catch [System.Management.Automation.MethodInvocationException] {
                $null
            }
        }

        $connection = Get-SmoConnection "ServerName" 2
        if($connection -ne $null) {
            Write-Host $connection.ServerInstance
            Write-Host $connection.ConnectTimeout
        }
        else {
            Write-Host "Connection could not be established"
        }

    It seems to work, except for the part that attempts to set the SMO connection timeout. If the connection is successful, I can verify that ServerConnection.ConnectTimeout is set to 2 (seconds), but when I supply a bogus name for the SQL Server instance, it still attempts to connect for roughly 15 seconds (which I believe is the default timeout value). Does anyone have experience with setting the SMO connection timeout? Thank you in advance.

    Read the article

  • Multiple Foreground Colors in PowerShell in One Command

    - by Mark Tomlin
    I want to output many different foreground colors with one statement.

        PS C:\> Write-Host "Red" -ForegroundColor Red
        Red

    This output is red.

        PS C:\> Write-Host "Blue" -ForegroundColor Blue
        Blue

    This output is blue.

        PS C:\> Write-Host "Red", "Blue" -ForegroundColor Red, Blue
        Red Blue

    This output is magenta, but I want the word "Red" in red and the word "Blue" in blue from the one command. How can I do that?
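    Write-Host applies a single color per call, so one possible workaround is a small wrapper that pairs each string with a color and writes them with -NoNewline; the function name Write-Color below is just an illustration:

        function Write-Color {
            param([string[]]$Text, [ConsoleColor[]]$Color)
            for ($i = 0; $i -lt $Text.Count; $i++) {
                # write each piece in its own color, staying on the same line
                Write-Host $Text[$i] -ForegroundColor $Color[$i] -NoNewline
                if ($i -lt $Text.Count - 1) { Write-Host ' ' -NoNewline }
            }
            Write-Host ''   # finish the line
        }

        Write-Color -Text 'Red','Blue' -Color Red,Blue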

    Read the article

  • Run a SQL Script Against MySQL using Powershell

    - by abarr
    I have a PowerShell script that backs up my MySQL databases each night using mysqldump. This all works fine, but I would like to extend the script to update a reporting database (db1) from the backup of the prod database (db2). I have written the following test script, but it does not work. I have a feeling the problem is the reading of the SQL file into CommandText, but I am not sure how to debug it.

        [system.reflection.assembly]::LoadWithPartialName("MySql.Data")

        $mysql_server   = "localhost"
        $mysql_user     = "root"
        $mysql_password = "password"

        write-host "Create connection to db1"
        # Connect to MySQL database 'db1'
        $cn = New-Object -TypeName MySql.Data.MySqlClient.MySqlConnection
        $cn.ConnectionString = "SERVER=$mysql_server;DATABASE=db1;UID=$mysql_user;PWD=$mysql_password"
        $cn.Open()

        write-host "Running backup script against db1"
        # Run Update Script MySQL
        $cm = New-Object -TypeName MySql.Data.MySqlClient.MySqlCommand
        $sql = Get-Content C:\db2.sql
        $cm.Connection = $cn
        $cm.CommandText = $sql
        $cm.ExecuteReader()

        write-host "Closing Connection"
        $cn.Close()

    Any assistance would be appreciated. Thanks.
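    The hunch about reading the file is probably right: Get-Content returns an array of lines, while CommandText expects a single string, and a script that only runs statements is better executed with ExecuteNonQuery than ExecuteReader. A hedged sketch of just the changed lines, reusing the file path from the question:

        # read the whole file as one string (works on PowerShell v2 as well)
        $sql = [System.IO.File]::ReadAllText("C:\db2.sql")
        $cm.Connection  = $cn
        $cm.CommandText = $sql
        # an update/restore script returns no result set, so ExecuteNonQuery is the better fit
        $cm.ExecuteNonQuery()

    If the dump contains client-only syntax such as DELIMITER directives, feeding it through mysql.exe instead of Connector/NET may be simpler.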

    Read the article

  • UNRAID V4.7: Lost write permission on Win7/Android devices

    - by JROC
    I'm currently running v4.7 and I haven't touched any of the user or share settings, yet I'm periodically losing read/write permission on both my Windows 7 PC and my Android tablet connecting over wireless. Sometimes I can access my shares and see the folder directories, but when attempting to open a folder Windows denies me access, saying I don't have the proper permission. This happens after I have logged in with my main account, which has full read/write access to everything; the same happens on my Android device. This all started when I attempted to delete a large number of files (8 GB) to make more room, and about halfway through I started getting permission errors. What could be causing this? Thanks

    Read the article

  • Cannot write, format, nor erase flash drive

    - by Baruch
    I have a 4 GB flash drive which I can read from but not write to, erase, or format on any computer (I've tried 7 different computers running XP, Vista, and Win7). I want to erase all the data on it because it is useless. When I delete something, it appears to delete and then gives me an error that it can't find the file; once I refresh the folder, the file comes back. I tried holding Shift + Del. I also tried the format command in safe mode, but it says "access denied". There is no write-protection switch on my flash drive; it's just a simple small 4 GB one.
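    If Windows itself has flagged the disk read-only, one thing worth trying (a hedged suggestion, since drives that behave like this have often switched their controller into a permanent read-only failure mode) is clearing the read-only attribute with diskpart from an elevated prompt. The disk number below is a placeholder; check the output of "list disk" first:

        # build a diskpart script and run it; replace the disk number with the one shown by "list disk"
        $cmds = "select disk 2", "attributes disk clear readonly"
        $cmds | Set-Content "$env:TEMP\clear-readonly.txt" -Encoding ASCII
        diskpart /s "$env:TEMP\clear-readonly.txt"

    If the drive still refuses writes after this, the flash controller has most likely locked itself read-only and no software tool will bring it back.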

    Read the article

  • Running Powershell from within SharePoint

    - by Norgean
    Just because something is a daft idea doesn't mean it can't be done. We sometimes need to do some housekeeping - like delete old files or list items or… yes, well, whatever you use PowerShell for in a SharePoint world. Or it could be that your solution has "issues" for which you have PowerShell solutions, but not the budget to transform into proper bug fixes. So you create a "how to" for the ITPro guys.

    Idea: what if we keep the scripts in a list, and have SharePoint execute the scripts on demand? An announcements list (because of the multiline body field). Warning! Let us be clear: this list needs to be locked down; if somebody creates a malicious script and you run it, I cannot help you.

    First, we need to figure out how to start PowerShell scripts from C#. Hit the interwebs and the Googlie, and you may find jpmik's post: http://www.codeproject.com/Articles/18229/How-to-run-PowerShell-scripts-from-C (or MS' official answer at http://msdn.microsoft.com/en-us/library/ee706563(v=vs.85).aspx).

        public string RunPowershell(string powershellText, SPWeb web, string param1, string param2)
        {
            // Powershell ~= RunspaceFactory - i.e. Create a powershell context
            var runspace = RunspaceFactory.CreateRunspace();
            var resultString = new StringBuilder();
            try
            {
                // load the SharePoint snapin - Note: you cannot do this in the script itself
                // (i.e. add-pssnapin etc does not work)
                PSSnapInException snapInError;
                runspace.RunspaceConfiguration.AddPSSnapIn("Microsoft.SharePoint.PowerShell", out snapInError);
                runspace.Open();

                // set a web variable.
                runspace.SessionStateProxy.SetVariable("webContext", web);
                // and some user defined parameters
                runspace.SessionStateProxy.SetVariable("param1", param1);
                runspace.SessionStateProxy.SetVariable("param2", param2);

                var pipeline = runspace.CreatePipeline();
                pipeline.Commands.AddScript(powershellText);
                // add a "return" variable
                pipeline.Commands.Add("Out-String");

                // execute!
                var results = pipeline.Invoke();

                // convert the script result into a single string
                foreach (PSObject obj in results)
                {
                    resultString.AppendLine(obj.ToString());
                }
            }
            finally
            {
                // close the runspace
                runspace.Close();
            }
            // consider logging the result. Or something.
            return resultString.ToString();
        }

    Ok. We've written some code. Let us test it.

        var runner = new PowershellRunner();
        runner.RunPowershellScript(@"
            $web = Get-SPWeb 'http://server/web'   # or $webContext
            $web.Title = $param1
            $web.Update()
            $web.Dispose()
            ", null, "New title", "not used");

    Next step: connect the code to the list, or more specifically, have the code execute on one (or several) list items. As there are more options than readers, I'll leave this as an exercise for the reader. Some alternatives:

    - Create a ribbon button that calls RunPowershell with the body of the selected items
    - Add a layout page
    - Specify the list item from the query string (possibly coupled with a content editor webpart with html that links directly to this page with the querystring)
    - Webpart
    - Listing with an "execute" column
    - List with multiselect and an execute button
    - Etc!

    Now that you have the code for executing PowerShell scripts, you can easily expand this into a timer job which executes scripts at regular intervals. But if the previous solution was dangerous, this is even worse - the scripts will usually be run with one of the admin accounts, and can do pretty much anything...

    One more thing... Note that as this is running "consoleless", calls to Write-Host will fail. Two solutions: remove all output, or check whether the script is running in a console window or not.

        if ($host.Name -eq "ConsoleHost") { Write-Host "If I agreed with you we'd both be wrong" }

    Read the article

  • Powershell: Install-dotNET4 function

    - by marc dekeyser
    This function will download and install .NET 4.0. It uses the Get-Framework-Versions function to determine whether the installation is necessary or not. Internet connectivity is required, as the script downloads the setup file automatically (and then sleeps for 360 seconds... I had a function in there to monitor for install completion at first, but it turns out the setup file spawns so many child processes that the function just got confused and locked up -_-). Alternatively, you could drop the installation file into the folder specified by the $folderPath variable; that will skip the download and use the file. This function easily adapts to other versions, e.g. I use it for PowerShell 3 installs as well!

        Function install-dotNet4 () {
            if(($InstalledDotNET -eq "4.0") -or ($InstalledDotNET -eq "4.0c")){
                write-host ".NET 4.0 Framework is already installed" -foregroundcolor Green
            } else {
                # set a var for the folder you are looking for
                $folderPath = 'C:\Temp'

                # Check if folder exists, if not, create it
                if (Test-Path $folderpath){
                    Write-Host "The folder $folderPath exists." -ForeGroundColor Green
                } else {
                    Write-Host "The folder $folderPath does not exist, creating..." -NoNewline -ForegroundColor Red
                    New-Item $folderpath -type directory | Out-Null
                    Write-Host " - done!" -ForegroundColor Green
                }

                # Check if file exists, if not, download it
                $file = $folderPath+"\dotNetFx40_Full_x86_x64.exe"
                if (Test-Path $file){
                    write-host "The file $file exists." -ForeGroundColor Green
                } else {
                    # Download Microsoft .Net 4.0 Framework
                    Write-Host "Downloading Microsoft .Net 4.0 Framework..." -nonewline -ForeGroundColor DarkYellow
                    $clnt = New-Object System.Net.WebClient
                    $url = "http://download.microsoft.com/download/9/5/A/95A9616B-7A37-4AF6-BC36-D6EA96C8DAAE/dotNetFx40_Full_x86_x64.exe"
                    $clnt.DownloadFile($url,$file)
                    Write-Host " - done!" -ForegroundColor Green
                }

                # Install Microsoft .Net Framework
                Write-Host "Installing Microsoft .Net Framework..." -nonewline -ForegroundColor DarkYellow
                $dotNET4 = $folderPath+"\dotNetFx40_Full_x86_x64.exe /quiet /norestart"
                Invoke-Expression $dotNET4
                write-host " - done!" -ForegroundColor Green
                start-sleep -seconds 360
            }
        }

    Read the article

  • How can I read the verbose output from a Cmdlet in C# using Exchange Powershell

    - by mrkeith
    Environment:
        Exchange 2007 SP3 (2003 SP2 mixed mode)
        Visual Studio 2008, .NET 3.5

    Hello, I'm working with the Exchange PowerShell move-mailbox cmdlet and have noted that when I run it from the Exchange Management Shell (using the Verbose switch) there is a ton of real-time information provided. To provide a little context, I'm attempting to create a UI application that moves mailboxes similarly to the Exchange Management Console, but I want to support an input file and specific server/database destinations for each entry (and threading). Here's roughly what I have at present, but I'm not sure if there is an event I need to register for or what... And to be clear, I want to get this information in real time so I can update my UI to reflect what's occurring in the move sequence for the appropriate user (pretty much like the native functionality offered in the Management Console). In case you are wondering, the reason I'm not content with the Management Console functionality is that I have an algorithm for balancing users depending on storage limit, BlackBerry use, journaling, exception mailbox size, etc., which demands users be mapped to specific locations... and I do not want to create many move groups for each common destination, or to hunt for lists of users individually through the Management Console UI. I cannot seem to find any good documentation or examples of how to tie into reading the verbose messages that are shown in the console using C# (I see value in being able to read this kind of information in many different scenarios). I've explored the Invoke and InvokeAsync methods and the StateChanged and DataReady events, but none of these seem to provide the information (verbose comments) that I'm after. Any direction or examples would be very appreciated!
    A code sample, which is little more than how I would ordinarily call any other PowerShell command, follows:

        // config to use ExMgmt shell, create runspace and open it
        RunspaceConfiguration rsConfig = RunspaceConfiguration.Create();
        PSSnapInException snapInException = null;
        PSSnapInInfo info = rsConfig.AddPSSnapIn("Microsoft.Exchange.Management.PowerShell.Admin", out snapInException);
        if (snapInException != null) throw snapInException;
        Runspace runspace = RunspaceFactory.CreateRunspace(rsConfig);
        try
        {
            runspace.Open();

            // create a pipeline and feed script text
            Pipeline pipeline = runspace.CreatePipeline();
            string targetDatabase = @"myServer\myStorageGroup\myDB";
            string mbxOwner = "[email protected]";
            Command myMoveMailbox = new Command("Move-Mailbox", false, false);
            myMoveMailbox.Parameters.Add("Identity", mbxOwner);
            myMoveMailbox.Parameters.Add("TargetDatabase", targetDatabase);
            myMoveMailbox.Parameters.Add("Verbose");
            myMoveMailbox.Parameters.Add("ValidateOnly");
            myMoveMailbox.Parameters.Add("Confirm", false);
            pipeline.Commands.Add(myMoveMailbox);

            System.Collections.ObjectModel.Collection<PSObject> output = null;

            // these next few lines that are commented out are where I've tried
            // registering for events and calling asynchronously, but this doesn't
            // seem to get me anywhere closer
            //pipeline.StateChanged += new EventHandler(pipeline_StateChanged);
            //pipeline.Output.DataReady += new EventHandler(Output_DataReady);
            //pipeline.InvokeAsync();
            //pipeline.Input.Close();
            //return;
            // tried these variations that are commented out but none seem to be useful

            output = pipeline.Invoke();

            // Check for errors in the pipeline and throw an exception if necessary
            if (pipeline.Error != null && pipeline.Error.Count > 0)
            {
                StringBuilder pipelineError = new StringBuilder();
                pipelineError.AppendFormat("Error calling Test() Cmdlet. ");
                foreach (object item in pipeline.Error.ReadToEnd())
                    pipelineError.AppendFormat("{0}\n", item.ToString());
                throw new Exception(pipelineError.ToString());
            }

            foreach (PSObject psObject in output)
            {
                // blah, blah, blah
                // this is normally where I would read details about a particular PS command,
                // but that pertains to a command once it finishes and has nothing to do with
                // the verbose messages that I'm after... since this part of the method pertains
                // to the after-effects of a command having run, I suspect I need to look at
                // the async invoke method but am not certain how.
            }
        }
        finally
        {
            runspace.Close();
        }

    Thanks! Keith

    Read the article

  • get a set of files that have been modified after a certain date

    - by jcollum
    Does anyone have a handy powershell script that gets a set of files from TFS based on a modification date? I'd like to say "give me all the files in this folder (or subfolder) that were modified after X/Y/ZZZZ" and dump those files to a folder other than the folder they would normally go to. I know enough powershell to hack about and get this done, eventually, but I'm hoping to avoid that.
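    Does the modification date need to come from TFS history, or is the last-write time of the files in a local workspace close enough? For the latter, a minimal sketch (source folder, cutoff date, and destination are all placeholders):

        $cutoff = Get-Date '2010-04-01'
        $source = 'C:\tfs\MyProject'
        $dest   = 'C:\changed-files'

        Get-ChildItem $source -Recurse |
            Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -gt $cutoff } |
            ForEach-Object { Copy-Item $_.FullName -Destination $dest }

    Filtering on actual check-in dates instead would mean querying TFS itself, e.g. parsing the output of tf.exe history with a date version range, which is more work.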

    Read the article

  • How do I set up TFS PowerShell Snapin

    - by TheSean
    I have installed TFS Power Tools and I am trying to use the PowerShell snap-in, but I can't figure out how to set it up. When I look in the install folder, I only see the following 5 DLLs:

        Microsoft.TeamFoundation.PowerToys.Client.dll
        Microsoft.TeamFoundation.PowerToys.Common.dll
        Microsoft.TeamFoundation.PowerToys.Controls.dll
        Microsoft.VisualStudio.TeamFoundation.PowerToys.Common.dll
        Microsoft.VisualStudio.TeamFoundation.PowerToys.dll

    I used installutil to install each one, and then I used the following PS code to see what cmdlets were installed so I could add the snap-in, but it looks like only a handful exist in those DLLs and these commands are not useful to me right now.

        PS H:\> get-pssnapin -registered
        Name        : TfsBPAPowerShellSnapIn
        PSVersion   : 1.0
        Description : This is a PowerShell snap-in that includes Team Foundation Server cmdlets.

        PS H:\> get-command -pssnapin TfsBPAPowerShellSnapIn
        CommandType     Name                 Definition
        -----------     ----                 ----------
        Cmdlet          Get-MsiProductId     Get-MsiProductId [[-ProductIndex] <Int32>] [[-Mo...
        Cmdlet          Get-TfsDBServer      Get-TfsDBServer [[-DBPath] <String>] [-Verbose] ...
        Cmdlet          Get-TfsHealthPing    Get-TfsHealthPing [-Verbose] [-Debug] [-ErrorAct...
        Cmdlet          Get-TfsSqlData       Get-TfsSqlData [[-ConnectionBuilder] <SqlConnect...

    Thanks.
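    For what it's worth (a hedged note, since Power Tools releases differ): the TfsBPAPowerShellSnapIn above is the Best Practices Analyzer snap-in, not the general-purpose cmdlets. The PowerShell cmdlets are an optional feature in the Power Tools installer and are typically not selected by default, so re-running the installer and enabling the PowerShell option is the first thing to check. Once installed, the snap-in usually registers under a name like Microsoft.TeamFoundation.PowerShell:

        # see what is registered, then load the Team Foundation snap-in and list its cmdlets
        Get-PSSnapin -Registered
        Add-PSSnapin Microsoft.TeamFoundation.PowerShell
        Get-Command -PSSnapin Microsoft.TeamFoundation.PowerShell | Select-Object Name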

    Read the article

  • Vim with Powershell

    - by Kevin Berridge
    I'm using gvim on Windows. In my _vimrc I've added:

        set shell=powershell.exe
        set shellcmdflag=-c
        set shellpipe=>
        set shellredir=>

        function! Test()
          echo system("dir -name")
        endfunction
        command! -nargs=0 Test :call Test()

    If I execute this function (:Test) I see nonsense characters (non number/letter ASCII characters). If I use cmd as the shell, it works (without the -name), so the problem seems to be with getting output from PowerShell into vim. Interestingly, this works great:

        :!dir -name

    As does this:

        :r !dir -name

    UPDATE: confirming behavior mentioned by David. If you execute the set commands mentioned above in the _vimrc, :Test outputs nonsense. However, if you execute them directly in vim instead of in the _vimrc, :Test works as expected. Also, I've tried using iconv in case it was an encoding problem:

        :echo iconv( system("dir -name"), "unicode", &enc )

    But this didn't make any difference. I could be using the wrong encoding types though. Anyone know how to make this work?

    Read the article

  • Matching a Repeating Sub Series using a Regular Expression with PowerShell

    - by Hinch
    I have a text file that lists the names of a large number of Excel spreadsheets, and the names of the files that are linked to from those spreadsheets. In simplified form it looks like this:

        "Parent File1.xls"
        Link: ChildFileA.xls
        Link: ChildFileB.xls
        "ParentFile2.xls"
        "ParentFile3.xls"
        Blah
        Link: ChildFileC.xls
        Link: ChildFileD.xls
        More Junk
        Link: ChildFileE.xls
        "Parent File4.xls"
        Link: ChildFileF.xls

    In this example, ParentFile1.xls has embedded links to ChildFileA.xls and ChildFileB.xls, ParentFile2.xls has no embedded links, and ParentFile3.xls has 3 embedded links. I am trying to write a regular expression in PowerShell that will parse the text file, producing output in the following form:

        ParentFile1.xls:ChildFileA.xls,ChildFileB.xls
        ParentFile3.xls:ChildFileC.xls,ChildFileD.xls,ChildFileE.xls
        etc

    The task is complicated by the fact that the text file contains a lot of junk between the lines, and a parent may not always have a child. Furthermore, a single file name may run over multiple lines. However, it's not as bad as it sounds, as the parent and child file names are always clearly demarcated (the parent with quotes and the child with a prefix of "Link: "). The PowerShell code I've been using is as follows:

        $content = [string]::Join([environment]::NewLine, (Get-Content C:\Temp\text.txt))
        $regex = [regex]'(?im)\s*\"(.*)\r?\n?\s*(.*)\"[\s\S]*?Link: (.*)\r?\n?'
        $regex.Matches($content) | %{$_.Groups[1].Value + $_.Groups[2].Value + ":" + $_.Groups[3].Value}

    Using the example above, it outputs:

        ParentFile1.xls:ChildFileA.xls
        ParentFile2.xls""ParentFile3.xls:ChildFileC.xls
        ParentFile4.xls:ChildFileF.xls

    There are two issues. Firstly, the inclusion of the "" instead of a newline whenever a parent without a child is processed. And the second issue, which is the most important, is that only a single child is ever shown for each parent. I'm guessing I need to somehow recursively capture and display the multiple child links that exist for each parent, but I'm totally stumped as to how to do this with a regular expression. Any help would be greatly appreciated. The file contains hundreds of thousands of lines, and manual processing is not an option :)
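    Rather than one regex across the whole file, a hedged alternative is to walk the file line by line and track the current parent. This sketch assumes each parent name fits on one line (the multi-line-name wrinkle mentioned above would need extra handling) and writes to a placeholder output path:

        $parent = $null
        $links  = @{}

        Get-Content C:\Temp\text.txt | ForEach-Object {
            if ($_ -match '"([^"]+\.xls)"') {
                # a quoted name starts a new parent entry
                $parent = $matches[1]
                $links[$parent] = @()
            }
            elseif ($parent -and $_ -match 'Link:\s*(\S+\.xls)') {
                $links[$parent] += $matches[1]
            }
        }

        $links.GetEnumerator() |
            Where-Object { $_.Value.Count -gt 0 } |
            ForEach-Object { "{0}:{1}" -f $_.Key, ($_.Value -join ',') } |
            Set-Content C:\Temp\output.txt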

    Read the article

  • Powershell scripts to backup SQL, SVN

    - by bszom
    I'm trying to use PowerShell to create some backups, and then to copy these to a web folder (or, in other words, upload them to a WebDAV share). At first I thought I'd do the WebDAV stuff from within PowerShell, but it seems this still requires a fair amount of "manual labour", ie: constructing HTTP requests. I then settled for creating a web folder from the script and letting Windows handle the WebDAV stuff. It seems that all it takes to create a web folder is to create a standard shortcut, as described here. What I can't figure out is how to actually copy files to the shortcut's target..? Maybe I'm going about this the wrong way. It would be ideal if I could somehow encrypt the credentials for the WebDAV in the script, then have it create the web folder, shunt over the files, and delete the web folder again. Or even better, not use a web folder at all. Third option would be to just create the web folder manually and leave it there, though I'd rather not. Any ideas/pointers/tips? :)
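    If the WebDAV location can be mapped as a drive letter, the web-folder shortcut detour can be skipped entirely: map it, copy, unmap. A hedged sketch, in which the URL, drive letter, account and password are all assumptions, and which relies on the Windows WebClient service being available:

        # map the WebDAV location, copy the backups across, then remove the mapping again
        net use Z: "https://backups.example.com/webdav" P@ssw0rd /user:backupuser
        Copy-Item 'D:\Backups\*.sql' -Destination 'Z:\'
        net use Z: /delete

    Keeping the password in the script is the weak point; ConvertTo-SecureString/ConvertFrom-SecureString can at least keep it out of plain text by storing a DPAPI-protected copy in a file, which partly covers the "encrypt the credentials" wish above.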

    Read the article

  • How to accept confirmation Automatically in PowerShell for Outlook

    - by user2919845
    I have a script that exports attachments from email in Outlook (see below). It works correctly on one PC, but on another PC there is a problem: Outlook shows a security prompt and wants an answer (Permit / Deny / Help). If I manually click Permit or Deny it works correctly, but I want to automate it. Can you give me some suggestions on how to do that in PowerShell? I have tried to set Outlook not to show this message, but I didn't succeed. My script:

        # <-- Script ---------->
        # script works with the Outlook Inbox folder
        # check if email has attachments with ".txt" and save those attachments to $filepath

        # path for exported files - attachments
        $filepath = "d:\Exported_files\"

        # create Outlook object
        $o = New-Object -comobject outlook.application
        $n = $o.GetNamespace("MAPI")

        # $f - folder "dorucena posta", 6 - Inbox
        $f = $n.GetDefaultFolder(6) # 6 - Inbox

        # select the newest 10 emails, and from those only the ones with attachments
        $f.Items | select -last 10 | Where {$_.Attachments} | foreach {
            # process only unread mail
            if($_.unread -eq $True) {
                # mark processed mail as read, so it is not processed again the next day
                $_.unread = $False
                $SenderName = $_.SenderName
                Write-Host "Email from: ", $SenderName
                # process all attachments
                $_.attachments | foreach {
                    $a = $_.filename
                    If ($a.Contains(".txt")) {
                        Write-Host $SenderName, " ", $a
                        # copy *.txt attachments to folder $filepath
                        $_.saveasfile((Join-Path $filepath "$a"))
                    }
                }
            }
        }
        Write-Host "Finish"
        # <------ End Script ---------------------------------->

    Read the article

  • Powershell: splatting after passing hashtable by reference

    - by user1815871
    PowerShell newbie here... I recently learned about splatting — very useful. I ran into a snag when I passed a hash table by reference to a function for splatting purposes. (For brevity's sake, a silly example.)

        Function AllMyChildren {
            param (
                [ref]$ReferenceToHash
            )
            get-childitem @ReferenceToHash.Value
            # etc. etc.
        }

        $MyHash = @{
            'path'    = '*'
            'include' = '*.ps1'
            'name'    = $null
        }

        AllMyChildren ([ref]$MyHash)

    Result: an error ("Splatted variables cannot be used as part of a property or array expression. Assign the result of the expression to a temporary variable then splat the temporary variable instead."). Tried this afterward:

        $newVariable = $ReferenceToHash.Value
        get-childitem @NewVariable

    That did work and seemed right per the error message. But: is it the preferred syntax in a case like this? (An "oh, look, it actually worked" solution isn't always a best practice. My approach here strikes me as "Perl-minded", and perhaps in PowerShell passing by value is better, though I don't yet know the syntax for it w.r.t. a hash table.)
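    The temporary-variable assignment is exactly what the error message asks for, so it is fine. It may also help to know that a hashtable is already a .NET reference type, so [ref] isn't needed just to avoid copying; a plainly typed parameter splats directly. A sketch, with an illustrative parameter name and slightly different hash values:

        Function AllMyChildren {
            param ([hashtable]$GciArgs)
            Get-ChildItem @GciArgs    # splatting the parameter works directly
        }

        $MyHash = @{
            Path    = '*'
            Include = '*.ps1'
            Name    = $true
        }

        AllMyChildren $MyHash

    Any change the function makes to the table's contents is still visible to the caller, because both variables point at the same object; [ref] is only needed if the function must replace the caller's variable with a whole new value.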

    Read the article

  • powershell missing member methods in array

    - by Andrew
    Hi guys, I have (yet another) PowerShell query. I have an array in PowerShell on which I need to use the remove() and split methods. Normally when you set up an array (or variable), those methods exist, but on the $csv2 array below both methods are missing; I have checked using the Get-Member cmdlet. How can I use remove to get rid of the lines containing "nan"? Also, how do I split the columns into two different variables? At the moment each element of the array holds one line; for each line I need to turn it into two variables, one for each column.

        timestamp   Utilization
        ---------   -----------
        1276505880  2.0763250000e+00
        1276505890  1.7487730000e+00
        1276505900  1.6906890000e+00
        1276505910  1.7972880000e+00
        1276505920  1.8141900000e+00
        1276505930  nan
        1276505940  nan
        1276505950  0.0000000000e+00

        $SystemStats = (Get-F5.iControl).SystemStatistics
        $report = "c:\snmp\data" + $gObj + ".csv"

        ### Allocate a new Query Object and add the inputs needed
        $Query = New-Object -TypeName iControl.SystemStatisticsPerformanceStatisticQuery
        $Query.object_name = $i
        $Query.start_time = $startTime
        $Query.end_time = 0
        $Query.interval = $interval
        $Query.maximum_rows = 0

        ### Make method call passing in an array of size one with the specified query
        $ReportData = $SystemStats.get_performance_graph_csv_statistics( (,$Query) )

        ### Allocate a new encoder and turn the byte array into a string
        $ASCII = New-Object -TypeName System.Text.ASCIIEncoding
        $csvdata = $ASCII.GetString($ReportData[0].statistic_data)
        $csv2 = convertFrom-CSV $csvdata
        $csv2
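    ConvertFrom-Csv returns an array of objects, and plain .NET arrays are fixed-size, which is why Get-Member shows no Remove() method. Filtering with Where-Object and reading the two columns as properties usually achieves the same thing. A hedged sketch against the $csv2 data shown above (property names taken from its header row):

        # drop the rows whose Utilization column is 'nan'
        $clean = $csv2 | Where-Object { $_.Utilization -ne 'nan' }

        # each remaining row exposes the two columns as properties
        foreach ($row in $clean) {
            $timestamp   = [int64]$row.timestamp
            $utilization = [double]$row.Utilization   # parses the 2.0763250000e+00 notation
            "{0} -> {1}" -f $timestamp, $utilization
        }

    If a genuinely resizable collection is needed, [System.Collections.ArrayList] (or a generic List) does support Remove().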

    Read the article

  • Outlook 2007 Backup to D:\Outlook Fails - Access Denied, Write-Protected or File In Use

    - by nicorellius
    I can successfully save the Outlook PST file to the default location on the C drive (C:\Documents and Settings\user\ ... \Outlook) but when I change the backup save to directory to Outlook on the D drive I get the error: Cannot copy Outlook: Access is denied. Make sure the disk is not full or write protected and that the file is not currently in use. I suppose it is not that crucial that I save this file here, but I have never seen this problem before and I have made this same change in the past. I did some searching in this knowledge exchange as well as elsewhere on changing permissions, etc, but this didn't help. I discovered that the folder on my D drive (called Outlook) is not write-protected and nor is it read-only, as I can save to and modify files in that directory, as well as rename and delete the directory itself. At the time when I installed this version of Outlook, I used a previously saved Personal Folder (a backup PST file) and I thought having this still open in Outlook was causing the trouble. But I closed it and still have the same problem. I know this is probably a silly error on my part but I would like to figure it out. I'm new to superuser, but the answers I see are usually very good, so I thought I would post my first question. Thanks in advance.

    Read the article

  • Get-QADComputer -LdapFilter & variables

    - by dboftlp
    Can I use a variable in an LdapFilter with Get-QADComputer? i.e.:

        $31DaysAgo = (Get-Date).AddDays(-31)
        $ft = $31DaysAgo.ToFileTime()
        $StComps = Get-QADComputer -SizeLimit 0 -IncludeAllProperties -SearchRoot 'DC=MY,DC=DOMAIN,DC=LOCAL' `
            -LdapFilter '(&(objectcategory=computer)(pwdLastSet<=$ft)(|(operatingsystem=Windows 2000 Professional)(operatingSystem=Windows XP*)(operatingSystem=Windows 7*)(operatingSystem=Windows Vista*)(operatingsystem=Windows 2000 Server)(operatingsystem=Windows Server*)))'

    If not, how else can I apply the pwdLastSet filter? Should I just do it afterwards in a pipe? i.e.:

        $StComps = Get-QADComputer -SizeLimit 0 -IncludeAllProperties -SearchRoot 'DC=MY,DC=DOMAIN,DC=LOCAL' `
            -LdapFilter '(&(objectcategory=computer)(|(operatingsystem=Windows 2000 Professional)(operatingSystem=Windows XP*)(operatingSystem=Windows 7*)(operatingSystem=Windows Vista*)(operatingsystem=Windows 2000 Server)(operatingsystem=Windows Server*)))' `
            | Where {$_.pwdLastSet -gt $ft}

    or even:

            | Where {$_.LastLogonTimeStamp -gt $ft}

    I know this is going to be slower, but if I have to, I'll go this route. Also, if anyone knows off the top of their head how to time how long a code snippet takes to run, that hint would be greatly appreciated =) ktxbye

    Thanks,
    -dboftlp
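    As written, the first filter cannot work because single-quoted strings never expand variables, so the LDAP query literally contains the text $ft; building the filter in a double-quoted string fixes that, and Measure-Command answers the timing question. A sketch using the same search root (the operating-system clauses are collapsed slightly here):

        $ft = (Get-Date).AddDays(-31).ToFileTime()

        # double quotes let $ft expand inside the filter string
        $ldap = "(&(objectCategory=computer)(pwdLastSet<=$ft)" +
                "(|(operatingSystem=Windows 2000*)(operatingSystem=Windows XP*)" +
                "(operatingSystem=Windows Vista*)(operatingSystem=Windows 7*)" +
                "(operatingSystem=Windows Server*)))"

        $StComps = Get-QADComputer -SizeLimit 0 -SearchRoot 'DC=MY,DC=DOMAIN,DC=LOCAL' -LdapFilter $ldap

        # timing a snippet: wrap it in Measure-Command, which returns a TimeSpan
        Measure-Command {
            Get-QADComputer -SizeLimit 0 -SearchRoot 'DC=MY,DC=DOMAIN,DC=LOCAL' -LdapFilter $ldap
        } | Select-Object TotalSeconds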

    Read the article

  • USB Hardware vs. Software Write Lock

    - by TreyK
    I'm in the market for a USB flash drive, and remember this cool feature a tiny 32MB flash drive of mine had: a write lock switch. This seemed like it would be an amazing feature to have as a shield against any nastiness happening to the drive on an unfamiliar computer. However, very few drives on the market offer this feature. Instead, it seems that forms of software protection are the more prominent method. This software protection causes me a bit of uneasiness, as it seems like this software wouldn't be nearly as bulletproof as a physical switch. Also, levels of protection seem to vary from product to product. Being able to protect certain folders from reading and/or writing would be nice, but is the security trade-off worth it? Just how effective can this software protection be? Wouldn't a simple format be able to clean any drive with software protection? My drive must also be compatible with Windows XP, Vista, and 7, as well as Linux and Mac. What would be the best way forward for getting a well-sized (~8GB) flash drive with a strong write protection implementation, for little or no more than a regular drive? Thanks.

    Read the article

  • Strange performance differences in read/write from/to USB flash drive

    - by Mario De Schaepmeester
    When copying files from my 8GB USB 2.0 flash drive with Windows 7 to a traditional hard drive, the average speed is between 25 and 30 MB/s. When doing the reverse, copying to the USB drive, the speed is 5MB/s average. I have tested this with about 4.5GB of files, a mixture of smaller and larger ones. The observations were the same on both FAT32 and exFAT file systems on the USB drive, NTFS on the internal hard disk. I don't think I can be mistaken in saying that flash memory has a lot higher performance than a spinning hard drive in both terms of reading and writing. For both memory types, reading should be faster than writing too. Now I wonder, how can it be that copying files from a fast read memory to a faster write memory is actually slower than copying files from a fast read memory to a slow write memory? I think that the files are stored in RAM before being copied over too, and there's caching as well, but I don't see how even that could tip the balance. It can only be in the advantage of writing to the USB drive, since it is "closer" to the SATA system than the USB port and it will receive data from the internal SATA HDD faster. Perhaps my way of thinking is all wrong or it just depends on the manufacturer of the USB pen. But I am curious.

    Read the article
