Search Results

Search found 7423 results on 297 pages for 'powershell community exte'.

Page 98/297 | < Previous Page | 94 95 96 97 98 99 100 101 102 103 104 105  | Next Page >

  • Thanks to all the readers and the community, and Happy New Year to all of you

    - by Jalpesh P. Vadgama
    This is my first blog post of the new year 2011, and I would like to take this opportunity to thank all the readers for making my blog very successful and for accepting me as a community member. The year 2010 had lots of ups and downs in the IT field; it was a recession period, and we have now almost recovered from it. Personally, 2010 was very successful for me, as I was awarded Microsoft Most Valuable Professional for Visual C#, one of the greatest achievements of my life. I would like to take this opportunity to thank Microsoft, and to thank all my friends, especially Jacob Sebastian, who has given me guidance any time I required it. I was also awarded DZone Most Valuable Blogger this year, which was a nice surprise, and I would like to thank DZone for it as well. Once again, I wish you a happy new year, and may this year bring success to all of you. One more thing: I have met lots of people who are intelligent, exceptional developers and IT professionals, but who are not blogging about their work. I would say: please read my blog post on why a developer should write a blog, and start blogging immediately, because unless you blog, the community will not know what you are doing. Till then, happy blogging and programming... Stay tuned for more...

    Read the article

  • How to update all the SSIS packages’ Connection Managers in a BIDS project with PowerShell

    - by Luca Zavarella
    During the development of a BI solution, we all know that 80% of the time is spent in the ETL (Extract, Transform, Load) phase. If you use the BI stack provided by Microsoft SQL Server, this step is accomplished by developing a number of Integration Services (SSIS) packages. In general, the number of packages made in the ETL phase for a non-trivial BI solution is quite significant. An SSIS package extracts data from a source, "hammers" :) the data, and then transfers it to a specific destination. Very often the connection to the source data is the same for all packages. In Integration Services, this results in having the same Connection Manager (perhaps with the same name) in every package: the source data of my BI solution comes from a Helper database (HLP), so for each package that imports this data I have the HLP Connection Manager (the use of a Shared Data Source is not recommended, because the Connection String is hard-wired into each package, and therefore you have to open the SSIS project and use the proper wizard to change it...). In order to change the HLP Connection String at runtime, we could use Package Configurations, or we could run our packages with DTLoggedExec by Davide Mauri (a must-have if you are developing with SQL Server 2005/2008). But my need was to change all the HLP connections in all packages within the SSIS Visual Studio project, because I had to version them through Team Foundation Server (TFS). A good scribe with a lot of patience could have changed all the connections by hand, double-clicking the HLP Connection Manager of each package and then changing the referenced server/database. Not being endowed with such virtues :) I took a little time to write a small script in PowerShell, exploiting the fact that an SSIS package (a .dtsx file) is nothing but an XML file, and therefore can be changed quite easily. I'm not a guru of PowerShell, but I managed more or less to put together the following lines of code:

        $LeftDelimiterString = "Initial Catalog="
        $RightDelimiterString = ";Provider="
        $ToBeReplacedString = "AstarteToBeReplaced"
        $ReplacingString = "AstarteReplacing"
        $MainFolder = "C:\MySSISPackagesFolder"

        $files = Get-ChildItem "$MainFolder" *.dtsx |
            Where-Object { !($_.PSIsContainer) }

        foreach ($file in $files) {
            (Get-Content $file.FullName) |
                ForEach-Object { $_ -replace "($LeftDelimiterString)($ToBeReplacedString)($RightDelimiterString)", "`$1$ReplacingString`$3" } |
                Set-Content $file.FullName
        }

    The script above just opens every SSIS package (.dtsx) in the supplied folder, then in each of them searches for the following text:

        Initial Catalog=AstarteToBeReplaced;Provider=

    and replaces it with this:

        Initial Catalog=AstarteReplacing;Provider=

    I won't go into the details of each cmdlet used; I leave the reader to look them up. Alternatively, you can use the object model exposed by some .NET assemblies provided with Integration Services, or you can use the Pacman utility. Enjoy! :) P.S. Using TFS as the versioning system, before running the script I checked out the packages and, after the script executed successfully, I checked them in.

    Read the article

  • Replacement for Vern Buerg's list.com in 64 bit Windows 7

    - by Kevin
    I would like to find a replacement for list.com, specifically for its ability to accept piped input. For example:

        p4 sync -n | list

    which takes the output of the Perforce command and displays it in the viewer/editor for manipulation or saving. I know that I could send the output to a file and then open the file in the viewer/editor, but I use this for temporary results. List.com doesn't work on 64-bit Windows 7.
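
    One possible stand-in, if PowerShell is an option: the built-in Out-GridView cmdlet (shipped with PowerShell 2.0 on Windows 7) accepts piped input and displays it in an interactive, sortable, filterable window. A minimal sketch:

        # pipe any command's output straight into an interactive viewer window
        p4 sync -n | Out-GridView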

    Read the article

  • how to suppress output in R

    - by Derek
    Hi, I would like to suppress output in R when I run my R script from the command prompt. I tried numerous options, including --slave and --vanilla; those options lessen the amount of text output, but don't eliminate it. I also tried to pipe the output to NUL, but that didn't help. Thanks a lot, Derek
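
    If the goal is simply a silent run from the shell, redirecting both output streams at the call site is one option; a sketch from PowerShell (myscript.R is a placeholder name, and Rscript is assumed to be on the PATH). Inside the script itself, R's sink() can also redirect printed output away from the console.

        # discard the error stream and the normal output stream
        Rscript --vanilla myscript.R 2>$null | Out-Null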

    Read the article

  • PS using Get-WinEvent with FilterXPath and datetime variables?

    - by Jordan W.
    I'm grabbing a handful of events from an event log in chronological order. I don't want to pipe to Where-Object; I want to use Get-WinEvent's own filtering. After I get Event1, I need to get the first instance of another event that occurs some unknown amount of time after Event1, then grab Event3, which occurs sometime after Event2, and so on. Basically, starting with:

        $filterXML = @'
        <QueryList>
          <Query Id="0" Path="System">
            <Select Path="System">*[System[Provider[@Name='Microsoft-Windows-Kernel-General'] and (Level=4 or Level=0) and (EventID=12)]]</Select>
          </Query>
        </QueryList>
        '@
        $event1 = (Get-WinEvent -ComputerName $PCname -MaxEvents 1 -FilterXml $filterXML).TimeCreated

    gives me the datetime of Event1. Then I want to do something like:

        Get-WinEvent -LogName "System" -MaxEvents 1 -FilterXPath "*[EventData[Data = 'Windows Management Instrumentation' and TimeCreated -gt $event1]]"

    Obviously the TimeCreated part there doesn't work, but I hope you get what I'm trying to do. Any help?
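
    One approach worth trying: event log XPath can compare TimeCreated's @SystemTime attribute against a UTC timestamp string, so the datetime from Event1 can be formatted and spliced into the query. A sketch, keeping the EventData clause from the question:

        # format the boundary timestamp the way the event log stores it (UTC)
        $since = $event1.ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
        $xpath = "*[System[TimeCreated[@SystemTime >= '$since']] " +
                 "and EventData[Data = 'Windows Management Instrumentation']]"
        # -Oldest returns the earliest match, i.e. the first such event after $since
        Get-WinEvent -LogName "System" -MaxEvents 1 -Oldest -FilterXPath $xpath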

    Read the article

  • Calling via adb in PowerShell

    - by Imran Nasir
    As you may know, the command for calling via adb is:

        .\adb.exe shell am start -a android.intent.action.CALL tel:"656565"

    This works well, but when I use a textbox, it takes a garbage value:

        .\adb.exe shell am start -a android.intent.action.CALL tel:$textbox1.Text

    I have tried this as well, but it failed:

        $button21_Click = {
            #TODO: Place custom script here
            $textbox1.Clear
            .\adb.exe shell am start -a android.intent.action.CALL tel:$textbox1.Text
        }

    Please help.
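
    A likely explanation: in an expandable string, PowerShell only expands the variable $textbox1 and leaves ".Text" as literal text, which is where the garbage comes from; wrapping the property access in a subexpression should fix it. Also, Clear without parentheses only references the method rather than calling it. A sketch:

        $button21_Click = {
            # $(...) forces the property access to be evaluated inside the string
            .\adb.exe shell am start -a android.intent.action.CALL "tel:$($textbox1.Text)"
            $textbox1.Clear()   # the () is required to actually invoke Clear
        }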

    Read the article

  • Can I get RAID disk status by using PS?

    - by David.Chu.ca
    I have an HP server with RAID 5. Ports 0 and 1 are used for data and OS mirroring. The software that came with the RAID 5 setup is Intel Matrix Storage Manager, and its Windows-based manager console shows all the ports, including their status. Right now they are all in Normal status. I am not sure whether Windows has APIs or .NET classes to access the RAID ports and get their status. If so, how can I use PS to get that information? If not, do I have to reference the DLLs provided by Intel Matrix Storage Manager? Basically, I would like to write a PS script to get the RAID status. In case any port's disk is not Normal, a message will be sent out via the Growl protocol.
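
    Windows does expose a basic disk health field through WMI that PowerShell can read without extra DLLs; whether it reflects the state of individual RAID ports depends on the controller's driver, so treat this as a starting sketch (the vendor console or its DLLs may still be needed for per-port detail):

        # Status is typically "OK" for healthy disks; anything else is worth alerting on
        Get-WmiObject -Class Win32_DiskDrive |
            Select-Object Model, DeviceID, Status |
            Where-Object { $_.Status -ne "OK" } |
            ForEach-Object { Write-Output "Disk $($_.DeviceID) reports status $($_.Status)" }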

    Read the article

  • Start-Job Problems

    - by laerte-junior
    Hi all, why does this code not work?

        function teste {
            begin {
                function lala {
                    while ($true) {
                        "JJJJ" | Out-File c:\Testes\teste.txt -Append
                    }
                }
            }
            process {
                Start-Job -ScriptBlock { lala }
            }
        }
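
    The likely cause: Start-Job runs its script block in a separate PowerShell process, which knows nothing about functions defined in the caller's session, so the job fails looking for lala. Moving the body into the script block itself (or passing the definition via -InitializationScript) should work. A sketch of the first option:

        function teste {
            process {
                # the job process can't see locally defined functions,
                # so the work has to live inside the script block itself
                Start-Job -ScriptBlock {
                    while ($true) {
                        "JJJJ" | Out-File c:\Testes\teste.txt -Append
                    }
                }
            }
        }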

    Read the article

  • WMI Query Script as a Job

    - by Kenneth
    I have two scripts. One calls the other with a list of servers as parameters. The second script is designed to execute a WMI query. When I run it manually, it does this perfectly. When I try to run it as a job, it hangs forever and I have to remove it. For the sake of space, here is the relevant part of the calling script, ProcessServers.ps1:

        Start-Job -FilePath .\GetServerDetailsLight.ps1 -ArgumentList $sqlsrv,$destdb,$server,$instance

    And GetServerDetailsLight.ps1:

        param($sqlsrv,$destdb,$server,$instance)
        $password = Get-Content C:\SQLPS\auth.txt | ConvertTo-SecureString
        $credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "DOMAIN\MYUSER",$password
        [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO')
        $box_id = 0
        if ($sqlsrv.Length -eq 0) {
            Write-Output "No data passed"
            break
        }
        function getinfo {
            param(
                [string]$svr,
                [string]$inst
            )
            "Entered GetInfo with: $svr,$inst"
            $cs = Get-WmiObject win32_operatingsystem -ComputerName $svr -Credential $credentials -Authentication 6 -Verbose -Debug |
                Select Name, Model, Manufacturer, Description, DNSHostName, Domain, DomainRole, PartOfDomain,
                       NumberOfProcessors, SystemType, TotalPhysicalMemory, UserName, Workgroup
            Write-Output "WMI Results: $cs"
        }
        getinfo $server $instance
        Write-Output "Complete"

    Executed as a job, it shows as 'Running' forever:

        PS C:\sqlps> Start-Job -FilePath .\GetServerDetailsLight.ps1 -ArgumentList DBSERVER,LOGDB,SERVER01,SERVER01

        Id    Name     State     HasMoreData    Location     Command
        --    ----     -----     -----------    --------     -------
        21    Job21    Running   True           localhost    param($sqlsrv,$destdb,...

    and the job output so far is:

        GAC     Version       Location
        ---     -------       --------
        True    v2.0.50727    C:\WINDOWS\assembly\GAC_MSIL\Microsoft.SqlServer.Smo\10.0.0.0__89845dcd8080cc91\Microsoft.SqlServer.Smo.dll
        getinfo MSDCHR01 MSDCHR01
        Entered GetInfo with: SERVER01,SERVER01

    The last output I ever get is 'Entered GetInfo with: SERVER01,SERVER01'. If I run it manually, like so:

        PS C:\sqlps> .\GetServerDetailsLight.ps1 DBSERVER LOGDB SERVER01 SERVER01

    the WMI query executes just as expected. I am trying to determine why this is, or at least find a useful way to trap errors from within jobs. Thanks!
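
    One likely culprit (a guess, but cheap to test): the -Debug switch on Get-WmiObject triggers a confirmation prompt, and a background job has no console on which to answer it, so the job blocks forever right after the "Entered GetInfo" line. Dropping -Debug (and -Verbose) from the WMI call is the first thing to try. For trapping errors from inside jobs, Receive-Job re-surfaces them in the caller, where they can be caught; a sketch:

        $job = Start-Job -FilePath .\GetServerDetailsLight.ps1 `
            -ArgumentList $sqlsrv, $destdb, $server, $instance
        Wait-Job $job -Timeout 300 | Out-Null
        # errors raised inside the job are reported here and can be caught
        try   { Receive-Job $job -ErrorAction Stop }
        catch { Write-Output "Job failed: $_" }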

    Read the article

  • tabexpansion doesn't fall through if overridden

    - by guillermooo
    The tabexpansion function only works partially when I override it like so:

        function tabexpansion {
            param($line, $lastWord)
            if ($line -eq "hey ") {
                "you", "Joe"
            }
        }

    The custom completions work as expected, but now I only get the default autocomplete behavior for cmdlet names, not parameters. So New-TAB works fine, but New-Alias -TAB doesn't. How do I get the regular completions too after overriding tabexpansion?
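
    Overriding tabexpansion replaces the built-in implementation wholesale, so anything the override doesn't handle gets no completion at all. A common pattern is to save the original function under another name before redefining, and fall back to it:

        # save the built-in implementation before overriding it
        Copy-Item function:\tabexpansion function:\tabexpansionDefault

        function tabexpansion {
            param($line, $lastWord)
            if ($line -eq "hey ") {
                "you", "Joe"
            }
            else {
                # delegate everything else to the original implementation
                tabexpansionDefault $line $lastWord
            }
        }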

    Read the article

  • What's the environment variable for the path to the desktop?

    - by Scott Langham
    I'm writing a Windows batch file and want to copy something to the desktop. I think I can use this:

        %UserProfile%\Desktop\

    However, I'm thinking that's probably only going to work on an English OS. Is there a way I can do this in a batch file that will work on any internationalized version?

    UPDATE: I tried the following batch file:

        REG QUERY "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" /v Desktop
        FOR /F "usebackq tokens=3 skip=4" %%i in (`REG QUERY "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" /v Desktop`) DO SET DESKTOPDIR=%%i
        FOR /F "usebackq delims=" %%i in (`ECHO %DESKTOPDIR%`) DO SET DESKTOPDIR=%%i
        ECHO %DESKTOPDIR%

    And got this output:

        S:\>REG QUERY "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" /v Desktop

        HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders
            Desktop    REG_EXPAND_SZ    %USERPROFILE%\Desktop

        S:\>FOR /F "usebackq tokens=3 skip=4" %i in (`REG QUERY "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" /v Desktop`) DO SET DESKTOPDIR=%i
        S:\>FOR /F "usebackq delims=" %i in (`ECHO ECHO is on.`) DO SET DESKTOPDIR=%i
        S:\>SET DESKTOPDIR=ECHO is on.
        S:\>ECHO ECHO is on.
        ECHO is on.
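
    If PowerShell is available, .NET can resolve the localized (or redirected) desktop folder directly, which sidesteps the registry parsing entirely; a minimal sketch (the file name is a placeholder):

        # SpecialFolder resolution handles localized and redirected profiles
        $desktop = [Environment]::GetFolderPath("Desktop")
        Copy-Item .\myfile.txt -Destination $desktop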

    Read the article

  • Setting ForceCheckout on an SPList

    - by pk
    I'm trying to set the ForceCheckout property on an SPList and it's just not taking. I'm calling the Update() method as required. All it should take, in essence, is the following two lines:

        $myList.ForceCheckout = $false
        $myList.Update()

    Any ideas why this isn't working? It remains $true no matter what.

    Read the article

  • What is Get-Command -CommandType Script?

    - by Roman Kuzmin
    What is Get-Command -CommandType Script? What kind of commands does it cover? The help for -CommandType says “-- Script: Script blocks in the current session.”, but I am not sure what kind of script blocks that means. Are there cases when Get-Command -CommandType Script actually returns anything?

    Read the article

  • Get Active Directory Attributes for Users on Legacy Exchange Servers

    - by Jason Hindson
    I would like to create a CSV file of the users on our Exchange 2003 servers and include some attributes from their AD accounts. In particular, I would like to pull certain AD values for the users with RecipientTypeDetails = LegacyMailbox. I have tried a few different methods of targeting and filtering these users (ldapfilter, filter, objectAttribute, etc.) with little success. The Exchange 2003 PowerPack for PowerGUI was helpful, but permissions issues and using the Exchange_Mailbox class are not challenges I want to take on. I was finally able to create a working script, but it is very slow: it is on track to take four hours or more to complete. I am looking for suggestions for improving the efficiency of my script, or otherwise obtaining this data more quickly. Here is the script:

        $ADproperties = 'City','Company','department','Description','DistinguishedName','DisplayName','FirstName','l','LastName','msExchHomeServerName','NTAccountName','ParentContainer','physicaldeliveryofficename','SamAccountName','useraccountcontrol','UserPrincipalName'
        Get-User -ResultSize Unlimited -IgnoreDefaultScope -RecipientTypeDetails LegacyMailbox |
            ForEach-Object { Get-QADUser $_.Name -DontUseDefaultIncludedProperties -IncludedProperties $ADproperties } |
            Select-Object $ADproperties |
            Export-Csv C:\UserListBuilder\exchUsers.csv -NoTypeInformation

    Any help you can provide will be greatly appreciated!
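
    The per-user Get-QADUser round trip is probably what makes this slow. If the legacy mailboxes can be described with an LDAP filter, one bulk query avoids N separate lookups; a sketch under that assumption (the (homeMDB=*) filter only grabs mailbox-enabled users and is an assumption — it would likely need narrowing to match only the Exchange 2003 mailbox stores):

        # one bulk query instead of one Get-QADUser call per user;
        # the LDAP filter below is an assumption and may need tightening
        Get-QADUser -SizeLimit 0 -LdapFilter '(homeMDB=*)' `
            -DontUseDefaultIncludedProperties -IncludedProperties $ADproperties |
            Select-Object $ADproperties |
            Export-Csv C:\UserListBuilder\exchUsers.csv -NoTypeInformation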

    Read the article

  • how to get the Get-Item cmdlet's output into a variable as a string

    - by aeon
    I mean, when I call Get-Item with a directory, it dumps to the console like this:

        Mode   LastWriteTime        Length       Name
        ----   -------------        ------       ----
        d----  2/16/2011  8:27 PM                2011-2-16
        -a---  2/13/2011  8:24 PM   3906877184   SWP-Full Database Backup_2011-02-13 0
        -a---  2/16/2011  8:23 PM   3919766476   SWP-Full Database Backup_2011-02-16.bak 8
        -a---  2/12/2011  8:18 PM   3906877747   SWP-Full Database Backup_2011-02-12 2
        -a---  2/14/2011  8:21 PM   3875484467   SWP-Full Database Backup_2011-02-14 2

    but when I convert it to a string, it changes to:

        \\192.168.2.89\BwLive\2011-2-16
        \\192.168.2.89\BwLive\SWP-Full Database Backup_2011-02-13
        \\192.168.2.89\BwLive\SWP-Full Database Backup_2011-02-16.bak
        \\192.168.2.89\BwLive\SWP-Full Database Backup_2011-02-12
        \\192.168.2.89\BwLive\SWP-Full Database Backup_2011-02-14

    I mean, the length, size, and time attributes are omitted. How can I keep these attributes while converting to a string? Thanks.
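
    The table view is produced by PowerShell's formatting system; converting the objects directly falls back to each item's path. Piping through Out-String captures that same table rendering as text:

        # Out-String renders the objects through the normal table formatter
        $text = Get-ChildItem \\192.168.2.89\BwLive | Out-String
        $text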

    Read the article

  • Problem with Copy-Item -force

    - by andemt
    Hi, this is part of my image-copy script:

        Get-ChildItem |
            Where-Object { $_.Extension -eq ".png" -or $_.Extension -eq ".gif" -or $_.Extension -eq ".jpg" } |
            Copy-Item -Destination $dest -Force

    It works fine, and it can overwrite files. Well, it can overwrite if the existing file has the R or A attribute set, but not when the attributes are blank. Error in red text:

        Copy-Item : The file '\\server\d$\path\thumbs\007487l.jpg' already exists.

    Why?

    Read the article

  • How to check for exception further down a pipeline

    - by laertejuniordba
    Hi all, I am using

        New-Object System.Management.Automation.PipelineStoppedException

    to stop the pipeline. OK, it works. But how can I test, in the next cmdlet, whether the pipeline was stopped? With

        foo1 | foo2 | foo3

    it stops in foo1, but still goes on to foo2. I want to test whether it was stopped in foo1, so I can stop in each following function too:

        function foo1 {
            process {
                try {
                    #do
                }
                catch {
                    New-Object System.Management.Automation.PipelineStoppedException
                }
            }
        }

    It still goes on to the next cmdlet. As it stopped with an error, I want to stop in each of the next cmdlets too. :) Thanks!!!
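
    One thing to check: New-Object only constructs the exception object; nothing is ever thrown, so the pipeline keeps flowing into foo2. Throwing it is what actually tears the pipeline down, and once it is thrown, no downstream process block runs at all, so there is nothing left to test for. A sketch:

        function foo1 {
            process {
                try {
                    #do
                }
                catch {
                    # constructing the exception does nothing by itself; throw it
                    throw (New-Object System.Management.Automation.PipelineStoppedException)
                }
            }
        }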

    Read the article

  • Get-ChildItem to Move-Item - path not found

    - by Filburt
    I try to move my old log files to a yyyy\MM\dd folder structure with:

        Get-ChildItem . -Recurse -Include *.log |
            Move-Item -Destination { "D:\Archive\{0:yyyy\\MM\\dd}\{1}" -f $_.LastWriteTime, $_.Name } -Force

    but I get a path-not-found error.

    Update: I suspect the problem originates from the fact that the source path contains Program Files.

    Sub-question: could the same be done without Get-ChildItem?
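
    A likely cause: Move-Item does not create missing destination directories, so the first file whose yyyy\MM\dd folder doesn't exist yet fails with path-not-found. Creating the folder before the move is one way around it; a sketch:

        Get-ChildItem . -Recurse -Include *.log | ForEach-Object {
            $dir = "D:\Archive\{0:yyyy\\MM\\dd}" -f $_.LastWriteTime
            # Move-Item won't create the target folder, so make sure it exists
            if (-not (Test-Path $dir)) { New-Item -ItemType Directory -Path $dir | Out-Null }
            Move-Item $_.FullName -Destination (Join-Path $dir $_.Name) -Force
        }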

    Read the article

  • I want to copy all the files available in my TFS source server to a folder in a directory

    - by deep
        PS C:\Windows\System32> Get-TfsItemProperty $/MyFirstTFSProj -r -server xyzc011b |
            Where { $_.CheckinDate -gt (Get-Date).AddDays(-150) } |
            Copy-Item D:\john\application1 -Destination C:\Test -whatif

        Copy-Item : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
        At line:2 char:14
        + Copy-Item <<<< D:\Deepu\SilverlightApplication5 -Destination C:\Test -whatif
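
    The binding error happens because Copy-Item is already given a positional -Path, so the piped TFS items have no parameter left to bind to. If the goal is to copy the matching items' local workspace copies, routing the pipeline through ForEach-Object sidesteps the binding problem; a sketch assuming the returned items expose a LocalItem path (an assumption about Get-TfsItemProperty's output):

        Get-TfsItemProperty '$/MyFirstTFSProj' -r -Server xyzc011b |
            Where-Object { $_.CheckinDate -gt (Get-Date).AddDays(-150) } |
            ForEach-Object {
                # LocalItem (assumed property) would be the file's path in the local workspace
                Copy-Item -Path $_.LocalItem -Destination C:\Test -WhatIf
            }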

    Read the article
