Search Results

Search found 1743 results on 70 pages for 'powershell'.

  • SMO ConnectionContext.StatementTimeout setting is ignored

    - by Woody
    I am successfully using PowerShell with SMO to back up most databases. However, for several large databases I receive a timeout error: "System.Data.SqlClient.SqlException: Timeout expired". The timeout consistently occurs at 10 minutes. I have tried setting ConnectionContext.StatementTimeout to 0, to 6000, and to [System.Int32]::MaxValue; the setting made no difference. I have found a number of Google references which indicate that setting it to 0 makes the timeout unlimited. No matter what I try, the timeouts consistently occur at 10 minutes. I even set Remote Query Timeout on the server to 0 (via Management Studio) to no avail. Below is my SMO connection, where I set the timeout, and the actual backup function. Further below is the output from my script.

    UPDATE: Interestingly enough, I wrote the backup function in C# using VS 2008 and the timeout override does work within that environment. I am incorporating that C# process into my PowerShell script until I can find out why the timeout override does not work with just PowerShell. This is extremely annoying!

        function New-SMOconnection {
            Param ($server,
                   $ApplicationName = "PowerShell SMO",
                   [int]$StatementTimeout = 0)
            # Write-Debug "Function: New-SMOconnection $server $connectionname $commandtimeout"
            if (test-path variable:\conn) {
                $conn.connectioncontext.disconnect()
            } else {
                $conn = New-Object('Microsoft.SqlServer.Management.Smo.Server') $server
            }
            $conn.connectioncontext.applicationName = $applicationName
            $conn.ConnectionContext.StatementTimeout = $StatementTimeout
            $conn.connectioncontext.Connect()
            $conn
        }

        $smo = New-SMOConnection -server $server
        if ($smo.connectioncontext.isopen -eq $false) {
            Throw "Could not connect to server $($server)."
        }

        Function Backup-Database {
            Param([string]$dbname)
            $db = $smo.Databases.get_Item($dbname)
            if (!$db) { "Database $dbname was not found"; Return }
            $sqldir = $smo.Settings.BackupDirectory + "\$($smo.name -replace ("\\", "$"))"
            $s = ($server.Split('\'))[0]
            $basedir = "\\$s\" + $($sqldir -replace (":", "$"))
            $dt = get-date -format yyyyMMdd-HHmmss
            $dbbk = new-object ('Microsoft.SqlServer.Management.Smo.Backup')
            $dbbk.Action = 'Database'
            $dbbk.BackupSetDescription = "Full backup of " + $dbname
            $dbbk.BackupSetName = $dbname + " Backup"
            $dbbk.Database = $dbname
            $dbbk.MediaDescription = "Disk"
            $target = "$basedir\$dbname\FULL"
            if (-not(Test-Path $target)) { New-Item $target -ItemType directory | Out-Null }
            $device = "$sqldir\$dbname\FULL\" + $($server -replace("\\", "$")) + "_" + $dbname + "_FULL_" + $dt + ".bak"
            $dbbk.Devices.AddDevice($device, 'File')
            $dbbk.Initialize = $True
            $dbbk.Incremental = $false
            $dbbk.LogTruncation = [Microsoft.SqlServer.Management.Smo.BackupTruncateLogType]::Truncate
            If (!$copyonly) {
                If ($kill) { $smo.KillAllProcesses($dbname) }
                $dbbk.SqlBackupAsync($server)
            }
            $dbbk
        }

    Output:

        Started SQL backups for server LCFSQLxxx\SQLxxx at 05/06/2010 15:33:16
        Statement TimeOut value set to 0.
        DatabaseName    : OperationsManagerDW
        StartBackupTime : 5/6/2010 3:33:16 PM
        EndBackupTime   : 5/6/2010 3:43:17 PM
        StartCopyTime   : 1/1/0001 12:00:00 AM
        EndCopyTime     : 1/1/0001 12:00:00 AM
        CopiedFiles     :
        Status          : Failed
        ErrorMessage    : System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. The backup or restore was aborted. 10 percent processed. 20 percent processed. 30 percent processed. 40 percent processed. 50 percent processed. 60 percent processed. 70 percent processed.
                          at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
                          at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
                          at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
                          at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
                          at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async)
                          at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)
                          at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
                          at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)
        Ended backups at 05/06/2010 15:43:23
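
    One diagnostic worth adding (a hypothetical check, not from the original post): read StatementTimeout back after Connect() to confirm the value survived, and compare the synchronous SqlBackup() call against SqlBackupAsync(), to rule out the async path using different connection settings.

        # Sketch only: confirm the override sticks, then try the synchronous API
        $smo = New-SMOConnection -server $server
        $smo.ConnectionContext.StatementTimeout    # expect 0 here after Connect()

        # Inside Backup-Database, in place of $dbbk.SqlBackupAsync($server):
        $dbbk.SqlBackup($smo)    # synchronous backup against the same Smo.Server object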

  • How can I set IIS Application Pool recycle times without resorting to the ugly syntax of Add-WebConfiguration?

    - by ObligatoryMoniker
    I have been scripting the configuration of our IIS 7.5 instance, and through bits and pieces of other people's scripts I have come up with a syntax that I like:

        $WebAppPoolUserName = "domain\user"
        $WebAppPoolPassword = "password"
        $WebAppPoolNames = @("Test","Test2")
        ForEach ($WebAppPoolName in $WebAppPoolNames) {
            $WebAppPool = New-WebAppPool -Name $WebAppPoolName
            $WebAppPool.processModel.identityType = "SpecificUser"
            $WebAppPool.processModel.username = $WebAppPoolUserName
            $WebAppPool.processModel.password = $WebAppPoolPassword
            $WebAppPool.managedPipelineMode = "Classic"
            $WebAppPool.managedRuntimeVersion = "v4.0"
            $WebAppPool | set-item
        }

    I have seen this done a number of different ways that are less terse, and I like the way this syntax of setting object properties looks compared to something like what I see on TechNet:

        Set-ItemProperty 'IIS:\AppPools\DemoPool' -Name recycling.periodicRestart.requests -Value 100000

    One thing I haven't been able to figure out, though, is how to set up recycle schedules using this syntax. This command sets ApplicationPoolDefaults but is ugly:

        add-webconfiguration system.applicationHost/applicationPools/applicationPoolDefaults/recycling/periodicRestart/schedule -value (New-TimeSpan -h 1 -m 30)

    I have done this in the past through appcmd using something like the following, but I would really like to do all of this through PowerShell:

        %appcmd% set apppool "BusinessUserApps" /+recycling.periodicRestart.schedule.[value='01:00:00']

    I have tried:

        $WebAppPool.recycling.periodicRestart.schedule = (New-TimeSpan -h 1 -m 30)

    This has the odd effect of turning the .schedule property into a timespan until I use $WebAppPool = get-item iis:\AppPools\AppPoolName to refresh the variable. There is also $WebAppPool.recycling.periodicRestart.schedule.Collection, but there is no add() function on the collection and I haven't found any other way to modify it. Does anyone know of a way I can set scheduled recycle times using syntax consistent with the code I have written above?
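
    One pattern that may get close without Add-WebConfiguration (a sketch, untested in the original environment; the pool name is a placeholder): New-ItemProperty from the WebAdministration module can append to the schedule collection directly.

        Import-Module WebAdministration

        # Append a 01:30 recycle time to the pool's schedule collection
        New-ItemProperty 'IIS:\AppPools\Test' `
            -Name recycling.periodicRestart.schedule `
            -Value @{value = '01:30:00'}

        # Read it back to confirm
        (Get-ItemProperty 'IIS:\AppPools\Test' -Name recycling.periodicRestart.schedule).Collection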

  • Dynamically add Server 2008 NLB Nodes

    - by Nick Jacques
    Hi all, I have a small NLB cluster for terminal servers. One of the things we're looking at doing for this particular project (this is for a college class) is dynamically creating terminal servers. What we've done is create policies for a certain OU that set the proper TS farm properties and install the Terminal Server role and NLB feature. Now what we'd like to do is create a script to be run on our domain controller to add hosts to the pre-existing NLB cluster. On our Server 2008 R2 domain controller, I was thinking of running the following PowerShell script, which I've kind of hacked together. Any thoughts on whether it will work? Is there any way I can trigger this script to run on the DC once all the scripts to install roles are done on the various terminal servers? Thanks very much in advance!

        Import-Module NetworkLoadBalancingClusters
        $TermServs = @()
        $Interface = "Local Area Connection"
        $ou = [ADSI]"LDAP://OU=Term Servs,DC=example,DC=com"
        foreach ($child in $ou.psbase.Children)
        {
            if ($child.ObjectCategory -like '*computer*') { $TermServs += $child.Name }
        }
        foreach ($TS in $TermServs)
        {
            Get-NlbCluster 172.16.0.254 | Add-NlbClusterNode -NewNodeName $TS -NewNodeInterface $Interface
        }
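
    One way to handle the triggering (a sketch, assuming PowerShell remoting is enabled on the DC; "dc01" is a placeholder): have each terminal server invoke the join itself as the last step of its own role-install script, so nothing on the DC needs to poll.

        # Run at the end of each terminal server's setup script
        Invoke-Command -ComputerName dc01 -ScriptBlock {
            param($node)
            Import-Module NetworkLoadBalancingClusters
            Get-NlbCluster 172.16.0.254 |
                Add-NlbClusterNode -NewNodeName $node -NewNodeInterface "Local Area Connection"
        } -ArgumentList $env:COMPUTERNAME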

  • How do I increase maximum attachment size in Exchange 2007 SP1?

    - by AspNyc
    I've been looking all over for a relatively simple answer to a fairly straightforward question: how do I increase the maximum size of attachments that can be sent and/or received in Exchange 2007? But I have yet to find a solution that works. We have a pretty straightforward setup: Exchange 2007 SP1 running on a single server, with the OWA role delegated to a second server. We did a clean install of Exchange 2007 a year or two ago; we did not upgrade from a previous version. I forget if we installed RTM and then patched it to SP1, or if we installed with SP1 already baked in. I just thought I'd mention those items in case they influence the answer. So far, I've tried running the following PowerShell commands on the main Exchange server and verified that they've taken effect:

        Set-TransportConfig -MaxReceiveSize 40MB
        Set-ReceiveConnector "RcvConnector" -MaxMessageSize 40MB
        Set-MaxReceiveSize "MailboxName" -MaxReceiveSize 40MB

    As of right now, though, the specified mailbox is still rejecting messages over 10MB. You get brownie points if you can also tell me how to set the default mailbox attachment size limits, so that new accounts don't get a default MaxReceiveSize of "unlimited" as they currently do. Any advice or suggestions would be greatly appreciated. Thanks in advance!
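
    An audit worth running (a sketch, with placeholder mailbox names; message size limits live at several layers and any one of them can still enforce 10MB):

        Get-TransportConfig  | Format-List MaxReceiveSize, MaxSendSize
        Get-ReceiveConnector | Format-List Name, MaxMessageSize
        Get-SendConnector    | Format-List Name, MaxMessageSize
        Get-Mailbox "MailboxName" | Format-List MaxReceiveSize, MaxSendSize

        # Per-mailbox limits are set with Set-Mailbox rather than a dedicated cmdlet:
        Set-Mailbox "MailboxName" -MaxReceiveSize 40MB -MaxSendSize 40MB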

  • Loop through several servers, find specific DLLs, and get the DLL version, internal filename and path?

    - by Graham
    I am a newbie to PowerShell, using PS v2. I can see the massive potential it has, but I just can't get the following code to work fully. I am trying to end up with a CSV file that contains the wildcarded DLLs found in GAC_MSIL or a sub-directory, with each DLL's version, internal filename and path, and the server IP address. The code is below, and it is in single-line format because it appears easier to remote onto one of the servers in the server farm and run the single line from that console, due to security log-ins etc. The code has produced a set of results, but only for the last server; it probably does the first server, then overwrites it, but I am not sure about that. I have done a lot of reading about using arrays and custom objects, and had a go at doing that, but my scripting skills in PS are not yet up to it.

        $out = "Ouput_dll_ver_results.csv"
        foreach ($server in '11.222.33.123', '11.222.33.124') {
            $VersionInfo = (Get-ChildItem \\$server\C$\windows\assembly\GAC_MSIL -recurse `
                -Include abc*.dll, def*.dll, ghi*.dll, jkl*.dll |
                Where-Object { $_.FullName -notmatch "\\windows\\assembly\\temp\\" })
        }
        $VersionInfo | % { Get-Command $_.FullName } | select -expand File* | Export-Csv $out

    Can you please advise if/how the above code can be corrected, and if not, what alternatives I have to get the information I need. Many thanks in advance. Graham
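
    A minimal sketch of one fix (untested; it moves the collection inside the loop and records the server with each row, reading version details straight from each file's VersionInfo):

        $results = foreach ($server in '11.222.33.123', '11.222.33.124') {
            Get-ChildItem "\\$server\C`$\windows\assembly\GAC_MSIL" -Recurse `
                -Include abc*.dll, def*.dll, ghi*.dll, jkl*.dll |
                Where-Object { $_.FullName -notmatch '\\windows\\assembly\\temp\\' } |
                ForEach-Object {
                    New-Object PSObject -Property @{
                        Server       = $server
                        Path         = $_.FullName
                        FileVersion  = $_.VersionInfo.FileVersion
                        InternalName = $_.VersionInfo.InternalName
                    }
                }
        }
        $results | Export-Csv 'Ouput_dll_ver_results.csv' -NoTypeInformation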

  • How can I both pipe and display output in Windows' command line?

    - by Bob
    I have a process I need to run within a batch file. This process produces some output. I need to both display this output to the screen and send (pipe) it to another program. The bash method uses tee:

        echo 'ee' | tee /dev/tty | foo

    Is there an equivalent for Windows? I am happy to use PowerShell if necessary. There are tee ports for Windows, but there does not appear to be an equivalent for /dev/tty, which complicates matters. The specific use-case here: I have a program (launch4j) that I need to run, displaying output to the user. At the same time, I need to be able to detect success or failure in the script. Unfortunately, this program does not set an exit code, and I cannot force it to do so. My current workaround involves piping to find, to search the output (launch4j config.xml | find "Successfully created") - however, that swallows the output I need to display. Therefore, I need some way to both display output on the screen and send it to a command - and this command should be able to set ERRORLEVEL (it cannot run asynchronously).
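
    A PowerShell sketch (assuming the "Successfully created" marker from the question; Tee-Object displays the output as it passes it along):

        launch4j config.xml 2>&1 | Tee-Object -Variable out
        if ($out -match 'Successfully created') { exit 0 } else { exit 1 }

    Called from the batch file as powershell -noprofile -file check-launch4j.ps1 (a hypothetical script name), the exit code then lands in ERRORLEVEL.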

  • How do I make wallpaper fit both monitors in dual monitor setup?

    - by Ben
    I am deploying some custom corporate wallpaper as part of a Windows 7 rollout. Some people will be using dual monitors, and the additional monitors may be either 4:3 or widescreen. I want to use the same wallpaper on both screens (i.e. two copies of the same wallpaper, not one image stretched across both).

    If I set the background to "stretch", it uses the aspect ratio of the primary monitor to stretch the wallpaper on both monitors. So, for example, if I have a dual monitor setup using a 4:3 TFT as primary and my (widescreen) laptop LCD as secondary, the image shows on the laptop LCD in 4:3, with a black stripe down either side.

    I've only noticed this as an issue with my "custom" wallpaper. Both the default MS wallpaper and the built-in Lenovo wallpaper don't seem to have this issue. Is that achieved by "trickery", such as using an image larger than the largest resolution you will have and centering it (i.e. so you crop out part of the image)? Or can this be done "properly"? I don't want to use 3rd party software to do this, but would happily do a bit of PowerShell scripting if this would solve the issue. Thanks in advance, Ben

  • OPATH syntax to force a dynamic distribution group field into a numerical comparison? (Exchange 2010)

    - by Matt
    I'm upgrading a (working) query-based group (Exchange 2003) to a new and 'improved' dynamic distribution group (2010). For better or worse, our company decided to store everyone's employee ID in the pager field, so it's easy to manipulate via ADUC. That employee number has significance: all employees are in a certain range, and all contractors are in a very different range. Basically, the new OPATH syntax appears to be doing a string comparison on my pager field, even though it holds a number. Let's say my employee ID is 3004 - well, it's "less than" 4 from a string-comparison point of view.

        Set-DynamicDistributionGroup -Identity "my-funky-new-group" -RecipientFilter "(pager -lt 4) -and (pager -like '*') -and (RecipientType -eq 'UserMailbox')"

    Shows up in EMC with this:

        ((((((Pager -lt '4') -and (Pager -ne $null))) -and (RecipientType -eq 'UserMailbox'))) -and (-not(Name -like 'SystemMailbox{*')) -and (-not(Name -like 'CAS_{*')) -and (-not(RecipientTypeDetailsValue -eq 'MailboxPlan')) -and (-not(RecipientTypeDetailsValue -eq 'DiscoveryMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'ArbitrationMailbox')))

    This group should have a max of 3 members, right? Nope - I get a ton because of the string compare. I show up, and I'm in the 3000 range. Question: does anyone know a clever way to force this to be an integer check? The read-only LDAP filter on this group looks good, but of course it can't be edited. The LDAP representation (look ma, no quotes on the 4!) - also interesting how it pads the filter with the (pager=4) term:

        (&(pager<=4)(!(pager=4))(pager=*)(objectClass=user)(objectCategory=person)(mailNickname=*)(msExchHomeServerName=*)(!(name=SystemMailbox{*))(!(name=CAS_{*))!(msExchRecipientTypeDetails=16777216))(!(msExchRecipientTypeDetails=536870912))(!(msExchRecipientTypeDetails=8388608)))

    If there is no solution, I suppose my recourse is either finding an unused field that actually will be treated as an integer, or most likely building this list with PowerShell every morning with my own automation - lame. I know of a few ways to fix this outside of the OPATH filter (designate "full-time" in another field, etc.), but would rather have Exchange do the lifting, given the environment at the moment. Any insight would be great - thanks! Matt
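
    A sketch of the "build it every morning" fallback (group name is a placeholder; the numeric compare is done client-side by casting the pager field to [int]):

        $employees = Get-User -ResultSize Unlimited |
            Where-Object { $_.Pager -match '^\d+$' -and [int]$_.Pager -lt 4 -and
                           $_.RecipientType -eq 'UserMailbox' }

        # Rebuild a static distribution group from the result
        $group = 'Employees-Static'
        Get-DistributionGroupMember $group | ForEach-Object {
            Remove-DistributionGroupMember $group -Member $_.Identity -Confirm:$false
        }
        $employees | ForEach-Object { Add-DistributionGroupMember $group -Member $_.Identity }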

  • Legitimacy of the tasks in the Task Scheduler

    - by Eyad
    Is there a way to know the source and legitimacy of the tasks in the Task Scheduler in Windows Server 2008 and 2003? Can I check if a task was added by Microsoft (e.g. from SCCM) or by a third-party application? For each task in the Task Scheduler, I want to verify that the task has not been created by a third-party application. I only want to allow standard Microsoft tasks and disable all other non-standard tasks. I have created a PowerShell script that goes through all the XML files in the C:\Windows\System32\Tasks directory, and I was able to read all the XML task files successfully, but I am stuck on how to validate the tasks. Here is the script for your reference:

        Function TaskSniper() {
            # Getting all the files in the Tasks folder
            $files = Get-ChildItem "C:\Windows\System32\Tasks" -Recurse | Where-Object { !$_.PSIsContainer };
            [Xml] $StandardXmlFile = Get-Content "Edit Me";
            foreach ($file in $files) {
                # constructing the file path
                $path = $file.DirectoryName + "\" + $file.Name
                # reading the file as an XML doc
                [Xml] $xmlFile = Get-Content $path
                # DS SEE: http://social.technet.microsoft.com/Forums/en-US/w7itprogeneral/thread/caa8422f-6397-4510-ba6e-e28f2d2ee0d2/
                # (get-authenticodesignature C:\Windows\System32\appidpolicyconverter.exe).status -eq "valid"
                # Display something
                $xmlFile.Task.Settings.Hidden
            }
        }

    Thank you
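
    Building on the Authenticode hint in the script's comments, a sketch (assumptions: each task has a simple <Exec> action, and a valid signature on the target binary is an acceptable proxy for "standard"):

        foreach ($file in $files) {
            [Xml]$xmlFile = Get-Content $file.FullName
            $cmd = $xmlFile.Task.Actions.Exec.Command
            if ($cmd) {
                $exe = [Environment]::ExpandEnvironmentVariables($cmd.Trim('"'))
                $sig = Get-AuthenticodeSignature $exe -ErrorAction SilentlyContinue
                "{0}`t{1}`t{2}" -f $file.Name, $exe, $sig.Status
            }
        }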

  • Office365 Exchange: Cannot open two shared calendars in Outlook

    - by Mark Williams
    The problem: Outlook won't open the calendars in another user's mailbox or a room mailbox, even when users have permission. Note: this problem is affecting more than one account on more than one machine. So I have a room mailbox and a personal mailbox on Exchange, both with shared calendars. There is a security group called "Scheduling Users" that has editor rights on both of these calendars. The room mailbox was created using PowerShell, per the instructions posted online (http://help.outlook.com/140/ee441202.aspx). Sharing worked on both of these folders initially, and users can still access these folders using OWA. So on to the problem. When users try to open these calendars in Outlook, they receive one of the following messages:

        The set of folders cannot be opened. Microsoft Exchange is not available. Either there are network problems or the Exchange server is down for maintenance.

        Cannot open this item. Cannot open the free/busy information. The attempt to log on to Microsoft Exchange has failed.

    What I have tried so far: resetting the permissions on both of the mailboxes (I deleted the security group permissions on both mailboxes, applied the change, then waited a bit and gave the permissions back), and deleting the OST file of the shared calendar from the Outlook data directory. That is all I have been able to find online. Any thoughts? I have been going back and forth with the Office365 support folks for a while and they seem stumped too.
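
    For the record, a sketch of re-applying the rights at the calendar-folder level from remote PowerShell, rather than at the mailbox level (mailbox and group names are placeholders):

        Remove-MailboxFolderPermission -Identity "room1:\Calendar" -User "Scheduling Users" -Confirm:$false
        Add-MailboxFolderPermission    -Identity "room1:\Calendar" -User "Scheduling Users" -AccessRights Editor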

  • Exchange 2010 OWA: Open Other User's Mailbox

    - by Benjamin Jones
    I just started working for this small firm (30 people) a little while ago, replacing their system admin. The first thing I noticed was that Exchange Server 2010 was WAY out of date - believe it or not, they did not have SP1 installed. So after I installed and configured Exchange 2010 SP3 and redirected OWA, I noticed something in OWA: I could add ANYONE's user mailbox WITHOUT being granted mailbox permission. I created a couple of test users - same thing. I even had another employee provide me access to their OWA, and they could open anyone's inbox without granting permission. I don't want to play the blame game, but I was SHOCKED that this was going on. Luckily, being such a small company, I'll be able to cover this mistake that I did not create - BUT HOW? My guess is that I need to find out where the previous system admin went wrong in granting Full Access permission. Or could this be an auto-mapping issue? I found this article: http://technet.microsoft.com/en-us/library/hh529943.aspx. This might work:

        $FixAutoMapping = Get-MailboxPermission sharedmailbox |
            where {$_.AccessRights -eq "FullAccess" -and $_.IsInherited -eq $false}
        $FixAutoMapping | Remove-MailboxPermission
        $FixAutoMapping | ForEach {Add-MailboxPermission -Identity $_.Identity -User $_.User -AccessRights:FullAccess -AutoMapping $false}

    However, how do I run the above code in PowerShell? Again, I was thrown into this mess and I'm just trying to iron out this tangled mess.
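
    The snippet runs as-is in the Exchange Management Shell, with "sharedmailbox" replaced by a real mailbox identity. A sketch of sweeping every mailbox in one pass (hypothetical, and worth testing against a single mailbox first):

        Get-Mailbox -ResultSize Unlimited | ForEach-Object {
            $fix = Get-MailboxPermission $_.Identity |
                   Where-Object { $_.AccessRights -eq "FullAccess" -and $_.IsInherited -eq $false }
            $fix | Remove-MailboxPermission -Confirm:$false
            $fix | ForEach-Object {
                Add-MailboxPermission -Identity $_.Identity -User $_.User `
                    -AccessRights:FullAccess -AutoMapping $false
            }
        }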

  • RD Gateway reporting features/capabilities

    - by Don
    We have just implemented RD Gateway for our own department in preparation for a push to the whole agency for telecommuting. It is all set up and working great, but I was trying to figure out how best to go about monitoring/reporting of users. I see third-party software out there that will do it, but is there anything built in, or available via PowerShell/scripting, that would give me a report of the daily activity of users? Something that says "User A connected at this time, was on for this long, sent/received this much data" - basically some of the same stuff you can see in Event Viewer.

    Ideally I'd like to have this set up so that once a day it emails me the daily usage; then, when a supervisor asks whether their person is actually working (or at least online, sending and receiving x amount of data), I'll have some metrics to give them. I realize that actual work output is relevant and more of a managerial issue, but I would like to be able to offer as much as I can from my end when asked. Thoughts? Thanks!
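
    A sketch of the scripted route (addresses and SMTP server are placeholders): RD Gateway writes connect/disconnect events to its operational event log, which a daily scheduled task could summarize and mail.

        $events = Get-WinEvent -FilterHashtable @{
            LogName   = 'Microsoft-Windows-TerminalServices-Gateway/Operational'
            StartTime = (Get-Date).AddDays(-1)
        }
        $body = $events | Format-Table TimeCreated, Id, Message -AutoSize | Out-String -Width 200
        Send-MailMessage -To 'admin@example.com' -From 'rdgw@example.com' `
            -Subject 'RD Gateway daily usage' -Body $body -SmtpServer 'mail.example.com'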

  • Script to list users' mapped drives gives no results and no error

    - by user223631
    We are in the process of migrating two file servers to a new server. We have mapped drives via user group in Group Policy. Many users have manually mapped drives, and we need to find these mappings. I have created a PowerShell script to run that remotely gets the drive mappings. It works on most computers, but there are many that return no results, and I am not getting any error messages. Each workstation on the list creates a text file, and the ones that are not returning results have no text in their files. I can ping these machines. If a machine is not turned on, it does produce an error message that the RPC server is not available. My domain user account is in a group that is in the local admin group. I have no idea why some are not working. Here is the script:

        # Load list into variable, which will become an array of strings
        If( !(Test-Path C:\Scripts)) { New-Item C:\Scripts -ItemType directory }
        If( !(Test-Path C:\Scripts\Computers)) { New-Item C:\Scripts\Computers -ItemType directory }
        If( !(Test-Path C:\Scripts\Workstations.txt)) { "No Workstations found. Please enter a list of Workstations under Workstation.txt"; Return }
        If( !(Test-Path C:\Scripts\KnownMaps.txt)) { "No Mapping to check against. Please enter a list of Known Mappings under KnownMaps.txt"; Return }
        $computerlist = Get-Content C:\Scripts\Workstations.txt

        # Loop through each item in the array (each computer in the list of computers we loaded into the variable)
        ForEach ($computer in $computerlist) {
            $diskObject = Get-WmiObject Win32_MappedLogicalDisk -computerName $computer |
                Select Name,ProviderName | Out-File C:\Tester\Computers\$computer.txt -width 200
        }

        Select-String -Path C:\Tester\Computers\*.txt -Pattern cmsfiles | Out-File C:\Tester\Drivemaps-all.txt
        $strings = Get-Content C:\Tester\KnownMaps.txt
        Select-String -Path C:\Tester\Drivemaps-all.txt -Pattern $strings -notmatch -simplematch | Out-File C:\Tester\Drivemaps-nonmatch.txt -Width 200
        Select-String -Path C:\Tester\Drivemaps-all.txt -Pattern $strings -simplematch | Out-File C:\Tester\Drivemaps-match.txt -Width 200
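
    A sketch that surfaces errors instead of letting them pass silently (same query, wrapped in try/catch; note as a possible lead that Win32_MappedLogicalDisk only reports drives for sessions currently logged on, so an empty file may simply mean no one was logged in on that workstation):

        ForEach ($computer in $computerlist) {
            try {
                Get-WmiObject Win32_MappedLogicalDisk -ComputerName $computer -ErrorAction Stop |
                    Select-Object Name, ProviderName |
                    Out-File "C:\Tester\Computers\$computer.txt" -Width 200
            }
            catch {
                "$computer : $($_.Exception.Message)" | Out-File C:\Tester\Errors.txt -Append
            }
        }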

  • What does the -MailboxCredential parameter mean?

    - by cotablise
    Hello, I am writing regarding the Exchange PowerShell commands. When I want to use the following cmdlets, I have to supply the -MailboxCredential parameter:

        Test-OwaConnectivity
        Test-OutlookWebServices
        Test-ImapConnectivity
        Test-PopConnectivity

    The official Microsoft site says: "The MailboxCredential parameter specifies the mailbox credential for a single URL test." I am not sure why this parameter is needed. I inserted incorrect credentials, yet the command finished successfully. Could you tell me why this parameter is needed? Example with wrong/incorrect credentials:

        [PS] C:\>Test-WebServicesConnectivity -ClientAccessServer EXhub1 -MailboxCredential (Get-Credential blablabla)

        CasServer LocalSite     Scenario  Result  Latency(MS) Error
        --------- ---------     --------  ------  ----------- -----
        EXhub1    Default-Fi... GetFolder Failure             [System.Net.WebExcept...

    Without the parameter:

        [PS] C:\>Test-WebServicesConnectivity -ClientAccessServer EXhub1
        WARNING: Test user 'extest_91ef41d34eef4' isn't accessible, so this cmdlet won't be able to test Client Access server connectivity.
        Could not find or sign in with user ********\extest_91ef41d34eef4. If this task is being run without credentials, sign in as a Domain Administrator, and then run Scripts\new-TestCasConnectivityUser.ps1 to verify that the user exists on Mailbox server EXHUB1.******
            + CategoryInfo          : ObjectNotFound: (:) [Test-WebServicesConnectivity], CasHealthCouldN...edInfoException
            + FullyQualifiedErrorId : FB9A14B6,Microsoft.Exchange.Monitoring.TestWebServicesConnectivity
        WARNING: No Client Access servers were tested.

    Thank you in advance

  • Anyone have a script to delete a specific local windows profile?

    - by Jordan Weinstein
    I'm looking for a PowerShell (preferred), .CMD, or .VBS script to delete a specific user profile on a workstation (WinXP) or terminal server (2000, '03 or '08). I know all about the delprof utility - that only allows you to delete based on a period of inactivity. I want a script that prompts the admin for a username and then deletes that username's profile - the entire profile, registry hive included (not just the folder structure within Documents and Settings), the same way it would be deleted if you went to My Computer > Properties > Advanced tab > User Profiles > Settings and deleted the profile from there.

    Any ideas? All I can think of is doing an AD lookup to get the SID of the user specified, then using that to delete the correct registry hive too... something simpler would be nice, though. Basically, my help desk used to be local administrators on our Citrix servers, and a common fix for various issues was for them to delete a user's profile on the Citrix server(s) and have that user log back in - voila, whatever issue they had was resolved. Going forward, in the new Citrix environment, they will no longer be local admins on those boxes, but they still need to be able to delete profiles (deleting the entire profile - folder and registry hive - is key). Thanks.
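
    On Vista/2008 and later, the Win32_UserProfile WMI class deletes both the folder and the registry hive in one call; a sketch (it will not work on XP/2000, where delprof or manual hive deletion is still needed):

        $user = Read-Host "Username whose local profile should be deleted"
        Get-WmiObject Win32_UserProfile |
            Where-Object { $_.LocalPath -like "*\$user" } |
            ForEach-Object { $_.Delete() }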

  • Expire Files In A Folder: Delete Files After x Days

    - by Brett G
    I'm looking to make a "drop folder" on a Windows shared drive that is accessible to everyone. I'd like files to be deleted automagically if they sit in the folder for more than X days. However, it seems like all the methods I've found to do this use the last modified date, last access time, or creation date of a file.

    I'm trying to make this a folder that a user can drop files in to share with somebody. If someone copies or moves files into it, I'd like the clock to start ticking at that point. However, the last modified date and creation date of a file will not be updated unless someone actually modifies the file, and the last access time is updated too frequently - it seems that just opening a directory in Windows Explorer will update the last access time.

    Anyone know of a solution to this? I'm thinking that cataloging the hash of files on a daily basis and then expiring files based on hashes older than a certain date might be a solution... but taking hashes of files can be time-consuming. Any ideas would be greatly appreciated!

    Note: I've already looked at quite a lot of answers on here - File Server Resource Manager, PowerShell scripts, batch scripts, etc. They still use the last access time, last modified time or creation time, which, as described, do not fit the above needs.
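
    A sketch of the cataloging idea, but keyed on path and first-seen time instead of content hashes, which keeps it cheap (paths and retention are placeholders; run daily as a scheduled task):

        $folder  = '\\server\DropFolder'
        $catalog = 'C:\Scripts\dropfolder-seen.csv'
        $days    = 14

        $seen = @{}
        if (Test-Path $catalog) {
            Import-Csv $catalog | ForEach-Object { $seen[$_.Path] = [datetime]$_.FirstSeen }
        }

        # Record newly arrived files
        Get-ChildItem $folder -Recurse | Where-Object { !$_.PSIsContainer } |
            ForEach-Object { if (-not $seen.ContainsKey($_.FullName)) { $seen[$_.FullName] = Get-Date } }

        # Expire anything first seen more than $days ago
        foreach ($path in @($seen.Keys)) {
            if (-not (Test-Path $path)) { $seen.Remove($path); continue }
            if (((Get-Date) - $seen[$path]).Days -ge $days) { Remove-Item $path -Force; $seen.Remove($path) }
        }

        $seen.GetEnumerator() |
            ForEach-Object { New-Object PSObject -Property @{ Path = $_.Key; FirstSeen = $_.Value } } |
            Export-Csv $catalog -NoTypeInformation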

  • How to get physical partition name from iSCSI details on Windows?

    - by Barry Kelly
    I've got a piece of software that needs the name of a partition in \Device\Harddisk2\Partition1 style, as shown e.g. in WinObj. I want to get this partition name from the details of the iSCSI connection that underlies the partition. The trouble is that disk order is not fixed: depending on what devices are connected and initialized in what order, it can move around. So suppose I have the portal name (DNS of the iSCSI target), target IQN, etc. I'd like to somehow discover which volumes in the system relate to it, in an automated fashion. I can write some PowerShell WMI queries that get somewhat close to the desired info:

        PS> get-wmiobject -class Win32_DiskPartition

        NumberOfBlocks   : 204800
        BootPartition    : True
        Name             : Disk #0, Partition #0
        PrimaryPartition : True
        Size             : 104857600
        Index            : 0
        ...

    From the Name here, I think I can fabricate the corresponding name by adding 1 to the partition number: \Device\Harddisk0\Partition1 - Partition0 appears to be a fake partition mapping to the whole disk. But the above doesn't have enough information to map to the underlying physical device, unless I take a guess based on exact size matching. I can get some info on SCSI devices, but it's not helpful in joining things up (the iSCSI target is Nexenta/Solaris COMSTAR):

        PS> get-wmiobject -class Win32_SCSIControllerDevice

        __GENUS    : 2
        __CLASS    : Win32_SCSIControllerDevice
        ...
        Antecedent : \\COBRA\root\cimv2:Win32_SCSIController.DeviceID="ROOT\\ISCSIPRT\\0000"
        Dependent  : \\COBRA\root\cimv2:Win32_PnPEntity.DeviceID="SCSI\\DISK&VEN_NEXENTA&PROD_COMSTAR...

    Similarly, I can run queries like these:

        PS> get-wmiobject -namespace ROOT\WMI -class MSiSCSIInitiator_TargetClass
        PS> get-wmiobject -namespace ROOT\WMI -class MSiSCSIInitiator_PersistentDevices

    These guys return information relating to my iSCSI target name and the GUID volume name respectively (a volume name like \\?\Volume{guid-goes-here}), but the GUID volume name is no good to me, and there doesn't appear to be a reliable correspondence between the target name and the volume that I can join on. I simply can't find an easy way of getting from an IQN (e.g. iqn.1992-01.com.example:storage:diskarrays-sn-a8675309) to the physical partitions mapped from that target.

    The way I do it by hand? I start Disk Management and look for a partition of the correct size, verify that its driver says NEXENTA COMSTAR, and look at the disk number. But even this is unreliable if I have multiple iSCSI volumes of exactly the same size. Any suggestions?
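
    A heavily hedged sketch of one more avenue: MSiSCSIInitiator_SessionClass instances carry a Devices property per session, and on the assumption that its entries expose the legacy device name (e.g. \\.\PhysicalDrive2), the disk number could be parsed out and joined to the IQN:

        $iqn = 'iqn.1992-01.com.example:storage:diskarrays-sn-a8675309'
        Get-WmiObject -Namespace ROOT\WMI -Class MSiSCSIInitiator_SessionClass |
            Where-Object { $_.TargetName -eq $iqn } |
            ForEach-Object { $_.Devices } |
            ForEach-Object {
                $n = $_.LegacyName -replace '\D'    # '\\.\PhysicalDrive2' -> '2'
                "\Device\Harddisk$n"                # partitions then hang off this name
            }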

  • OpenRemoteBaseKey() credentials

    - by sgibbons
    I'm attempting to use PowerShell to access a remote registry like so:

        $reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey("LocalMachine", $server)
        $key = $reg.OpenSubkey($subkeyPath)

    Depending on some factors that I'm not yet able to determine, I get either

        Exception calling "OpenSubKey" with "1" argument(s): "Requested registry access is not allowed."

    or

        System.UnauthorizedAccessException: Attempted to perform an unauthorized operation.
            at Microsoft.Win32.RegistryKey.Win32ErrorStatic(Int32 errorCode, String str)
            at Microsoft.Win32.RegistryKey.OpenRemoteBaseKey(RegistryHive hKey, String machineName)

    It seems pretty clear that this is because the user I'm running the PowerShell script as doesn't have the appropriate credentials to access the remote registry. I'd like to be able to supply a set of credentials to use for the remote registry access, but I can find no documentation anywhere of a way to do this. I'm also not clear on exactly where to specify which users are allowed to access the registry remotely.
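
    OpenRemoteBaseKey authenticates as the calling process's identity and takes no credential argument, so one workaround is to launch the script itself under an account that has the rights (a sketch; the account and script path are placeholders, and the Remote Registry service must be running on the target):

        $cred = Get-Credential DOMAIN\registryAdmin
        Start-Process powershell.exe -Credential $cred `
            -ArgumentList '-File', 'C:\Scripts\Read-RemoteRegistry.ps1'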

  • Make Excel defined names within a worksheet global

    - by idazuwaika
    Hi, I wrote a PowerShell script to copy a worksheet from workbook A to workbook B. The worksheet contains defined names for ranges within that sheet. Originally, the defined names are global in workbook A, i.e. they can be referenced from any worksheet within workbook A. But now, after the copy to workbook B, the defined names are limited to that worksheet only. How do I programmatically (via PowerShell script, preferably) make all those named ranges global, i.e. referenceable from all worksheets within workbook B? Some code for clarity:

        # Script to update SOP from 5.1 to 5.2
        $missing = [System.Type]::Missing

        # Open files
        $excel = New-Object -Com Excel.Application
        $excel.Visible = $False
        $excel.DisplayAlerts = $False
        $newTemplate = "C:\WorkbookA.xls"
        $wbTemplate = $excel.Workbooks.Open($newTemplate)
        $oldSop = "C:\WorkbookB.xls"
        $wbOldSop = $excel.Workbooks.Open($oldSop)

        # Delete 'DATA' worksheet from old file
        $wsOldData = $wbOldSop.Worksheets.Item("DATA")
        $wsOldData.Delete()

        # Copy new 'DATA' worksheet to old file
        $wbTemplate.Worksheets.Item("DATA").Copy($missing, $wbOldSop.Worksheets.Item("STATUS"))

        # Save
        $wbOldSop.Save()
        $wbOldSop.Close()

        # Quit Excel
        $excel.Quit()
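
    A sketch of promoting the copied sheet's local names to workbook scope, run after the Copy() call and before the save (assumptions: every local name is of the form 'DATA!Name' and should keep its RefersTo):

        $wsNew = $wbOldSop.Worksheets.Item("DATA")
        foreach ($n in @($wsNew.Names)) {
            $bare = $n.Name.Split('!')[-1]                  # strip the sheet prefix
            $ref  = $n.RefersTo
            $n.Delete()                                     # remove the sheet-scoped name
            $wbOldSop.Names.Add($bare, $ref) | Out-Null     # re-add at workbook scope
        }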

  • Console utility that can remotely connect to other systems and issue commands

    - by vivekeviv
    Hi, my work frequently involves using VNC to reach a remote system and work on it. Since I run command-line apps on this remote system most of the time, I was wondering if there's an alternative to the command prompt that I could install on my local machine. Using it, I should be able to create a session with the remote system, and from then on all commands issued in the local prompt should run on the remote system:

        localhost> dir      (should list the directory contents of the remote host's current directory)
        localhost> app.exe  (should run app.exe on the remote host and display its output in the local prompt)

    I browsed a little and read about cmdlets in PowerShell, but it looks like I would need to write a cmdlet for each app in the path (dir, mkdir, app.exe, and so on) - correct me if I'm wrong. Once the session is established, I simply need the commands invoked on the local host to run on the remote host and return the console output to the local host. Please let me know if PowerShell plus cmdlets is the only way. Thanks
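
    PowerShell v2 remoting does exactly this without custom cmdlets; a sketch ("remotehost" is a placeholder, and Enable-PSRemoting must be run once on the remote system):

        Enter-PSSession -ComputerName remotehost
        # From here, dir, mkdir, app.exe etc. all execute on remotehost,
        # with their console output displayed locally
        Exit-PSSession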

  • How to prevent a command/script from changing the global environment

    - by guillermooo
    I need to run scriptblocks/scripts from the current top-level shell, and I want them to leave the global environment unmodified. So far, I've only been able to think of the following possibilities:

        powershell -file <script>
        powershell -noprofile -command <scriptblock>

    The problem is that they are very slow. For instance, I would like to be able to do:

        mkdir newdir
        cd newdir
        $env:NEW_VAR = 100
        ni -item f 'newfile.txt'

    ...so that my shell's working directory wouldn't change and $env:NEW_VAR wouldn't be set in the global environment. Are there any more alternatives to accomplish this?
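
    One more alternative worth timing (a sketch; a background job also runs in a separate process, so whether it beats powershell -command is an open question, but the parent's working directory and environment stay untouched either way):

        $job = Start-Job -ScriptBlock {
            mkdir newdir | Out-Null
            cd newdir
            $env:NEW_VAR = 100
            ni -ItemType file 'newfile.txt'
        }
        Wait-Job $job | Receive-Job
        # $pwd and $env:NEW_VAR in the calling shell are unchanged here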

  • C# - Ensuring an assembly is called via a specified assembly

    - by Adam Driscoll
    Is there any built-in functionality to determine whether an assembly is being called from a particular assembly? I have assembly A, which references assembly B. Assembly A exposes PowerShell cmdlets and outputs types that are found within B. Certain methods and properties within types exposed by B are of interest to types in assembly A, but not of interest to consumers of PowerShell or anyone loading types from B directly and calling methods within it. I have looked into InternalsVisibleToAttribute, but it would require extensive rework because of the use of interfaces. I was devising a shared-key system that would later be obfuscated, but that seemed clunky. Is there any way to ensure B is called only by A?

  • Copying files from one Windows server to another

    - by Saju Pillai
    I have to copy a file from a Windows 2008 server to one or more Windows 2008 servers. I have accounts on the target machines with enough privileges to let me use PowerShell remoting and WMI. The remote machines do not run FTP, SSH or similar file-transfer mechanisms. I am not allowed to install software or run new services on the target servers, but I can run services on the source server. The file copy must be initiated from the source server; i.e. I cannot manually log on to the target machines and initiate the copy, though an automated way to do this is acceptable. Is it possible to use WMI or PowerShell remoting to push or pull the file from the source to the target? Is it possible to invoke some sort of built-in HTTP client, or the BITS service/agent on the remote servers, to pull files from the source server? Other suggestions please.
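
    A sketch of the remoting push (paths and host are placeholders; the file's bytes travel over the session, which is workable for modest sizes but slow for large files):

        $bytes = [IO.File]::ReadAllBytes('C:\deploy\payload.zip')
        Invoke-Command -ComputerName target01 -ScriptBlock {
            param($data, $path)
            [IO.File]::WriteAllBytes($path, $data)
        } -ArgumentList $bytes, 'C:\incoming\payload.zip'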
