Search Results

Search found 20515 results on 821 pages for 'wmi service'.


  • Process-to-port mapping with SNMP and/or wmi/wmic in java

    - by Niddy888
    I'm trying to use SNMP to map outgoing ports on my host computer to the application on that computer that is responsible for the communication. When running "netstat -ano" I get access to Protocol, Local Address (with port), Foreign Address (with port), State and PID, but I want to do this entirely without having to execute "cmd" from Java. Using SNMP OID .1.3.6.1.2.1.25.4 (.iso.org.dod.internet.mgmt.mib-2.host.hrSWRun) I get access to PID (e.g. 1704), Name (e.g. cmd.exe) and Path (e.g. C:\Windows\system32), among others. There is an SNMP OID .1.3.6.1.2.1.6.13 (.iso.org.dod.internet.mgmt.mib-2.tcp.tcpConnTable) that gives access to TCP connection state, local address, local port, remote address and remote port, but NO PID. So, to sum up, my question is: is there a way to "map" these tables together, either directly in SNMP with other OIDs, or in conjunction with WMI / WMIC?
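    On the WMI side, one class that exposes the missing piece directly is MSFT_NetTCPConnection in the root\StandardCimv2 namespace, which lists each TCP connection together with its OwningProcess (the PID that netstat -ano reports). The sketch below is only an illustration of that query: it assumes Windows 8 / Server 2012 or later (where the class exists) and is written in C#, since WMI is not reachable from the standard JDK without a bridge or without shelling out.

        // Sketch: port-to-PID mapping via WMI, assuming root\StandardCimv2
        // and the MSFT_NetTCPConnection class (Windows 8 / Server 2012+).
        using System;
        using System.Management;

        class PortToProcessSketch
        {
            static void Main()
            {
                var scope = new ManagementScope(@"\\.\root\StandardCimv2");
                var query = new ObjectQuery(
                    "SELECT LocalAddress, LocalPort, RemoteAddress, RemotePort, OwningProcess " +
                    "FROM MSFT_NetTCPConnection");

                using (var searcher = new ManagementObjectSearcher(scope, query))
                {
                    foreach (ManagementObject conn in searcher.Get())
                    {
                        // OwningProcess can be joined against Win32_Process
                        // (the WMI counterpart of hrSWRun) for the image name.
                        Console.WriteLine("{0}:{1} -> {2}:{3}  PID {4}",
                            conn["LocalAddress"], conn["LocalPort"],
                            conn["RemoteAddress"], conn["RemotePort"],
                            conn["OwningProcess"]);
                    }
                }
            }
        }

    From Java the same query would still have to go through an external process or a COM/WMI bridge, which is exactly the dependency the question tries to avoid, so treat this as background rather than a drop-in answer.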

    Read the article

  • Opening my application when a usb device is inserted on Windows using WMI

    - by rsteckly
    Hi, I'm trying to launch an event when someone plugs in a USB device. For now, I'm content to simply print something to the console (in the finished product, it will launch an application). This code is very loosely adapted from: http://serverfault.com/questions/115496/use-wmi-to-detect-a-usb-drive-was-connected-regardless-of-whether-it-was-mounted There are two problems: 1) I need to pass the argument to ManagementScope dynamically, because this will be installed on computers I don't use or whose name I don't know. 2) I'm getting an invalid namespace exception when I call w.Start(). Any ideas what I'm doing wrong?

    static ManagementEventWatcher w = null;

    static void Main(string[] args)
    {
        AddInstUSBHandler();
        for (;;) ;
    }

    public static void USBRemoved(object sneder, EventArgs e)
    {
        Console.WriteLine("A USB device inserted");
    }

    static void AddInstUSBHandler()
    {
        WqlEventQuery q;
        ManagementScope scope = new ManagementScope("HQ\\DEV1");
        scope.Options.EnablePrivileges = true;
        q = new WqlEventQuery();
        q.EventClassName += "_InstanceCreationEvent";
        q.WithinInterval = new TimeSpan(0, 0, 3);
        q.Condition = @"TargetInstance ISA 'Win32_USBControllerdevice'";
        w = new ManagementEventWatcher(scope, q);
        w.EventArrived += new EventArrivedEventHandler(USBRemoved);
        w.Start();
    }
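    For comparison, here is a minimal sketch of how this kind of watcher is usually wired up. It is not the poster's final code; it assumes the scope should point at a WMI namespace such as root\cimv2 (the invalid-namespace error above is consistent with the "HQ\\DEV1" path containing only a machine name and no namespace) and that the event class name should be the full __InstanceCreationEvent:

        // Sketch: watch for USB controller device arrival on the local machine.
        // For a remote machine the scope would be @"\\MACHINE\root\cimv2" plus
        // ConnectionOptions with credentials.
        using System;
        using System.Management;

        class UsbInsertWatcherSketch
        {
            static void Main()
            {
                var scope = new ManagementScope(@"\\.\root\cimv2");
                scope.Options.EnablePrivileges = true;

                var query = new WqlEventQuery("__InstanceCreationEvent",
                    new TimeSpan(0, 0, 3),
                    "TargetInstance ISA 'Win32_USBControllerDevice'");

                using (var watcher = new ManagementEventWatcher(scope, query))
                {
                    watcher.EventArrived += (s, e) => Console.WriteLine("A USB device was inserted");
                    watcher.Start();

                    Console.WriteLine("Listening; press Enter to stop.");
                    Console.ReadLine();   // replaces the busy for(;;) loop
                    watcher.Stop();
                }
            }
        }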

    Read the article

  • Resolving the WMI DNS Host Name

    - by Stephen Murby
    I am trying to make a comparison between a machine name I have retrieved from AD and the DNS host name I want to get from the machine using WMI. I currently have:

    foreach (SearchResult oneMachine in allMachinesCollected)
    {
        pcName = oneMachine.Properties["name"][0].ToString();
        ConnectionOptions setupConnection = new ConnectionOptions();
        setupConnection.Username = USERNAME;
        setupConnection.Password = PASSWORD;
        setupConnection.Authority = "ntlmdomain:DOMAIN";
        ManagementScope setupScope = new ManagementScope("\\\\" + pcName + "\\root\\cimv2", setupConnection);
        setupScope.Connect();
        ObjectQuery dnsNameQuery = new ObjectQuery("SELECT * FROM Win32_ComputerSystem");
        ManagementObjectSearcher dnsNameSearch = new ManagementObjectSearcher(setupScope, dnsNameQuery);
        ManagementObjectCollection allDNSNames = dnsNameSearch.Get();
        string dnsHostName;
        foreach (ManagementObject oneName in allDNSNames)
        {
            dnsHostName = oneName.Properties["DNSHostName"].ToString();   // << ManagementException "Not found" here
            if (dnsHostName == pcName)
            {
                shutdownMethods.ShutdownMachine(pcName, USERNAME, PASSWORD);
                MessageBox.Show(pcName + " has been sent the reboot command");
            }
        }
    }

    But I get a ManagementException at the marked line saying "Not found". Any ideas?
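    Two things are worth noting here, offered as a sketch rather than a definitive fix. First, Properties["DNSHostName"] returns a PropertyData object, so even when the lookup succeeds, ToString() on it does not yield the property value (the value lives in .Value). Second, a "Not found" ManagementException from that indexer usually means the property does not exist on the remote machine's Win32_ComputerSystem class at all, which can be the case on older Windows versions. A defensive helper along these lines (assuming a scope that is already connected, as in the question) reads the value when present and falls back to the NetBIOS name otherwise:

        using System;
        using System.Management;

        static class WmiDnsNameSketch
        {
            // Returns DNSHostName from Win32_ComputerSystem when the property
            // exists and has a value, otherwise falls back to the Name property.
            public static string GetDnsHostName(ManagementScope connectedScope)
            {
                var query = new ObjectQuery("SELECT * FROM Win32_ComputerSystem");
                using (var searcher = new ManagementObjectSearcher(connectedScope, query))
                {
                    foreach (ManagementObject cs in searcher.Get())
                    {
                        foreach (PropertyData p in cs.Properties)
                        {
                            if (p.Name == "DNSHostName" && p.Value != null)
                                return p.Value.ToString();
                        }
                        return cs["Name"].ToString();   // older systems without DNSHostName
                    }
                }
                return null;
            }
        }

    When comparing the result against the AD name, a case-insensitive comparison (string.Equals with StringComparison.OrdinalIgnoreCase) is also safer than ==, since DNS host names and AD names often differ only in casing.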

    Read the article

  • Using a service registry that doesn’t suck Part III: Service testing is part of SOA governance

    - by gsusx
    This is the third post of this series, intended to highlight some of the principles of a modern SOA governance solution. You can read the first two parts here: Using a service registry that doesn’t suck part I: UDDI is dead Using a service registry that doesn’t suck part II: Dear registry, do you have to be a message broker? This time I’ve decided to focus on one of the aspects that drives me ABSOLUTELY INSANE about traditional SOA governance solutions: service testing, or should I say the lack of...(read more)

    Read the article

  • WMI/VBS/HTML System Information Script

    - by Methical
    Hey guys; havin' a problem with this code here; can't seem to work out whats goin' wrong with it. All other variables seem to print fine in the HTML ouput; but I get an error that relates to the cputype variable. I get the following error C:\Users\Methical\Desktop\sysinfo.vbs(235,1) Microsoft VBScript runtime error: Invalid procedure call or argument I think it has somethin' to do with this line here fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>CPU</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & cputype & "</i></td></tr>" If i delete this line; the script compiles and outputs with no errors. Here is the full code below Dim strComputer, objWMIService, propValue, objItem Dim strUserName, strPassword, colItems, SWBemlocator ' This section querries for the workstation to be scanned. UserName = "" Password = "" strComputer = "127.1.1.1" ImgDir = "C:\Scripts\images\" 'Sets up the connections and opjects to be used throughout the script. Set SWBemlocator = CreateObject("WbemScripting.SWbemLocator") Set objWMIService = SWBemlocator.ConnectServer(,"root\CIMV2",strUserName,strPassword) 'This determines the current date and time of the PC being scanned. Set colItems = objWMIService.ExecQuery("SELECT * FROM Win32_LocalTime", "WQL", wbemFlagReturnImmediately + wbemFlagForwardOnly) For Each objItem in colItems If objItem.Minute < 10 Then theMinutes = "0" & objItem.Minute Else theMinutes = objItem.Minute End If If objItem.Second < 10 Then theSeconds = "0" & objItem.Second Else theSeconds = objItem.Second End If DateTime = objItem.Month & "/" & objItem.Day & "/" & objItem.Year & " - " & objItem.Hour & ":" & theMinutes & ":" & theSeconds Next 'Gets some ingomation about the Operating System including Service Pack level. Set colItems = objWMIService.ExecQuery("Select * from Win32_OperatingSystem",,48) For Each objItem in colItems WKID = objItem.CSName WKOS = objItem.Caption CSD = objItem.CSDVersion Architecture = objItem.OSArchitecture SysDir = objItem.SystemDirectory SysDrive = objItem.SystemDrive WinDir = objItem.WindowsDirectory ServicePack = objItem.ServicePackMajorVersion & "." & objItem.ServicePackMinorVersion Next 'This section returns the Time Zone Set colItems = objWMIService.ExecQuery("Select * from Win32_TimeZone") For Each objItem in colItems Zone = objItem.Description Next 'This section displays the Shadow Storage information Set colItems = objWMIService.ExecQuery("Select * from Win32_ShadowStorage") For Each objItem in colItems Allocated = int((objItem.AllocatedSpace/1024)/1024+1) UsedSpace = int((objItem.UsedSpace/1024)/1024+1) MaxSpace = int((objItem.MaxSpace/1024)/1024+1) Next 'This section returns the InstallDate of the OS Set objSWbemDateTime = _ CreateObject("WbemScripting.SWbemDateTime") Set colOperatingSystems = _ objWMIService.ExecQuery _ ("Select * from Win32_OperatingSystem") For Each objOperatingSystem _ in colOperatingSystems objSWbemDateTime.Value = _ objOperatingSystem.InstallDate InstallDate = _ objSWbemDateTime.GetVarDate(False) Next 'This section returns the Video card and current resolution. Set colItems = objWMIService.ExecQuery("Select * from Win32_DisplayConfiguration",,48) For Each objItem in colItems VideoCard = objItem.DeviceName Resolution = objItem.PelsWidth & " x " & objItem.PelsHeight & " x " & objItem.BitsPerPel & " bits" Next 'This section returns the Video card memory. 
Set objWMIService = GetObject("winmgmts:root\cimv2") Set colItems = objWMIService.ExecQuery ("Select * from Win32_VideoController") For Each objItem in colItems VideoMemory = objItem.AdapterRAM/1024/1024 Next 'This returns various system information including current logged on user, domain, memory, manufacture and model. Set colItems = objWMIService.ExecQuery("Select * from Win32_ComputerSystem",,48) For Each objItem in colItems UserName = objItem.UserName Domain = objItem.Domain TotalMemory = int((objItem.TotalPhysicalMemory/1024)/1024+1) Manufacturer = objItem.Manufacturer Model = objItem.Model SysType = objItem.SystemType Next 'This determines the total hard drive space and free hard drive space. Set colItems = objWMIService.ExecQuery("Select * from Win32_LogicalDisk Where Name='C:'",,48) For Each objItem in colItems FreeHDSpace = Fix(((objItem.FreeSpace/1024)/1024)/1024) TotalHDSpace = Fix(((objItem.Size/1024)/1024)/1024) Next 'This section returns the default printer and printer port. Set colItems = objWMIService.ExecQuery("SELECT * FROM Win32_Printer where Default=True", "WQL", wbemFlagReturnImmediately + wbemFlagForwardOnly) For Each objItem in colItems Printer = objItem.Name PortName = objItem.PortName Next 'This returns the CPU information. Set colItems = objWMIService.ExecQuery("SELECT * FROM Win32_Processor", "WQL", wbemFlagReturnImmediately + wbemFlagForwardOnly) For Each objItem in colItems CPUDesc = LTrim(objItem.Name) Next '// CPU Info For each objCPU in GetObject("winmgmts:{impersonationLevel=impersonate}\\" & strComputer & "\root\cimv2").InstancesOf("Win32_Processor") Select Case objCPU.Family Case 2 cputype = "Unknown" Case 11 cputype = "Pentium brand" Case 12 cputype = "Pentium Pro" Case 13 cputype = "Pentium II" Case 14 cputype = "Pentium processor with MMX technology" Case 15 cputype = "Celeron " Case 16 cputype = "Pentium II Xeon" Case 17 cputype = "Pentium III" Case 28 cputype = "AMD Athlon Processor Family" Case 29 cputype = "AMD Duron Processor" Case 30 cputype = "AMD2900 Family" Case 31 cputype = "K6-2+" Case 130 cputype = "Itanium Processor" Case 176 cputype = "Pentium III Xeon" Case 177 cputype = "Pentium III Processor with Intel SpeedStep Technology" Case 178 cputype = "Pentium 4" Case 179 cputype = "Intel Xeon" Case 181 cputype = "Intel Xeon processor MP" Case 182 cputype = "AMD AthlonXP Family" Case 183 cputype = "AMD AthlonMP Family" Case 184 cputype = "Intel Itanium 2" Case 185 cputype = "AMD Opteron? Family" End Select Next 'This returns the current uptime (time since last reboot) of the system. Set colOperatingSystems = objWMIService.ExecQuery ("Select * from Win32_OperatingSystem") For Each objOS in colOperatingSystems dtmBootup = objOS.LastBootUpTime dtmLastBootupTime = WMIDateStringToDate(dtmBootup) dtmSystemUptime = DateDiff("h", dtmLastBootUpTime, Now) Uptime = dtmSystemUptime Next Function WMIDateStringToDate(dtmBootup) WMIDateStringToDate = CDate(Mid(dtmBootup, 5, 2) & "/" & Mid(dtmBootup, 7, 2) & "/" & Left(dtmBootup, 4) & " " & Mid (dtmBootup, 9, 2) & ":" & Mid(dtmBootup, 11, 2) & ":" & Mid(dtmBootup,13, 2)) End Function dim objFSO Set objFSO = CreateObject("Scripting.FileSystemObject") ' -- The heart of the create file script ----------------------- ' -- Creates the file using the value of strFile on Line 11 ' -------------------------------------------------------------- Set fileOutput = objFSO.CreateTextFile( "x.html", true ) 'Set fileOutput = objExplorer.Document 'This is the code for the web page to be displayed. 
fileOutput.WriteLine "<html>" fileOutput.WriteLine " <head>" fileOutput.WriteLine " <title>System Information for '" & WKID & "' </title>" fileOutput.WriteLine " </head>" fileOutput.WriteLine " <body bgcolor='#FFFFFF' text='#000000' link='#0000FF' vlink='000099' alink='#00FF00'>" fileOutput.WriteLine " <center>" fileOutput.WriteLine " <h1>System Information for " & WKID & "</h1>" fileOutput.WriteLine " <table border='0' cellspacing='1' cellpadding='1' width='95%'>" fileOutput.WriteLine " <tr><td background='" & ImgDir & "blue_spacer.gif'>" fileOutput.WriteLine " <table border='0' cellspacing='0' cellpadding='0' width='100%'>" fileOutput.WriteLine " <tr><td>" fileOutput.WriteLine " <table border='0' cellspacing='0' cellpadding='0' width='100%'>" fileOutput.WriteLine " <tr>" fileOutput.WriteLine " <td width='5%' align='left' valign='middle' background='" & ImgDir & "blue_spacer.gif'><img src='" & ImgDir & "write.gif'></td>" fileOutput.WriteLine " <td width='95%' align='left' valign='middle' background='" & ImgDir & "blue_spacer.gif'> <font color='#FFFFFF' size='5'>WKInfo - </font><font color='#FFFFFF' size='3'>General information on the Workstation.</font></td>" fileOutput.WriteLine " </tr>" fileOutput.WriteLine " <tr><td colspan='2' bgcolor='#FFFFFF'>" fileOutput.WriteLine " <TABLE width='100%' cellspacing='0' cellpadding='2' border='1' bordercolor='#c0c0c0' bordercolordark='#ffffff' bordercolorlight='#c0c0c0'>" fileOutput.WriteLine" <tr height=2><td height=10 align=center bgcolor=midnightblue colspan=3></td></tr>" fileOutput.WriteLine " <TR><TD align='center' bgcolor='#d0d0d0' colspan='2'><b><h3>Date and Time</h3></b></TD></TR>" fileOutput.WriteLine" <tr height=2><td height=10 align=center bgcolor=midnightblue colspan=3></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Date/Time</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & DateTime & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>System Uptime</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & Uptime & " hours</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Time Zone</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & Zone & " </i></td></tr>" fileOutput.WriteLine" <tr height=2><td height=10 align=center bgcolor=midnightblue colspan=3></td></tr>" fileOutput.WriteLine " <TR><TD align='center' bgcolor='#d0d0d0' colspan='2'><b><h3>General Computer Information</h3></b></TD></TR>" fileOutput.WriteLine" <tr height=2><td height=10 align=center bgcolor=midnightblue colspan=3></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Manufacturer</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & Manufacturer & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Model</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & Model & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>System Based</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & SysType & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Operating System</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & WKOS & " " & CSD & " " & Architecture & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Operating System Install Date</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & InstallDate & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' 
bgcolor='#e0e0e0'>UserName</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & UserName & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Workstation Name</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & WKID & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Domain</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & Domain & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>System Drive</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & SysDrive & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>System Directory</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & SysDir & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Windows Directory</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & WinDir & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>ShadowStorage Allocated Space</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & Allocated & " MB</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>ShadowStorage Used Space</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & UsedSpace & " MB</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>ShadowStorage Max Space</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & MaxSpace & " MB</i></td></tr>" fileOutput.WriteLine" <tr height=2><td height=10 align=center bgcolor=midnightblue colspan=3></td></tr>" fileOutput.WriteLine " <TR><TD align='center' bgcolor='#d0d0d0' colspan='2'><b><h3>General Hardware Information</h3></b></TD></TR>" fileOutput.WriteLine" <tr height=2><td height=10 align=center bgcolor=midnightblue colspan=3></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>CPU</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & cputype & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Memory</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & TotalMemory & " MB</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Total HDD Space</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & TotalHDSpace & " GB</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Free HDD Space</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & FreeHDSpace & " GB</i></td></tr>" fileOutput.WriteLine" <tr height=2><td height=10 align=center bgcolor=midnightblue colspan=3></td></tr>" fileOutput.WriteLine " <TR><TD align='center' bgcolor='#d0d0d0' colspan='2'><b><h3>General Video Card Information</h3></b></TD></TR>" fileOutput.WriteLine" <tr height=2><td height=10 align=center bgcolor=midnightblue colspan=3></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Video Card</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & VideoCard & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Resolution</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & Resolution & "</i></td></tr>" fileOutput.WriteLine " <TR><TD width='30%' align='left' bgcolor='#e0e0e0'>Memory</TD><td width='70%' bgcolor=#f0f0f0 align=left><i>" & VideoMemory & " MB</i></td></tr>" 'This section lists all the current services and their status. 
fileOutput.WriteLine " <TR><TD align='center' bgcolor='#d0d0d0' colspan='2'><b><h3>Current Service Information</h3></b></TD></TR>" fileOutput.WriteLine " <tr><td colspan='2' bgcolor='#f0f0f0'>" fileOutput.WriteLine " <TABLE width='100%' cellspacing='0' cellpadding='2' border='1' bordercolor='#c0c0c0' bordercolordark='#ffffff' bordercolorlight='#c0c0c0'>" fileOutput.WriteLine " <TR><TD width='70%' align='center' bgcolor='#e0e0e0'><b>Service Name</b></td><TD width='30%' align='center' bgcolor='#e0e0e0'><b>Service State</b></td><tr>" Set colRunningServices = objWMIService.ExecQuery("Select * from Win32_Service") For Each objService in colRunningServices fileOutput.WriteLine " <TR><TD align='left' bgcolor='#f0f0f0'>" & objService.DisplayName & "</TD><td bgcolor=#f0f0f0 align=center><i>" & objService.State & "</i></td></tr>" wscript.echo " <TR><TD align='left' bgcolor='#f0f0f0'>" & objService.DisplayName & "</TD><td bgcolor=#f0f0f0 align=center><i>" & objService.State & "</i></td></tr>" Next fileOutput.WriteLine " </table>" fileOutput.WriteLine " </td></tr>" 'This section lists all the current running processes and some information. fileOutput.WriteLine " <TR><TD align='center' bgcolor='#d0d0d0' colspan='2'><b><h3>Current Process Information</h3></b></TD></TR>" fileOutput.WriteLine " <tr><td colspan='2' bgcolor='#f0f0f0'>" fileOutput.WriteLine " <TABLE width='100%' cellspacing='0' cellpadding='2' border='1' bordercolor='#c0c0c0' bordercolordark='#ffffff' bordercolorlight='#c0c0c0'>" fileOutput.WriteLine " <TR><TD width='10%' align='center' bgcolor='#e0e0e0'><b>PID</b></td><TD width='35%' align='center' bgcolor='#e0e0e0'><b>Process Name</b></td><TD width='40%' align='center' bgcolor='#e0e0e0'><b>Owner</b></td><TD width='15%' align='center' bgcolor='#e0e0e0'><b>Memory</b></td></tr>" Set colProcessList = objWMIService.ExecQuery("Select * from Win32_Process") For Each objProcess in colProcessList colProperties = objProcess.GetOwner(strNameOfUser,strUserDomain) fileOutput.WriteLine " <TR><TD align='center' bgcolor='#f0f0f0'>" & objProcess.Handle & "</td><TD align='center' bgcolor='#f0f0f0'>" & objProcess.Name & "</td><TD align='center' bgcolor='#f0f0f0'>" & strUserDomain & "\" & strNameOfUser & "</td><TD align='center' bgcolor='#f0f0f0'>" & objProcess.WorkingSetSize/1024 & " kb</td><tr>" Next fileOutput.WriteLine " </table>" fileOutput.WriteLine " </td></tr>" 'This section lists all the currently installed software on the machine. 
fileOutput.WriteLine " <TR><TD align='center' bgcolor='#d0d0d0' colspan='2'><b><i>Installed Software</i></b></TD></TR>" fileOutput.WriteLine " <tr><td colspan='2' bgcolor='#f0f0f0'>" Set colSoftware = objWMIService.ExecQuery ("Select * from Win32_Product") For Each objSoftware in colSoftware fileOutput.WriteLine" <TABLE width='100%' cellspacing='0' cellpadding='2' border='1' bordercolor='#c0c0c0' bordercolordark='#ffffff' bordercolorlight='#c0c0c0'>" fileOutput.WriteLine" <tr><td width=30% align=center bgcolor='#e0e0e0'><b>Name</b></td><td width=30% align=center bgcolor='#e0e0e0'><b>Vendor</b></td><td width=30% align=center bgcolor='#e0e0e0'><b>Version</b></td></tr>" fileOutput.WriteLine" <tr><td align=center bgcolor=#f0f0f0>" & objSoftware.Name & "</td><td align=center bgcolor=#f0f0f0>" & objSoftware.Vendor & "</td><td align=center bgcolor=#f0f0f0>" & objSoftware.Version & "</td></tr>" fileOutput.WriteLine" <tr height=2><td height=10 align=center bgcolor=midnightblue colspan=3></td></tr>" fileOutput.WriteLine" </table>" Next fileOutput.WriteLine " </td></tr>" fileOutput.WriteLine " </table>" fileOutput.WriteLine " </td></tr>" fileOutput.WriteLine " </table>" fileOutput.WriteLine " </td></tr>" fileOutput.WriteLine " </table>" fileOutput.WriteLine " </td></tr>" fileOutput.WriteLine " </table>" fileOutput.WriteLine " <p><small></small></p>" fileOutput.WriteLine " </center>" fileOutput.WriteLine " </body>" fileOutput.WriteLine "<html>" fileOutput.close WScript.Quit

    Read the article

  • SQL Server Service Broker Service Disappearing (Automatically Deleted)?

    - by mwigdahl
    I've implemented a messaging system over SQL Server Service Broker. It is working great, with the sole exception that every once in a while (maybe once per week per server) my initiator service just vanishes without a trace. The corresponding queue is still there, but the service is missing. Obviously this causes problems in my system. It's a simple matter to recreate the service by hand, but I'm confused as to what might cause this behavior. I understand that automatic poison message handling causes queues to be disabled, but I don't see anything that indicates services can be disabled or deleted automatically. When this happens, I usually have a large backlog of messages in multiple application queues, but nothing extreme. Total message backlog is around 200,000. Does anyone know what might be happening here?

    Read the article

  • Android Design - Service vs Thread for Networking

    - by Nevyn
    I am writing an Android app, finally (yay me), and for this app I need persistent, but user-closeable, network sockets (yes, more than one). I decided to try my hand at writing my own version of an IRC client. My design issue, however, is that I'm not sure how to run the socket connectivity itself. If I put the sockets at the Activity level, they keep getting closed shortly after the Activity becomes non-visible (also a problem that needs solving... but I think I figured that one out). But if I run a "connectivity service", I need to find out if I can have multiple instances of it running (the service, that is... one per server/socket). Either that or I need a way to thread the sockets themselves and have multiple threads running that I can still communicate with directly (an ID system of some sort). Thus the question: is it 'better', or at least more "proper" design, to put the socket and networking code in a Service and have the Activities consume said Service, or should I tie the sockets directly to some threaded process owned by the UI Activity and not bother with the Service implementation at all? I do know better than to put the networking directly on the UI thread, but that's as far as I've managed to get.

    Read the article

  • Prevent service from starting

    - by Evan Plaice
    So, I do Arduino development on my system, programming Arduinos through the FTDI USB programming interface (if you have no idea what this means, don't worry). The issue arises because the FTDI interface uses tty to communicate, and it conflicts with one of the default Ubuntu services. The default service in question is called brltty (which enables braille accessibility for people with impaired vision). Considering that I don't have any particular use for this service and it's annoying to stop it manually (using 'service brltty stop') after I restart my system: where would I configure this (or any) service to prevent it from loading on startup? Note: I also have this issue with qemu-kvm conflicting with VirtualBox.

    Read the article

  • WCF and Service Registry

    - by TK Lee
    I am about to build some WCF services. Those services need to communicate with each other too, in some scenarios. I've done some "Google-ing" about service registries but can't figure out how to implement a service registry with WCF; is there any alternative? Is there any Microsoft technology available for a service registry? I'm new to SOA and I would really appreciate any help or guidance (what exactly should I look for, and where, regarding registry services).

    Read the article

  • Oracle Communications Service Broker is now available at http://edelivery.oracle.com/EPD/Download/ge

    - by francois.deza
    Oracle Communications Service Broker is now available at http://edelivery.oracle.com/EPD/Download/get_form?egroup_aru_number=12359008 and documented at http://edelivery.oracle.com/EPD/Download/get_form?egroup_aru_number=12359013 See also white paper "Transforming Service Delivery with Oracle Service Brokering" at http://www.oracle.com/us/products/servers-storage/servers/netra-carrier-grade/060194.pdf

    Read the article

  • Zookeeper naming service [closed]

    - by kolchanov
    I need a recommendation for a naming service implementation. We are an ISV and we have a lot of applications (services) with very different protocols, such as HTTP (REST), low-level TCP, AMQP, Diameter, telco protocols Rx, Ry, Ud and many others. We want to simplify the configuration, deployment and service discovery process, and it seems it's time to create a central configuration registry. So I have a few questions:
    - Is ZooKeeper suitable for this purpose?
    - Does a more suitable or more specialized solution exist?
    - What is best practice for service naming for discovery? Any standards?
    - Any recommendation for the service configuration data structure?
    We are also keeping in mind future tasks for dynamic application distribution in a private cloud. Could you share your real-life experience?

    Read the article

  • Python service using Upstart on Ubuntu

    - by Soumya Simanta
    I want to deploy a heartbeat service (a Python script) as a service using Upstart. My understanding is that I have to add an /etc/init/myheartbeatservice.conf with the following contents.

    # my heartbeat service
    description "Heartbeat monitor"

    start on startup
    stop on shutdown

    script
        exec /path/to/my/python/script.py
    end script

    My script starts another service process, then monitors the processes and sends a heartbeat to an outside server regularly. Are startup and shutdown the correct events? Also, my script creates a new thread. I'm assuming I also need to add a fork/daemon stanza to my conf file? Thanks.

    Read the article

  • Free web "caching" services for a web service

    - by Jason Banico
    I have a web service on Google App engine whose data is updated on a daily basis. To minimize bandwidth utilization from mobile clients connecting to it, I'd like to instead have an intermediary site where the clients will be getting their data from, and minimizing hits to my service to once or twice a day only. Is there such a service I can use? I'd like to explore this "pull" option first, before considering "push" options such as publishing to a blog site or a free website host that doesn't have bandwidth caps.

    Read the article

  • Associate mount points with local disks in VBscript / WMI

    - by Mark Unwin
    I have a VBScript that outputs various config items about a system, both hardware and software. I can output disks and their associated partitions, and I can output mount points, but I do not seem to be able to associate a mount point with a local disk (where it actually is a local disk). I need to be able to do this using VBScript, so as to fit in with the rest of the ~2000 lines of code. I do NOT want to run some other program graphically. I know the Disk Management console can show me this (My Computer - Manage - Disk Management), but that is not what I need; I need to be able to do this remotely via VBScript. I am open to running an .exe from the VBScript and piping the output back into VBScript and massaging it from there. Any ideas? Thanks in advance.

    Read the article

  • Does the WMI event Win32_VolumeChangeEvent work on Windows XP

    - by Christian Rodemeyer
    I'm trying to use the following C# code to detect the attach/remove events of USB mass storage devices, using Win32_VolumeChangeEvent:

    // Initialize an event watcher and subscribe to events that match this query
    var _watcher = new ManagementEventWatcher("select * from Win32_VolumeChangeEvent");
    _watcher.EventArrived += OnDeviceChanged;
    _watcher.Start();

    void OnDeviceChanged(object sender, EventArrivedEventArgs args)
    {
        Console.WriteLine(args.NewEvent.GetText(TextFormat.Mof));
    }

    The problem is that this works fine on Vista but it doesn't work on XP at all (no events are received). The Microsoft documentation says that this should work (http://msdn.microsoft.com/en-us/library/aa394516(VS.85).aspx). I googled this for quite a while and found others that have this problem too. But I also found a couple of articles which claim that this kind of query (mostly in VBScript) works with XP. I cannot find any official information from Microsoft on this issue, and I can't believe Microsoft would have overlooked it for three service packs. So my question is: has anybody used Win32_VolumeChangeEvent successfully on XP, or can anyone provide a link/explanation as to why it shouldn't work on XP?
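    A workaround that is often suggested when Win32_VolumeChangeEvent does not fire is to watch intrinsic instance events on Win32_LogicalDisk instead, which reports volumes appearing and disappearing through the generic __InstanceCreationEvent / __InstanceDeletionEvent mechanism. This is only a sketch of that alternative query, not a statement about official XP support:

        using System;
        using System.Management;

        class LogicalDiskWatchSketch
        {
            static void Main()
            {
                // Intrinsic events: a new Win32_LogicalDisk instance appears when a
                // volume is mounted and disappears when it is removed; WITHIN 2 sets
                // the polling interval in seconds.
                var query = new WqlEventQuery(
                    "SELECT * FROM __InstanceOperationEvent WITHIN 2 " +
                    "WHERE TargetInstance ISA 'Win32_LogicalDisk'");

                using (var watcher = new ManagementEventWatcher(query))
                {
                    watcher.EventArrived += (s, e) =>
                    {
                        var target = (ManagementBaseObject)e.NewEvent["TargetInstance"];
                        // ClassName distinguishes creation, deletion and modification events.
                        Console.WriteLine("{0}: {1}",
                            e.NewEvent.ClassPath.ClassName, target["DeviceID"]);
                    };
                    watcher.Start();
                    Console.ReadLine();
                    watcher.Stop();
                }
            }
        }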

    Read the article

  • Finding the Right Solution to Source and Manage Your Contractors

    - by mark.rosenberg(at)oracle.com
    Many of our PeopleSoft Enterprise applications customers operate in service-based industries, and all of our customers have at least some internal service units, such as IT, marketing, and facilities. Employing the services of contractors, often referred to as "contingent labor," to deliver either or both internal and external services is common practice. As we've transitioned from an industrial age to a knowledge age, talent has become a primary competitive advantage for most organizations. Contingent labor offers talent on flexible terms; it offers the ability to scale up operations, close skill gaps, and manage risk in the process of delivering services. Talent comes from many sources and the rise in the contingent worker (contractor, consultant, temporary, part time) has increased significantly in the past decade and is expected to reach 40 percent in the next decade. Managing the total pool of talent in a seamless integrated fashion not only saves organizations money and increases efficiency, but creates a better place for workers of all kinds to work. Although the term "contingent labor" is frequently used to describe both contractors and employees who have flexible schedules and relationships with an organization, the remainder of this discussion focuses on contractors. The term "contingent labor" is used interchangeably with "contractor." Recognizing the importance of contingent labor, our PeopleSoft customers often ask our team, "What Oracle vendor management system (VMS) applications should I evaluate for managing contractors?" In response, I thought it would be useful to describe and compare the three most common Oracle-based options available to our customers. They are:   The enterprise licensed software model in which you implement and utilize the PeopleSoft Services Procurement (sPro) application and potentially other PeopleSoft applications;  The software-as-a-service model in which you gain access to a derivative of PeopleSoft sPro from an Oracle Business Process Outsourcing Partner; and  The managed service provider (MSP) model in which staffing industry professionals utilize either your enterprise licensed software or the software-as-a-service application to administer your contingent labor program. At this point, you may be asking yourself, "Why three options?" The answer is that since there is no "one size fits all" in terms of talent, there is also no "one size fits all" for effectively sourcing and managing contingent workers. Various factors influence how an organization thinks about and relates to its contractors, and each of the three Oracle-based options addresses an organization's needs and preferences differently. For the purposes of this discussion, I will describe the options with respect to (A) pricing and software provisioning models; (B) control and flexibility; (C) level of engagement with contractors; and (D) approach to sourcing, employment law, and financial settlement. Option 1:  Enterprise Licensed Software In this model, you purchase from Oracle the license and support for the applications you need. Typically, you license PeopleSoft sPro as your VMS tool for sourcing, monitoring, and paying your contract labor. In conjunction with sPro, you can also utilize PeopleSoft Human Capital Management (HCM) applications (if you do not already) to configure more advanced business processes for recruiting, training, and tracking your contractors. 
Many customers choose this enterprise license software model because of the functionality and natural integration of the PeopleSoft applications and because the cost for the PeopleSoft software is explicit. There is no fee per transaction to source each contractor under this model. Our customers that employ contractors to augment their permanent staff on billable client engagements often find this model appealing because there are no fees to affect their profit margins. With this model, you decide whether to have your own IT organization run the software or have the software hosted and managed by either Oracle or another application services provider. Your organization, perhaps with the assistance of consultants, configures, deploys, and operates the software for managing your contingent workforce. This model offers you the highest level of control and flexibility since your organization can configure the contractor process flow exactly to your business and security requirements and can extend the functionality with PeopleTools. This option has proven very valuable and applicable to our customers engaged in government contracting because their contingent labor management practices are subject to complex standards and regulations. Customers find a great deal of value in the application functionality and configurability the enterprise licensed software offers for managing contingent labor. Some examples of that functionality are... The ability to create a tiered network of preferred suppliers including competencies, pricing agreements, and elaborate candidate management capabilities. Configurable alerts and online collaboration for bid, resource requisition, timesheet, and deliverable entry, routing, and approval for both resource and deliverable-based services. The ability to manage contractors with the same PeopleSoft HCM and Projects applications that are used to manage the permanent workforce. Because it allows you to utilize much of the same PeopleSoft HCM and Projects application functionality for contractors that you use for permanent employees, the enterprise licensed software model supports the deepest level of engagement with the contingent workforce. For example, you can: fill job openings with contingent labor; guide contingent workers through essential safety and compliance training with PeopleSoft Enterprise Learning Management; and source contingent workers directly to project-based assignments in PeopleSoft Resource Management and PeopleSoft Program Management. This option enables contingent workers to collaborate closely with your permanent staff on complex, knowledge-based efforts - R&D projects, billable client contracts, architecture and engineering projects spanning multiple years, and so on. With the enterprise licensed software model, your organization maintains responsibility for the sourcing, onboarding (including adherence to employment laws), and financial settlement processes. This means your organization maintains on staff or hires the expertise in these domains to utilize the software and interact with suppliers and contractors. Option 2:  Software as a Service (SaaS) The effort involved in setting up and operating VMS software to handle a contingent workforce leads many organizations to seek a system that can be activated and configured within a few days and for which they can pay based on usage. Oracle's Business Process Outsourcing partner, Provade, Inc., provides exactly this option to our customers. 
Provade offers its vendor management software as a service over the Internet and usually charges your organization a fee that is a percentage of your total contingent labor spending processed through the Provade software. (Percentage of spend is the predominant fee model, although not the only one.) In addition to lower implementation costs, the effort of configuring and maintaining the software is largely upon Provade, not your organization. This can be very appealing to IT organizations that are thinly stretched supporting other important information technology initiatives. Built upon PeopleSoft sPro, the Provade solution is tailored for simple and quick deployment and administration. Provade has added capabilities to clone users rapidly and has simplified business documents, like work orders and change orders, to facilitate enterprise-wide, self-service adoption with little to no training. Provade also leverages Oracle Business Intelligence Enterprise Edition (OBIEE) to provide integrated spend analytics and dashboards. Although pure customization is more limited than with the enterprise licensed software model, Provade offers a very effective option for organizations that are regularly on-boarding and off-boarding high volumes of contingent staff hired to perform discrete support tasks (for example, order fulfillment during the holiday season, hourly clerical work, desktop technology repairs, and so on) or project tasks. The software is very configurable and at the same time very intuitive to even the most computer-phobic users. The level of contingent worker engagement your organization can achieve with the Provade option is generally the same as with the enterprise licensed software model since Provade can automatically establish contingent labor resources in your PeopleSoft applications. Provade has pre-built integrations to Oracle's PeopleSoft and the Oracle E-Business Suite procurement, projects, payables, and HCM applications, so that you can evaluate, train, assign, and track contingent workers like your permanent employees. Similar to the enterprise licensed software model, your organization is responsible for the contingent worker sourcing, administration, and financial settlement processes. This means your organization needs to maintain the staff expertise in these domains. Option 3:  Managed Services Provider (MSP) Whether you are using the enterprise licensed model or the SaaS model, you may want to engage the services of sourcing, employment, payroll, and financial settlement professionals to administer your contingent workforce program. Firms that offer this expertise are often referred to as "MSPs," and they are typically staffing companies that also offer permanent and temporary hiring services. (In fact, many of the major MSPs are Oracle applications customers themselves, and they utilize the PeopleSoft Solution for the Staffing Industry to run their own business operations.) Usually, MSPs place their staff on-site at your facilities, and they can utilize either your enterprise licensed PeopleSoft sPro application or the Provade VMS SaaS software to administer the network of suppliers providing contingent workers. When you utilize an MSP, there is a separate fee for the MSP's service that is typically funded by the participating suppliers of the contingent labor. Also in this model, the suppliers of the contingent labor (not the MSP) usually pay the contingent labor force. 
With an MSP, you are intentionally turning over business process control for the advantages associated with having someone else manage the processes. The software option you choose will to a certain extent affect your process flexibility; however, the MSPs are often able to adapt their processes to the unique demands of your business. When you engage an MSP, you will want to give some thought to the level of engagement and "partnering" you need with your contingent workforce. Because the MSP acts as an intermediary, it can be very valuable in handling high volume, routine contracting for which there is a relatively low need for "partnering" with the contingent workforce. However, if your organization (or part of your organization) engages contingent workers for high-profile client projects that require diplomacy, intensive amounts of interaction, and personal trust, introducing an MSP into the process may prove less effective than handling the process with your own staff. In fact, in many organizations, it is common to enlist an MSP to handle contractors working on internal projects and to have permanent employees handle the contractor relationships that affect the portion of the services portfolio focused on customer-facing, billable projects. One of the key advantages of enlisting an MSP is that you do not have to maintain the expertise required for orchestrating the sourcing, hiring, and paying of contingent workers.  These are the domain of the MSPs. If your own staff members are not prepared to manage the essential "overhead" processes associated with contingent labor, working with an MSP can make solid business sense. Proper administration of a contingent workforce can make the difference between project success and failure, operating profit and loss, and legal compliance and fines. Concluding Thoughts There is little doubt that thoughtfully and purposefully constructing a service delivery strategy that leverages the strengths of contingent workers can lead to better projects, deliverables, and business results. What requires a bit more thinking is determining the platform (or platforms) that will enable each part of your organization to best deliver on its mission.

    Read the article

  • SO-Aware Service Explorer – Configure and Export your services from VS 2010 into the repository

    - by cibrax
    We have introduced a new Visual Studio tool called "Service Explorer" as part of the new SO-Aware SDK version 1.3 to help developers configure and export any regular WCF service into the SO-Aware service repository. This new tool is a regular Visual Studio tool window that can be opened from "View -> Other Windows -> Services Explorer". Once you open the Services Explorer, you will be able to see all the available WCF services in the Visual Studio solution. In the image above, you can see that a "HelloWorld" service was found in the solution and listed in the tool window on the left. There are two things you can do with a new service in the tool: you can either export it to the SO-Aware repository or associate it with an existing service version in the repository. Exporting the service to SO-Aware means that you want to create a new service version in the repository and associate the WCF service WSDL with that version. Associating the service means that you want to use a version already created in SO-Aware with the sole purpose of managing and centralizing the service configuration in SO-Aware. The option for exporting a service will pop up a dialog like the one below, in which you can enter some basic information about the service version you want to create and the repository location. The option for associating a service will pop up a dialog in which you can pick any existing service version in the repository and the application configuration file that you want to keep in sync for the service configuration. Two options are available for configuring a service: WCF Configuration or SO-Aware. The WCF Configuration option just tells the tool that the service will use the standard WCF configuration section "system.serviceModel", but that section must be updated and kept in sync with the configuration selected for the service in the repository. The SO-Aware configuration option tells the tool that the service configuration will be resolved at runtime from the repository. For example, selecting SO-Aware will generate the following configuration in the selected application configuration file:

    <configuration>
      <configSections>
        <section name="serviceRepository" type="Tellago.ServiceModel.Governance.ServiceConfiguration.ServiceRepositoryConfigurationSection, Tellago.ServiceModel.Governance.ServiceConfiguration" />
      </configSections>
      <serviceRepository url="http://localhost/soaware/servicerepository.svc">
        <services>
          <service name="ref:HelloWorldService(1.0)@dev" type="SOAwareSampleService.HelloWorldService" />
        </services>
      </serviceRepository>
    </configuration>

    As you can see, the tool is a great addition to the toolset that any developer can use to manage and centralize configuration for WCF services. In addition, it can be combined with other useful tools like WSCF.Blue (Web Service Contract First) for generating service artifacts like schemas, service code or the service WSDL itself.

    Read the article

  • From the Tips Box: Pre-installation Prep Work Makes Service Pack Upgrades Smoother

    - by Jason Fitzpatrick
    Last month Microsoft rolled out Windows 7 Service Pack 1 and, like many SP releases, quite a few people are hanging back to see what happens. If you want to update but still err on the side of caution, reader Ron Troy offers a step-by-step guide. Ron's cautious approach does an excellent job of minimizing the number of issues that could crop up in a Service Pack upgrade, by thoroughly updating your driver sets and clearing out old junk before you roll out the update. Read on to see how he does it:

    Just wanted to pass on a suggestion for people worried about installing Service Packs. I came up with a 'method' a couple of years back that seems to work well.

    1. Run Windows / Microsoft Update to get all updates EXCEPT the Service Pack.
    2. Use Secunia PSI to find any other updates you need.
    3. Use CCleaner or the Windows disk cleanup tools to get rid of all the old garbage out there. Make sure that you include old system updates.
    4. Obviously, back up anything you really care about. An image backup can be really nice to have if things go wrong.
    5. Download the correct SP version from Microsoft.com; do not use Windows / Microsoft Update to get it. Make sure you have the 64-bit version if that's what you have installed on your PC.
    6. Make sure that EVERYTHING that affects the OS is up to date. That includes all sorts of drivers, starting with video and audio. And if you have an Intel chipset, use the Intel Driver Utility to update those drivers; it's very quick and easy. For the video and audio drivers, some can be updated by Intel, some by utilities on the vendor web sites, and some you just have to figure out yourself. But don't be lazy here; old drivers and Windows Service Packs are a poor mix.
    7. If you have 3rd party software, check to see if there are any updates for it. They might not say that they are for the Service Pack, but you cut your risk of things not working if you do this.
    8. Shut off the antivirus software (especially if it is 3rd party).
    9. Reboot, hitting F8 to get the Safe Mode menu, and choose Safe Mode with Networking.
    10. Log into the Administrator account to ensure that you have the right to install the SP.
    11. Run the SP. It won't be very fancy this way. Maybe 45 minutes later it will reboot and then finish configuring itself, finally letting you log in.

    Total installation time on most of my PCs was about 1 hour, but that followed hours of preparation on each.

    On a separate note, I recently got on the Nvidia web site and their utility told me I had a new driver available for my GeForce 8600M GS. This laptop had come with Vista and now has Win 7 SP1. I had a big surprise from this driver update; the Windows Experience score on the graphics side went way up. Kudos to Nvidia for doing a driver update that actually helps day-to-day usage. And unlike ATI's updates (which I need for my AGP-based system), this update was fairly quick and very easy. Also, Nvidia drivers have never, as I can recall, given me BSODs, many of which I've gotten from ATI (TDR errors).

    Read the article

  • Installing Windows Management Framework 3.0 basically destroyed WMI, how can I fix it without reinstalling the O.S.?

    - by Massimo
    Related, of course, to this question. Before discovering it was somewhat... dangerous, I installed Windows Management Framework 3.0 on a number of Windows Server 2008 R2 SP1 servers, and WMI got completely trashed on all of them. This is what the WMI namespace looks like on a normal server (this is from Server Manager - Configuration - WMI Control): This is what it looks like after installing WMF 3.0: Yeah. Everything except WMF 3.0's new features is gone. Needless to say, nothing seems to work anymore on those servers. And no, this is not due to some strange installation error; this happened on three servers that were working perfectly before installing WMF 3.0, and on all of them the installation completed successfully. Admittedly, one of them had a somewhat complex setup (various System Center products and SQL Server instances)... but two of them are just plain standard domain controllers which do nothing else at all. How can I fix this mess without having to reinstall the O.S. on these servers? And why did it happen in the first place?

    Read the article

  • Cannot delete an existing service using sc command: The Specified service does not exist as an installed service

    - by Graviton
    As shown in the picture below, I want to delete MyNewService, but when I type in sc delete MyNewService I simply can't delete it, because there is no such service according to the error "The specified service does not exist as an installed service". Any ideas how to solve this problem? Edit: as far as the Services panel is concerned, MyNewService is there all the time; I have restarted the PC a few times, and it's still there.

    Read the article

  • Convert WMI CimType to System.Type

    - by Anonymous Coward
    I am trying to write a generic extension to turn a ManagementObjectCollection into a DataTable. This is just to make things easier for a startup script/program I am writing. I have run into a problem with CimType. I have included the code I have written so far below.

    public static DataTable GetData(this ManagementObjectCollection objectCollection)
    {
        DataTable table = new DataTable();
        foreach (ManagementObject obj in objectCollection)
        {
            if (table.Columns.Count == 0)
            {
                foreach (PropertyData property in obj.Properties)
                {
                    table.Columns.Add(property.Name, property.Type);
                }
            }
            DataRow row = table.NewRow();
            foreach (PropertyData property in obj.Properties)
            {
                row[property.Name] = property.Value;
            }
            table.Rows.Add(row);
        }
        return table;
    }

    I have found a method which I think will work at http://www.devcow.com/blogs/adnrg/archive/2005/09/23/108.aspx. However, it seems to me like there may be a better way, or even a .NET function I am overlooking.
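    The immediate problem is that PropertyData.Type is a System.Management.CimType value, while DataColumnCollection.Add(string, Type) expects a CLR Type, so the two have to be mapped explicitly. A small helper along these lines is one way to do it (a sketch only; the mapping is an assumption based on the documented CimType values, with string as the fallback and arrays handled via PropertyData.IsArray):

        using System;
        using System.Management;

        static class CimTypeMapperSketch
        {
            // Maps a WMI CimType to a CLR type usable as a DataTable column type.
            public static Type ToClrType(PropertyData property)
            {
                Type type;
                switch (property.Type)
                {
                    case CimType.Boolean:  type = typeof(bool);   break;
                    case CimType.Char16:   type = typeof(char);   break;
                    case CimType.DateTime: type = typeof(string); break; // DMTF datetime string
                    case CimType.Real32:   type = typeof(float);  break;
                    case CimType.Real64:   type = typeof(double); break;
                    case CimType.SInt8:    type = typeof(sbyte);  break;
                    case CimType.UInt8:    type = typeof(byte);   break;
                    case CimType.SInt16:   type = typeof(short);  break;
                    case CimType.UInt16:   type = typeof(ushort); break;
                    case CimType.SInt32:   type = typeof(int);    break;
                    case CimType.UInt32:   type = typeof(uint);   break;
                    case CimType.SInt64:   type = typeof(long);   break;
                    case CimType.UInt64:   type = typeof(ulong);  break;
                    default:               type = typeof(string); break; // String, Reference, Object, ...
                }
                // Array-valued properties become arrays of the element type.
                return property.IsArray ? type.MakeArrayType() : type;
            }
        }

    With that in place, the column line becomes table.Columns.Add(property.Name, CimTypeMapperSketch.ToClrType(property)), and null property values should be written as DBNull.Value when filling the rows.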

    Read the article

  • Queued Loadtest to remove Concurrency issues using Shared Data Service in OpenScript

    - by stefan.thieme(at)oracle.com
    Queued Processing to remove Concurrency issues in Loadtest ScriptsSome scripts act on information returned by the server, e.g. act on first item in the returned list of pending tasks/actions. This may lead to concurrency issues if the virtual users simulated in a load test scenario are not synchronized in some way.As the load test cases should be carried out in a comparable and straight forward manner simply cancel a transaction in case a collision occurs is clearly not an option. In case you increase the number of virtual users this approach would lead to a high number of requests for the early steps in your transaction (e.g. login, retrieve list of action points, assign an action point to the virtual user) but later steps would be rarely visited successfully or at all, depending on the application logic.A way to tackle this problem is to enqueue the virtual users in a Shared Data Service queue. Only the first virtual user in this queue will be allowed to carry out the critical steps (retrieve list of action points, assign an action point to the virtual user) in your transaction at any one time.Once a virtual user has passed the critical path it will dequeue himself from the head of the queue and continue with his actions. This does theoretically allow virtual users to run in parallel all steps of the transaction which are not part of the critical path.In practice it has been seen this is rarely the case, though it does not allow adding more than N users to perform a transaction without causing delays due to virtual users waiting in the queue. N being the time of the total transaction divided by the sum of the time of all critical steps in this transaction.While this problem can be circumvented by allowing multiple queues to act on individual segments of the list of actions, e.g. per country filter, ends with 0..9 filter, etc.This would require additional handling of these additional queues of slots for the virtual users at the head of the queue in order to maintain the mutually exclusive access to the first element in the list returned by the server at any one time of the load test. Such an improved handling of multiple queues and/or multiple slots is above the subject of this paper.Shared Data Services Pre-RequisitesStart WebLogic Server to host Shared Data ServicesYou will have to make sure that your WebLogic server is installed and started. Shared Data Services may not work if you installed only the minimal installation package for OpenScript. If however you installed the default package including OLT and OTM, you may follow the instructions below to start and verify WebLogic installation.To start the WebLogic Server deployed underneath of Oracle Load Testing and/or Oracle Test Manager you can go to your Start menu, Oracle Application Testing Suite and select the Restart Oracle Application Testing Suite Application Service entry from the Tools submenu.To verify the service has been started you can run the Microsoft Management Console for Services by Selecting Run from the Start Menu and entering services.msc. Look for the entry that reads Oracle Application Testing Suite Application Service, once it has changed it status from Starting to Started you can proceed to verify the login. Please note that this may take several minutes, I would say up to 10 minutes depending on the strength of your CPU horse-power.Verify WebLogic Server user credentialsYou will have to make sure that your WebLogic Server is installed and started. 
Next open the Oracle WebLogic Server Adminstration Console on http://localhost:8088/console.It may take a while until the application is deployed and started. It may display the following until the Administration Console has been deployed on the fly.Afterwards you can login using the username oats and the password that you selected during install time for your Application Testing Suite administrative purposes.This will bring up the Home page of you WebLogic Server. You have actually verified that you are able to login with these credentials already. However if you want to check the details, navigate to Security Realms, myrealm, Users and Groups tab.Here you could add users to your WebLogic Server which could be used in the later steps. Details on the Groups required for such a custom user to work are exceeding this quick overview and have to be selected with the WebLogic Server Adminstration Guide in mind.Shared Data Services pre-requisites for Load testingOpenScript Preferences have to be set to enable Encryption and provide a default Shared Data Service Connection for Playback.These are pre-requisites you want to use for load testing with Shared Data Services.Please note that the usage of the Connection Parameters (individual directive in the script) for Shared Data Services did not playback reliably in the current version 9.20.0370 of Oracle Load Testing (OLT) and encryption of credentials still seemed to be mandatory as well.General Encryption settingsSelect OpenScript Preferences from the View menu and navigate to the General, Encryption entry in the tree on the left. Select the Encrypt script data option from the list and enter the same password that you used for securing your WebLogic Server Administration Console.Enable global shared data access credentialsSelect OpenScript Preferences from the View menu and navigate to the Playback, Shared Data entry in the tree on the left. Enable the global shared data access credentials and enter the Address, User name and Password determined for your WebLogic Server to host Shared Data Services.Please note, that you may want to replace the localhost in Address with the hosts realname in case you plan to run load tests with Loadtest Agents running on remote systems.Queued Processing of TransactionsEnable Shared Data Services Module in Script PropertiesThe Shared Data Services Module has to be enabled for each Script that wants to employ the Shared Data Service Queue functionality in OpenScript. It can be enabled under the Script menu selecting Script Properties. On the Script Properties Dialog select the Modules section and check Shared Data to enable Shared Data Service Module for your script. 
    Checking the Shared Data Services option effectively adds a line to your script code that injects the sharedData ScriptService into your script class of IteratingVUserScript:

        @ScriptService oracle.oats.scripting.modules.sharedData.api.SharedDataService sharedData;

    Record your script

    Record your script as usual and then add the following pieces of queue handling: in the Initialize code block, before the first step and after the last step of your critical path, and in the Finalize code block. The Java code to be added at the individual locations is explained in the following sections in full detail.

    Create a Shared Data Queue in Initialize

    To create a Shared Data Queue go to the Java view of your script and add the following statements to the initialize() code block:

        info("Create queueA with life time of 120 minutes");
        sharedData.createQueue("queueA", 120);

    This creates an instance of the Shared Data Queue object named queueA which is maintained for up to 120 minutes. If you want to use the code in multiple scripts, make sure to use a different queue name for each of them here and in the subsequent steps. You may even consider using a dynamic queue name based on the filters of the result list being concurrently accessed.

    Prepare a unique id for each Iteration

    In order to keep track of the individual virtual users in our queue we need to create a unique identifier from the virtual user id and the username in use, right after retrieving the next record from our databank file:

        getDatabank("Usernames").getNextDatabankRecord();
        getVariables().set("usernameValue1",
            "VU_{{@vuid}}_{{@iterationnum}}_{{db.Usernames.Username}}_{{@timestamp}}_{{@random(10000)}}");
        String usernameValue = getVariables().get("usernameValue1");
        info("Now running virtual user " + usernameValue);

    As you can see from the code block above, the OpenScript variable usernameValue1 is set to VU_{{@vuid}}_{{@iterationnum}}_{{db.Usernames.Username}}_{{@timestamp}}_{{@random(10000)}}, a concatenation of the virtual user id and the iteration number for general uniqueness, plus the username from our databank, the timestamp and a random number to make it further unique and to ease the spotting of errors. Not all of these fields are actually required to make it truly unique; adding the queue name may also be considered to help troubleshoot multiple queues. The value is then retrieved with the getVariables().get() method call and assigned to the usernameValue String used throughout the script.

    Please note that moving the getDatabank("Usernames").getNextDatabankRecord(); call to the initialize block was later chosen to remove the concurrency of multiple virtual users running with the same userid and therefore accessing the same "My Inbox" in step 6. This effectively gives each virtual user its own userid from the databank file; make sure you have enough userids to remove this second hurdle.

    Enqueue and attend Queue before Critical Path

    To maintain the right order of virtual users being allowed into the critical path of the transaction, the following pseudo step has to be added in front of the first critical step.
    In the case of this example that is right in front of the step where we retrieve the list of actions from which we select the first one to be assigned to us.

        beginStep("[0] Waiting in the Queue", 0);
        {
            info("Enqueued virtual user " + usernameValue + " at the end of queueA");
            sharedData.offerLast("queueA", usernameValue);
            info("Wait until the user is the first in queueA");
            String queueValue1 = null;
            do {
                // we wait for at least 0.7 seconds before we check the head of the
                // queue. This is the time it takes one user to move through the
                // critical path, i.e. pass steps [5] Enter country and [6] Assign to me
                Thread.sleep(700);
                queueValue1 = (String) sharedData.peekFirst("queueA");
                info("The first user in queueA is currently: '" + queueValue1 + "' "
                    + queueValue1.getClass() + " length " + queueValue1.length());
                info("The current user is '" + usernameValue + "' " + usernameValue.getClass()
                    + " length " + usernameValue.length()
                    + ": indexOf " + usernameValue.indexOf(queueValue1)
                    + " equals " + usernameValue.equals(queueValue1));
            } while ( queueValue1.indexOf(usernameValue) < 0 );
            info("Now the user is the first in queueA");
        }
        endStep();

    This enqueues the username at the tail of our queue. It then waits for at least 700 milliseconds, the time it takes one user to exit the critical path, and compares the head of the queue with its own username. This last step is repeated while the two are not equal (indexOf less than zero). Once they are equal, indexOf yields a value of zero or larger and we proceed with the critical steps.

    Dequeue after Critical Path

    After the virtual user has left the critical path and completed its last step, the following code block needs to dequeue it. In our example this is right after the action has actually been assigned to the virtual user. This allows the next virtual user to retrieve the list of actions still available and in turn make its own selection/assignment.

        info("Get and remove the current user from the head of queueA");
        String pollValue1 = (String) sharedData.pollFirst("queueA");

    The current user is removed from the head of the queue. The next one will now be able to match its username against the head of the queue.

    Clear and Destroy Queue for Finish

    When the script has completed, it should clear and destroy the queue. This code block can be put in the finish block of your script and/or in a separate script, in order to clear and remove the queue in case you have spotted an error or want to reset the queue for some reason.

        info("Clear queueA");
        sharedData.clearQueue("queueA");
        info("Destroy queueA");
        sharedData.destroyQueue("queueA");

    The users waiting in queueA are cleared and the queue is destroyed; any scripts still executing will be caught in a loop. I found it better to maintain a separate Reset Queue script which contains only the following code in its initialize() block. I call this script to make sure the queue is cleared between multiple load test runs. It could even be added as the first script in a larger scenario, executing only once at the very start of the load test to make sure the queues do not contain any stale entries.

        info("Create queueA with life time of 120 minutes");
        sharedData.createQueue("queueA", 120);
        info("Clear queueA");
        sharedData.clearQueue("queueA");

    This will create a Shared Data Queue instance of queueA and clear all entries from this queue.
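    To see how the individual snippets fit together, here is a rough sketch of a complete queued script. It is only an orientation aid, not the code OpenScript generates: the class scaffold and method signatures are simplified, the fully qualified base class name is omitted, and the recorded HTTP steps of the transaction are left out.

        // Sketch only: OpenScript generates the real class when you record a script.
        public class QueuedTransactionScript extends IteratingVUserScript {

            @ScriptService oracle.oats.scripting.modules.sharedData.api.SharedDataService sharedData;

            public void initialize() throws Exception {
                sharedData.createQueue("queueA", 120);            // shared queue, kept for 120 minutes
                getDatabank("Usernames").getNextDatabankRecord(); // one databank userid per virtual user
            }

            public void run() throws Exception {
                getVariables().set("usernameValue1",
                    "VU_{{@vuid}}_{{@iterationnum}}_{{db.Usernames.Username}}_{{@timestamp}}_{{@random(10000)}}");
                String usernameValue = getVariables().get("usernameValue1");

                // ... recorded steps before the critical path (login, navigation, ...) ...

                beginStep("[0] Waiting in the Queue", 0);
                {
                    sharedData.offerLast("queueA", usernameValue);
                    String head;
                    do {
                        Thread.sleep(700);                        // roughly one pass through the critical path
                        head = (String) sharedData.peekFirst("queueA");
                    } while (head == null || head.indexOf(usernameValue) < 0);
                }
                endStep();

                // ... critical steps: retrieve the list of actions, assign the first one ...

                sharedData.pollFirst("queueA");                   // leave the queue so the next user may enter

                // ... remaining, non-critical steps ...
            }

            public void finish() throws Exception {
                // alternatively kept in a separate Reset Queue script, as described above
                sharedData.clearQueue("queueA");
                sharedData.destroyQueue("queueA");
            }
        }

    The 700 ms pause and the queue name are simply the values used throughout this example; adjust them to the length of your own critical path.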
    Monitoring Queue

    While creating the scripts it was useful to monitor the contents of the queue, i.e. the current first user. The following code block makes sure the Shared Data Queue is accessible in the initialize() block:

        info("Create queueA with life time of 120 minutes");
        sharedData.createQueue("queueA", 120);

    In the run() block the following code continuously monitors the first element of the queue and writes an informational message with the current username value to the Result window:

        info("Monitor the first users in queueA");
        String queueValue1 = null;
        do {
            queueValue1 = (String) sharedData.peekFirst("queueA");
            if (queueValue1 != null)
                info("The first user in queueA is currently: '" + queueValue1 + "' "
                    + queueValue1.getClass() + " length " + queueValue1.length());
        } while ( true );

    This script can be run from OpenScript in parallel to a load test performed by Oracle Load Testing. It is, however, not recommended to run it in a production load test, as the performance impact is unknown. Accessing the queue's head with the peekFirst() method has been reported with about 2 seconds response time by both OpenScript and OLT. It is advised to log a Service Request to see whether this could be lowered in future releases of the Application Testing Suite, as pollFirst() and even offerLast(), which writes to the tail of the queue, usually returned after 0.1 seconds on average.

    Debugging Queue

    While debugging the scripts the following was useful to remove single entries from the head of the queue, i.e. the current first user. The following code block makes sure the Shared Data Queue is accessible in the initialize() block:

        info("Create queueA with life time of 120 minutes");
        sharedData.createQueue("queueA", 120);

    In the run() block the following code removes the first element of the queue and writes an informational message with the current username value to the Result window:

        info("Get and remove the current user from the head of queueA");
        String pollValue1 = (String) sharedData.pollFirst("queueA");
        info("The first user in queueA was currently: '" + pollValue1 + "' "
            + pollValue1.getClass() + " length " + pollValue1.length());

    References

    Oracle Functional Testing OpenScript User's Guide Version 9.20 [E15488-05], Chapter 17 "Using the Shared Data Module", http://download.oracle.com/otn/nt/apptesting/oats-docs-9.21.0030.zip

    Oracle Fusion Middleware Oracle WebLogic Server Administration Console Online Help 11g Release 1 (10.3.4) [E13952-04], "Manage users and groups", http://download.oracle.com/docs/cd/E17904_01/apirefs.1111/e13952/taskhelp/security/ManageUsersAndGroups.htm

    Read the article

  • Oracle Coherence & Oracle Service Bus: REST API Integration

    - by Nino Guarnacci
    This post aims to highlight one of the features of Oracle Coherence that allows it to be easily added to and integrated into a wide variety of projects: the REST API exposed by Coherence nodes, through which you can interact with the in-memory data grid in a very broad way.

    Oracle Coherence and Oracle Service Bus are natively integrated through a feature of the Oracle Service Bus that lets you use the Coherence grid cache while configuring a business service. This feature provides an intermediate cache layer from which the answers of previous invocations of the same service can be retrieved, without necessarily having to invoke the real business service again. Directly from the Oracle Service Bus web console you can define the eviction policies for the cached objects/answers and the discriminating parameters that identify their uniqueness.

    The Coherence REST APIs, however, allow you to integrate both products for other needs, enabling new architectural designs. Consider a Coherence node as a simple service that interoperates through standard protocols, in particular REST (with JSON and XML); think of Coherence as a company-wide shared service, a centralized "map and reduce" implementation that can be reached over a huge variety of protocols (transports and envelopes). This is a real step forward for those who still think in terms of connectors and custom code: this type of integration requires neither custom code nor a complex implementation to be self-supporting. The added value comes from the considerable value of each product on its own, and even more from their simple and robust integration. As already mentioned, this scenario opens a hidden door behind these two products, one that leads to new ideas and perspectives for enterprise architectures that increasingly look towards next-generation applications: simple and dynamic, perhaps oriented to mobile and Web 2.0.

    Below is a small and simple demo that shows how easy it is to integrate these two products using the Coherence REST API, and that is also intended to help you imagine new enterprise architectures based on this approach. The idea is to create a centralized alerting system that can easily be fed from any of the company's applications, regardless of the technology they were built with, and to expose it through a standard representation protocol, RSS, via a service published on the service bus. This way you can browse and search only the alerts you are interested in, by category, author, title, date, and so on.

    The steps needed to implement this system are few and simple. They are listed below and described so that they can easily be replicated in your environment. Keep in mind that the demo is only meant to demonstrate how easy it is to integrate Oracle Coherence and Oracle Service Bus, and to stimulate your imagination towards new technological approaches.

    1) Install the two products. In this demo the following were used (if necessary, consult the installation guides of the two products):
    - Oracle Service Bus ver. 11.1.1.5.0  http://www.oracle.com/technetwork/middleware/service-bus/downloads/index.html
    - Oracle Coherence ver. 3.7.1  http://www.oracle.com/technetwork/middleware/coherence/downloads/index.html

    2) Since we want to create a centralized alerting system, we need to define a structure containing the alerting attributes used to preserve and organize the information of the various alerts sent by the different applications. To that end a Java class named Alert was built, containing the canonical properties of an alert:
    - Title
    - Description
    - System
    - Time
    - Severity
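    The class itself is only shown as a screenshot in the original post. As a rough idea of what such a bean could look like, here is a minimal sketch: the property names come from the list above, the package name matches the value type oracle.cohsb.Alert configured for the cache in the next step, and everything else (types, annotations, accessors) is an assumption.

        package oracle.cohsb;   // assumed from the configured value type oracle.cohsb.Alert

        import java.io.Serializable;
        import java.util.Date;
        import javax.xml.bind.annotation.XmlRootElement;

        // Hypothetical reconstruction of the Alert bean: a plain serializable class with
        // getters/setters so the Coherence REST layer can marshal it to and from XML/JSON.
        // The @XmlRootElement annotation is assumed to be needed for the XML representation;
        // check the Coherence REST documentation for the exact marshalling requirements.
        @XmlRootElement
        public class Alert implements Serializable {

            private String title;
            private String description;
            private String system;
            private Date time;
            private String severity;

            public Alert() { }

            public String getTitle() { return title; }
            public void setTitle(String title) { this.title = title; }

            public String getDescription() { return description; }
            public void setDescription(String description) { this.description = description; }

            public String getSystem() { return system; }
            public void setSystem(String system) { this.system = system; }

            public Date getTime() { return time; }
            public void setTime(Date time) { this.time = time; }

            public String getSeverity() { return severity; }
            public void setSeverity(String severity) { this.severity = severity; }
        }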
    3) Next, we need to create two configuration files for the Coherence node, so that Alert objects can be stored in the grid over the REST/HTTP protocol (in addition to the native APIs for Java, C++ and .NET). The two minimal configuration files for Coherence are coherence-rest-config.xml and resty-server-config.xml (their contents are shown as images in the original post). This minimal configuration provides a distributed cache named "alerts" that can also be accessed via HTTP/REST on host "localhost", port "8080", with objects of type "oracle.cohsb.Alert".

    4) The post then shows the simple Java class that represents the alert messages; the listing appears only as an image in the original post.

    5) At this point we just need to start our Coherence node so that it listens on HTTP, manages the "alerts" cache and receives incoming XML or JSON objects of type Alert. Remember to include in the classpath of the Coherence node the Alert Java class as well as the required Coherence libraries and configuration files. Then run the Coherence node class com.tangosol.net.DefaultCacheServer with the following parameters:

        -Dtangosol.coherence.log.level=9
        -Dtangosol.coherence.log=stdout
        -Dtangosol.coherence.cacheconfig=[PATH_TO_THE_FILE]\resty-server-config.xml

    6) Let's create a small procedure to test our Coherence configuration and insert some custom alerts into the cache. The technology used for this is largely irrelevant (JavaScript, Python, Ruby, Scala, C++, Java, ...), because the protocol used to communicate with Coherence is simply HTTP with JSON or XML. For this little demo I chose Java, with a method to send/put an alert into the cache, a method to query and view the content of the cache, and a main method that executes them both (the listings are shown as images in the original post). No special library is added to the classpath for this class (the JSON structure is defined statically); when executed, it asks for some information such as title and description in order to compose and send an alert to the cache, and then performs an inquiry against the same cache. A good exercise at this point would be to create the same procedure using other technologies, such as a simple HTML page containing some JavaScript, or Python, Ruby, and so on.
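    Since the post's own test client is available only as screenshots, here is a stand-in sketch that uses nothing but JDK classes. The cache URL comes from the configuration above; the per-key PUT path (/alerts/{key}) and the JSON field names (which mirror the hypothetical bean sketched earlier) are assumptions to be checked against the Coherence REST documentation.

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        // Rough test client: pushes one Alert as JSON into the "alerts" cache and then
        // reads the cache back, matching the post's remark that no extra libraries are needed.
        public class AlertRestClient {

            private static final String CACHE_URL = "http://localhost:8080/alerts";

            // PUT an alert under the given key; the time property is omitted for brevity,
            // as its marshalling format depends on the REST configuration.
            static void putAlert(String key, String title, String description,
                                 String system, String severity) throws Exception {
                String json = "{\"title\":\"" + title + "\",\"description\":\"" + description
                        + "\",\"system\":\"" + system + "\",\"severity\":\"" + severity + "\"}";
                HttpURLConnection conn = (HttpURLConnection) new URL(CACHE_URL + "/" + key).openConnection();
                conn.setDoOutput(true);
                conn.setRequestMethod("PUT");
                conn.setRequestProperty("Content-Type", "application/json");
                OutputStream out = conn.getOutputStream();
                out.write(json.getBytes("UTF-8"));
                out.close();
                System.out.println("PUT " + key + " -> HTTP " + conn.getResponseCode());
            }

            // GET the cache contents as JSON and dump them to the console,
            // much like browsing http://localhost:8080/alerts/ later in the post.
            static void listAlerts() throws Exception {
                HttpURLConnection conn = (HttpURLConnection) new URL(CACHE_URL).openConnection();
                conn.setRequestProperty("Accept", "application/json");
                BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
                for (String line; (line = in.readLine()) != null; ) {
                    System.out.println(line);
                }
                in.close();
            }

            public static void main(String[] args) throws Exception {
                putAlert("alert-1", "Disk almost full", "Only 2% left on /data", "Backup server", "Warning");
                listAlerts();
            }
        }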
    7) Now we are ready to start configuring the Oracle Service Bus to integrate the two products. First we integrate the internal alerting system of Oracle Service Bus with our centralized alerting system based on the Coherence node. This ensures that, from monitoring or directly from within our proxy message flows, we can raise alerts and save them straight into the Coherence node. To do this I chose JMS, which is natively available in Oracle WebLogic / Service Bus. Access the Oracle WebLogic Administration Console and create and configure a new JMS connection factory and a new JMS destination (queue). Then create a new resource of type "alert destination" within the Oracle Service Bus project, configured to use the newly created JMS connection factory and JMS destination.

    Finally, in order to pick up the alert messages enqueued on our JMS destination and send them to our Coherence node, we just need to create a new business service and a new proxy service within our Oracle Service Bus project. The business service is responsible for sending a message to our Coherence REST service, using PUT as the HTTP method. The proxy service has to collect all messages enqueued on the destination and execute an XQuery transformation on them, translating them into valid XML Alert objects that can be sent to our Coherence service through the newly created business service (the message flow pipeline containing the XQuery transformation is shown in the original post).

    Remarkably, we have just completed a basic first integration between the native alerting system of Oracle Service Bus and our centralized alerting system simply by configuring our Coherence node, without developing anything. It's time to test it out. To do this I created a proxy service that generates an alert through our "alert destination" whenever it is invoked. After a few invocations of this proxy, which generate fake alerts, we can open a browser at http://localhost:8080/alerts/ and see what has been inserted into the Coherence node.

    8) We are ready for the final step: a new message flow that can be used to search and display the results in a standard way. For this I chose RSS as the representation, so that the formatted result can be displayed on a huge variety of devices, such as feed readers for the iPhone and Android. The query can be defined at request time, so that only the feed items we are interested in are returned. We need a new business service, a new proxy service, and finally a new XQuery transformation that translates the collection of alerts returned by our Coherence node into a nicely formatted, standard RSS document.

    We start with the XQuery resource, whose task is to transform a collection of alerts (XML) returned by the Coherence node into a well-formatted RSS 2.0 feed; then the new business service, which searches the alerts on our Coherence node using the REST API; and finally our last resource, the proxy service, exposed as an RSS feed to mobile devices and traditional web readers, which intercepts any search query and transforms the result returned by the business service into an RSS 2.0 feed (the message flow with the transformation phase, Alert TO Feed Items, is shown in the original post).

    A few small tricks apply to the routing towards the business service: check for any query present in the URL in order to request only a subset of the alerts, and use the HTTP "Accept" header to obtain an XML answer instead of JSON. In our little demo we also statically added some Coherence parameters to the request, sort=time:desc;start=0;count=100, so that Coherence returns the results sorted by date, starting from the first one up to a maximum of 100. Done! Our centralized alerting system is ready.
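    For a quick check of the finished feed without a feed reader, a small command-line client can be handy. This is only a sketch: it assumes the /alarms proxy URL and the CohQL-style filter used in the example queries below.

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.net.URL;
        import java.net.URLEncoder;

        // Fetches the published RSS feed from the OSB proxy, filtered by a CohQL-style
        // query, and prints the raw XML to the console.
        public class FeedCheck {
            public static void main(String[] args) throws Exception {
                String query = URLEncoder.encode("system = 'Web Browser' AND severity = 'Fatal'", "UTF-8");
                URL feed = new URL("http://localhost:7001/alarms?q=" + query);
                BufferedReader in = new BufferedReader(new InputStreamReader(feed.openStream(), "UTF-8"));
                for (String line; (line = in.readLine()) != null; ) {
                    System.out.println(line);
                }
                in.close();
            }
        }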
    The solution inherits all the qualities and capabilities of the two products involved, Oracle Coherence and Oracle Service Bus: RASP (Reliability, Availability, Scalability, Performance). Now try your mobile device, or a normal Internet browser, against the RSS feed just published. Some URLs you may test:

    Search for the last 100 alerts:
    http://localhost:7001/alarms

    Search for alerts whose time property is set (time is not null):
    http://localhost:7001/alarms?q=time+is+not+null

    Search for alerts whose system property is "Web Browser" (system = 'Web Browser'):
    http://localhost:7001/alarms?q=system+%3D+%27Web+Browser%27

    Search for alerts whose system property is "Web Browser", severity is "Fatal" and title contains the word "Javascript" (system = 'Web Browser' and severity = 'Fatal' and title like '%Javascript%'):
    http://localhost:8080/alerts?q=system+%3D+%27Web+Browser%27+AND+severity+%3D+%27Fatal%27+AND+title+LIKE+%27%25Javascript%25%27

    To compose more complex queries for your own needs, read the chapter of the Coherence documentation on CohQL, the Coherence Query Language: http://download.oracle.com/docs/cd/E24290_01/coh.371/e22837/api_cq.htm

    Some useful links:
    - Oracle Coherence REST API Documentation  http://download.oracle.com/docs/cd/E24290_01/coh.371/e22839/rest_intro.htm
    - Oracle Service Bus Documentation  http://download.oracle.com/docs/cd/E21764_01/soa.htm#osb
    - REST explained on Wikipedia  http://en.wikipedia.org/wiki/Representational_state_transfer

    The complete materials of this demo can be downloaded from http://blogs.oracle.com/slc/resource/cosb/coh-sb-demo.zip

    Author: Nino Guarnacci.

    Read the article
