Search Results

Search found 7570 results on 303 pages for 'ms crm 4'.


  • Access database: need to prevent approving overlapping OT (second try with a modified request; not a programmer) [on hold]

    - by user2512764
    Employees sign up on the company website for advance overtime. The Access table already holds the overtime signups, so the user does not enter the times; they only add a location to mark a line as approved. The table has the fields Employee Name, Date, Start Time, End Time, and Location, and every field except Location already has data. In the database I have created a form based on this table; since the table already has most of the information, the user only has to fill in Location on the form in order to approve an overtime line. For example: the user approves overtime for employee 'John' that runs on 7/1/2013 from 0400-0800, and the location is added successfully. Later the user tries to add a location for John again on a line that runs on 7/1/2013 from 0600-0900. Again, the start time, end time, and date are already in the table; only the location is being entered as the approval. As soon as the user enters the location for John in the form field, the program needs to check the employee name, date, and time against previously approved lines (those where a location was already added); because this line conflicts with one that was already approved, the location just entered needs to be deleted and the form should move to the next record. I hope I have explained it in an understandable format. Thank you.
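    For illustration only, here is a minimal sketch of the overlap check being described: two lines for the same employee conflict when one starts before the other ends. The type and field names below are hypothetical (they mirror the fields listed in the post), and the same condition can just as well be written as a VBA validation routine or an Access query.

      using System;
      using System.Collections.Generic;
      using System.Linq;

      class OvertimeLine
      {
          // Hypothetical names mirroring the table fields described in the post.
          public string Employee;
          public DateTime Start;
          public DateTime End;
          public string Location;   // empty until the line is approved
      }

      static class OvertimeRules
      {
          // True when the candidate line overlaps an already-approved line for the same employee:
          // classic interval test, A.Start < B.End and B.Start < A.End.
          public static bool ConflictsWithApproved(OvertimeLine candidate, IEnumerable<OvertimeLine> existing)
          {
              return existing.Any(o =>
                  o.Employee == candidate.Employee &&
                  !string.IsNullOrEmpty(o.Location) &&
                  candidate.Start < o.End &&
                  o.Start < candidate.End);
          }
      }

    If ConflictsWithApproved returns true for the line being approved, the form logic would clear the Location just entered and move on to the next record, which is the behavior the post asks for.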

    Read the article

  • Multiple &(AND) fails in query

    - by N e w B e e
    Here is my query: $sql = 'SELECT * FROM Orders INNER JOIN [Order Details] ON Orders.OrderNumber = [Order Details].OrderNumber WHERE Orders.CartID =2 AND [Order Details].Option10 Is Null AND [Order Details].Status="Shipped"'; When entered in the MS Access SQL view, this query returns the correct results, but when I copy and paste the same query into my PHP script it fails with the error "Too few parameters, expected 1", even though the data is there and the query works in Access. Please note that if I omit one AND condition, it works; e.g. if I remove the Shipped condition or the Is Null condition, it then works. Any hint? What's wrong with it? Any help? Thanks.
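    A frequent cause of "Too few parameters" when Jet/ACE SQL is run from outside Access is that a double-quoted literal such as "Shipped" is not read as a string by the ODBC/OLE DB layer and is treated as an unknown (i.e. missing) parameter instead; this is offered here as a likely explanation, not a confirmed diagnosis. As a hedged illustration, the same query with the literal single-quoted, shown from C#/OLE DB rather than PHP and with a hypothetical database path:

      using System;
      using System.Data.OleDb;

      class OrdersQuery
      {
          static void Main()
          {
              // Hypothetical connection string; adjust the provider and path for your .mdb/.accdb file.
              var connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\shop.accdb";
              var sql = "SELECT * FROM Orders " +
                        "INNER JOIN [Order Details] ON Orders.OrderNumber = [Order Details].OrderNumber " +
                        "WHERE Orders.CartID = 2 " +
                        "AND [Order Details].Option10 IS NULL " +
                        "AND [Order Details].Status = 'Shipped'";   // single quotes for the string literal

              using (var conn = new OleDbConnection(connStr))
              using (var cmd = new OleDbCommand(sql, conn))
              {
                  conn.Open();
                  using (var reader = cmd.ExecuteReader())
                  {
                      while (reader.Read())
                          Console.WriteLine(reader["OrderNumber"]);
                  }
              }
          }
      }

    The same single-quote change can be applied directly to the $sql string in the PHP script.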

    Read the article

  • Docmd.TransferText to update data

    - by every_answer_gets_a_point
    I am using DoCmd.TransferText to import data from a text file into my Access table. I would like it to do the following: if the record already exists, update it; if the record does not exist, add it. How do I accomplish this? Currently I have this line: DoCmd.TransferText acImportDelim, yesyes, "table3", "C:\requisition_data_dump.txt", True
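    DoCmd.TransferText itself only appends rows; the usual pattern for "update if it exists, insert if it is new" is to import the file into a staging table first and then run an UPDATE joined to the staging table, followed by an INSERT of the unmatched rows. Below is a hedged sketch of that second step with hypothetical table and column names (table3_staging, ReqID, Qty, Status); it is shown in C# over OLE DB, though the same two SQL statements can be run from Access VBA with CurrentDb.Execute after the TransferText import.

      using System.Data.OleDb;

      static class RequisitionUpsert
      {
          // Assumes the text file was already imported into a staging table named table3_staging.
          public static void Upsert(OleDbConnection conn)
          {
              // 1) Update rows that already exist, matched on a hypothetical key column ReqID.
              new OleDbCommand(
                  "UPDATE table3 INNER JOIN table3_staging s ON table3.ReqID = s.ReqID " +
                  "SET table3.Qty = s.Qty, table3.Status = s.Status", conn).ExecuteNonQuery();

              // 2) Insert rows whose key has no match in table3 yet.
              new OleDbCommand(
                  "INSERT INTO table3 (ReqID, Qty, Status) " +
                  "SELECT s.ReqID, s.Qty, s.Status FROM table3_staging s " +
                  "LEFT JOIN table3 t ON s.ReqID = t.ReqID WHERE t.ReqID IS NULL", conn).ExecuteNonQuery();
          }
      }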

    Read the article

  • treating parameter as literal

    - by I__
    DoCmd.TransferText acImportDelim, Import-Accounts, "tableImport", _ "C:\Documents and Settings\accounts.txt", True The second parameter, Import-Accounts, is the actual name of the saved import specification. Supposedly it does NOT need to be in quotes; however, in this case, since it contains a -, it is treated as if I were doing an operation (a subtraction). Is there a way I can force it to be treated literally instead of as an operation?

    Read the article

  • Diagnosing packet loss / high latency in Ubuntu

    - by Sam Gammon
    We have a Linux box (Ubuntu 12.04) running Nginx (1.5.2), which acts as a reverse proxy/load balancer to some Tornado and Apache hosts. The upstream servers are physically and logically close (same DC, sometimes same-rack) and show sub-millisecond latency between them: PING appserver (10.xx.xx.112) 56(84) bytes of data. 64 bytes from appserver (10.xx.xx.112): icmp_req=1 ttl=64 time=0.180 ms 64 bytes from appserver (10.xx.xx.112): icmp_req=2 ttl=64 time=0.165 ms 64 bytes from appserver (10.xx.xx.112): icmp_req=3 ttl=64 time=0.153 ms We receive a sustained load of about 500 requests per second, and are currently seeing regular packet loss / latency spikes from the Internet, even from basic pings: sam@AM-KEEN ~> ping -c 1000 loadbalancer PING 50.xx.xx.16 (50.xx.xx.16): 56 data bytes 64 bytes from loadbalancer: icmp_seq=0 ttl=56 time=11.624 ms 64 bytes from loadbalancer: icmp_seq=1 ttl=56 time=10.494 ms ... many packets later ... Request timeout for icmp_seq 2 64 bytes from loadbalancer: icmp_seq=2 ttl=56 time=1536.516 ms 64 bytes from loadbalancer: icmp_seq=3 ttl=56 time=536.907 ms 64 bytes from loadbalancer: icmp_seq=4 ttl=56 time=9.389 ms ... many packets later ... Request timeout for icmp_seq 919 64 bytes from loadbalancer: icmp_seq=918 ttl=56 time=2932.571 ms 64 bytes from loadbalancer: icmp_seq=919 ttl=56 time=1932.174 ms 64 bytes from loadbalancer: icmp_seq=920 ttl=56 time=932.018 ms 64 bytes from loadbalancer: icmp_seq=921 ttl=56 time=6.157 ms --- 50.xx.xx.16 ping statistics --- 1000 packets transmitted, 997 packets received, 0.3% packet loss round-trip min/avg/max/stddev = 5.119/52.712/2932.571/224.629 ms The pattern is always the same: things operate fine for a while (<20ms), then a ping drops completely, then three or four high-latency pings (1000ms), then it settles down again. Traffic comes in through a bonded public interface (we will call it bond0) configured as such: bond0 Link encap:Ethernet HWaddr 00:xx:xx:xx:xx:5d inet addr:50.xx.xx.16 Bcast:50.xx.xx.31 Mask:255.255.255.224 inet6 addr: <ipv6 address> Scope:Global inet6 addr: <ipv6 address> Scope:Link UP BROADCAST RUNNING MASTER MULTICAST MTU:1500 Metric:1 RX packets:527181270 errors:1 dropped:4 overruns:0 frame:1 TX packets:413335045 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:240016223540 (240.0 GB) TX bytes:104301759647 (104.3 GB) Requests are then submitted via HTTP to upstream servers on the private network (we can call it bond1), which is configured like so: bond1 Link encap:Ethernet HWaddr 00:xx:xx:xx:xx:5c inet addr:10.xx.xx.70 Bcast:10.xx.xx.127 Mask:255.255.255.192 inet6 addr: <ipv6 address> Scope:Link UP BROADCAST RUNNING MASTER MULTICAST MTU:1500 Metric:1 RX packets:430293342 errors:1 dropped:2 overruns:0 frame:1 TX packets:466983986 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:77714410892 (77.7 GB) TX bytes:227349392334 (227.3 GB) Output of uname -a: Linux <hostname> 3.5.0-42-generic #65~precise1-Ubuntu SMP Wed Oct 2 20:57:18 UTC 2013 x86_64 GNU/Linux We have customized sysctl.conf in an attempt to fix the problem, with no success. 
Output of /etc/sysctl.conf (with irrelevant configs omitted): # net: core net.core.netdev_max_backlog = 10000 # net: ipv4 stack net.ipv4.tcp_ecn = 2 net.ipv4.tcp_sack = 1 net.ipv4.tcp_fack = 1 net.ipv4.tcp_tw_reuse = 1 net.ipv4.tcp_tw_recycle = 0 net.ipv4.tcp_timestamps = 1 net.ipv4.tcp_window_scaling = 1 net.ipv4.tcp_no_metrics_save = 1 net.ipv4.tcp_max_syn_backlog = 10000 net.ipv4.tcp_congestion_control = cubic net.ipv4.ip_local_port_range = 8000 65535 net.ipv4.tcp_syncookies = 1 net.ipv4.tcp_synack_retries = 2 net.ipv4.tcp_thin_dupack = 1 net.ipv4.tcp_thin_linear_timeouts = 1 net.netfilter.nf_conntrack_max = 99999999 net.netfilter.nf_conntrack_tcp_timeout_established = 300 Output of dmesg -d, with non-ICMP UFW messages suppressed: [508315.349295 < 19.852453>] [UFW BLOCK] IN=bond1 OUT= MAC=<mac addresses> SRC=118.xx.xx.143 DST=50.xx.xx.16 LEN=68 TOS=0x00 PREC=0x00 TTL=51 ID=43221 PROTO=ICMP TYPE=3 CODE=1 [SRC=50.xx.xx.16 DST=118.xx.xx.143 LEN=40 TOS=0x00 PREC=0x00 TTL=249 ID=10220 DF PROTO=TCP SPT=80 DPT=53817 WINDOW=8190 RES=0x00 ACK FIN URGP=0 ] [517787.732242 < 0.443127>] Peer 190.xx.xx.131:59705/80 unexpectedly shrunk window 1155488866:1155489425 (repaired) How can I go about diagnosing the cause of this problem, on a Debian-family Linux box?
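    When loss and latency come and go in bursts like this, it helps to log a timestamp for every spike so the spikes can be correlated with traffic levels, conntrack counters, or dmesg events on the load balancer (on the box itself, mtr or smokeping would give similar data with less effort). Purely as a hedged sketch, a small monitor in C# with a hypothetical host name; the threshold and interval values are assumptions:

      using System;
      using System.Net.NetworkInformation;
      using System.Threading;

      class LatencyLogger
      {
          static void Main()
          {
              var host = "loadbalancer.example.com";   // hypothetical; use the public IP on bond0
              using (var ping = new Ping())
              {
                  while (true)
                  {
                      try
                      {
                          // 2 s timeout; log only timeouts and spikes above 100 ms.
                          PingReply reply = ping.Send(host, 2000);
                          if (reply.Status != IPStatus.Success || reply.RoundtripTime > 100)
                              Console.WriteLine($"{DateTime.UtcNow:O} {reply.Status} {reply.RoundtripTime} ms");
                      }
                      catch (PingException ex)
                      {
                          Console.WriteLine($"{DateTime.UtcNow:O} ping failed: {ex.Message}");
                      }
                      Thread.Sleep(1000);
                  }
              }
          }
      }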

    Read the article

  • What is the best way to build a database from an MS Word document?

    - by Jayron Soares
    Please advise me on how to approach this problem: I have a sequential list of metadata in an MS Word document. The basic idea is to write a Python algorithm that iterates over the information and retrieves just the name of the PROCESS when a query is made against the database. Example metadata: Process: Process Walker (1965) Exact reference: Walker Process Equipment, Inc. v. Food Machinery Corp. Link: http://caselaw.lp.findlaw.com/scripts/getcase.pl?court=US&vol=382&invol= Type of procedure: Certiorari to the United States Court of Appeals for the Seventh Circuit. Parties: Walker Process Equipment, Inc. Sector: Systems is ... Start Date: Argued October 12-13, 1965 Summary: Food Machinery Company has initiated a process to stop or slow the entry of competitors through the use of a patent obtained by fraud. The case concerned a patent on "knee action swing diffusers" used in aeration equipment for sewage treatment systems, and the question was whether "the maintenance and enforcement of a patent obtained by fraud before the patent office" may be a basis for antitrust punishment. Report of the evolution process: petitioner, in answer to respond... Importance: a) First case which established an analysis for the diagnosis of dispute… There are about 200 pages containing information like the above. I have in mind implementing an algorithm in Python that can break this information sequence apart and store it in a web database (an open-source application that I'm looking for) in order to allow free consultation of the data.
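    As a hedged sketch of the parsing step only: read the exported text, split it into records at each "Process:" marker, and capture the "Label: value" pairs per record before loading them into whatever database is chosen. The post mentions Python; the splitting logic is the same in any language and is shown here in C# to match the other code on this page, with a hypothetical input path and only single-line values captured.

      using System;
      using System.Collections.Generic;
      using System.IO;
      using System.Text.RegularExpressions;

      class MetadataParser
      {
          static void Main()
          {
              // Assumes the Word document was first saved as plain text (hypothetical path).
              string text = File.ReadAllText(@"C:\data\cases.txt");

              // Each record starts with "Process:"; split on that marker.
              string[] records = Regex.Split(text, @"(?=^Process:)", RegexOptions.Multiline);

              // Captures "Label: value" pairs; multi-line values (e.g. Summary) keep only their first line here.
              var fieldPattern = new Regex(@"^(?<key>[\w ]+):\s*(?<value>.*)$", RegexOptions.Multiline);

              foreach (string record in records)
              {
                  if (string.IsNullOrWhiteSpace(record)) continue;
                  var fields = new Dictionary<string, string>();
                  foreach (Match m in fieldPattern.Matches(record))
                      fields[m.Groups["key"].Value.Trim()] = m.Groups["value"].Value.Trim();

                  if (fields.TryGetValue("Process", out string name))
                      Console.WriteLine(name);   // e.g. "Process Walker (1965)"
              }
          }
      }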

    Read the article

  • How do I install MS Office 2010 via WINE?

    - by Emeris
    I am trying to install MS Office 2010 on Ubuntu 12.04 on my new MacBook Pro (15"). I have already read and followed every existing thread and tutorial on the forums, but my problem seems unique so far: whichever solution I try, the problem remains. When I launch PlayOnLinux, two boxes appear one after the other (before last week's Ubuntu upgrade, only the first one appeared). The first one tells me: "Error: PlayOnLinux is unable to find 32-bits OpenGL libraries. You might encounter problem with your games." When I close this window, a second one pops up, stating: "Error: PlayOnLinux cannot find 7z. You should install it to use PlayOnLinux." Of course, I tried purging PlayOnLinux (uninstalling and reinstalling it). I also tried other versions of PlayOnLinux. Nothing helps: the problem remains. I have not managed to install the 32-bit OpenGL libraries so far, since I have a Radeon graphics card (which seems to be unusual) and I just cannot find these libraries. Once the two error boxes are closed, PlayOnLinux is open but does not seem to work properly; when I try to install Microsoft Office 2010, nothing happens. Closing PlayOnLinux is even worse: Unity seems unable to close it (I even got a frozen screen when trying to xkill it from the terminal). I am looking forward to any tips that could help. P.S.: 01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Whistler [AMD Radeon HD 6600M Series]

    Read the article

  • I have 5 days of vouchers for MS training… help me choose? [closed]

    - by Shyatic
    I'm a Microsoft-centric guy (systems engineering side). I already know the syntax of VB, have done VBScript pretty extensively, and Excel VBA stuff as well. I want to make the leap into proper programming, probably with C#, because it teaches me syntax I can use for Java if I want to go that route at some point. Since I have vouchers for 5 days of programming training, and I can understand logic and how the .NET framework works, I would love to hear ideas on which MS courses I should take. My primary focus is to work on web applications with web services that interact and do neat stuff... like, for example, a 'chat' room or something interactive on the web. Or should I do something with HTML5/JS? I am really not sure... like I said, I want to work toward making web services/sites. Not the next Facebook, mind you, but I'd like to work towards something in that spectrum on a much smaller scale. Please give me any advice; I'd like to book these classes asap. Obviously getting involved with SQL and the other things I will require would be important here... you guys know better than me! Thanks!

    Read the article

  • New Training and Support Center Coming Soon!

    - by Ruth
    The CRM On Demand Training and Support Center is getting a face lift. In May 2010 we will unveil the new and improved layout, look and feel, and even some new content. Some of you told us loud and clear that you wanted an easier way to find our training courses and other important information. Well, here you are: Immediately you see the look and feel has changed and things have moved around a bit. You may ask, "How can I find the training catalog? Service requests? Downloads?" There are a few ways to find what you're looking for. You may use the search box to find training, quick guides, downloads, best practices, FAQs and more. You may also click the tabs or links in the blue bar, like Browse Training, to browse other documents and information. Here is a brief outline of the tabs and links that will help as you navigate this new tool:
    - The Support tab provides alerts and notifications specific to your application environment.
    - The Get Started tab is organized by role and contains links to resources aimed at helping you get the most out of your first 30 days with CRM On Demand.
    - The Learn More tab outlines information in key topic areas, like administration, integration, and reports. Go to this tab to get the resources you need to move beyond the basics.
    - The Release Information tab contains information specific to the current and upcoming releases of CRM On Demand. Access this tab to learn about and prepare for upgrades to your CRM On Demand application.
    - The Best Practices tab contains a compilation of knowledge gained by experts that work with CRM On Demand day in and day out. Access this knowledge to benefit from their vast experience.
    - The Communities tab offers connections to others in the CRM On Demand community through forums, communities, blogs, and more.
    - The Browse Training link opens the training catalog. Take a look at the instructor-led training, Webinars, quick guides, use cases, and tools available to you.
    - The Browse Knowledge link takes you to our knowledge base where you can get answers to frequently asked questions.
    - The Submit a Service Request link directs you to My Oracle Support where you can log a service request. The steps in that process have not changed.
    - The Web Services Library provides simple APIs and a link to Oracle Sample Code where you can get samples that can help you build custom integrations.
    - The Add-On Applications link allows access to our downloadable applications that allow you to extend the functionality of CRM On Demand.
    - The Templates and Tools link provides access to resources that can help you design and build CRM On Demand to meet your company's specific needs.
    A lot has changed and I know it is a lot to take in. To help you out, we have a printable quick guide that you can use during this transition. As always, let us know what you think: [email protected].

    Read the article

  • The Three-Legged Milk Stool - Why Oracle Fusion Incentive Compensation makes the difference!

    - by Richard Lefebvre
    During the London Olympics, we were exposed to dozens of athletes who worked with sports psychologists to maximize their performance. Executives often hire business psychologists to coach their teams to excellence. In the same vein, Fusion Incentive Compensation can be used to get people to change their sales behavior so we can make our numbers. But what about using incentive compensation solutions in a non-sales scenario to drive change? Recently, I was working an opportunity where a company was having a low user adoption rate for Salesforce.com, which was causing problems for them. I suggested they use Fusion Incentive Comp to change the reps' behavior. We tossed around the idea of tracking user adoption by creating a variable bonus for reps based on how well they forecasted revenues in the new system. Another thought was to reward the reps for how often they logged into the system or for the percentage of leads that became opportunities and turned into revenue. A new twist on a great product. Fusion CRM's Sweet Spot I'm excited about the sales performance management (SPM) tools in Fusion CRM. This trio of Incentive Compensation, Territory Management, and Quota Management sets us apart from the competition because Oracle is the only vendor that provides all three of these capabilities on a single tech stack, in a single application, and with a single look and feel. The niche vendors offer standalone territory or incentive compensation solutions, but then the customer has to custom build the other tools and can end up with a Frankenstein-type environment. On average, companies overpay sales commissions by three to eight percent. You calculate that number for a company the size of Oracle for one quarter and it makes a pretty air-tight financial case for using SPM tools to figure accurate commissions. Plus when sales reps get the right compensation, they can be out selling rather than spending precious time figuring out what they didn't get paid or looking for another job. And one more thing ... Oracle knows incentive comp. We have been a Gartner Market Scope leader in this space for the last five years. Our solution gets high marks because of its scalability and because of its interoperability with other technologies. And now that we're leading with Fusion, our incentive compensation offering includes the innovations that the Fusion team built, plus enhancements from the E-Business Suite Incentive Comp team. It's a case of making a good thing even better. (See product video.) The "Wedge" Apps In a number of accounts that I'm working on, there is a non-Oracle CRM system of record. That gives me the perfect opportunity to introduce the benefits of our SPM tools and to get the customer using Fusion. Then the door is wide open for the company to uptake more of Fusion CRM, especially since all the integrations they need are out of the box. I really believe that implementing this wedge of SPM tools is the ticket to taking market share away from other vendors. It allows us to insert ourselves in an environment where no other CRM solution in the market has the extending capabilities of Fusion. Not Just Your Usual Suspects Usually the stakeholders that I talk to for Territory Management are tightly aligned with the sales management team. When I sell the quota planning tool, I'm talking to finance people on the ERP side of the house who are measuring quotas and forecasting revenue. 
And then Incentive Comp is of most interest to the sales operations people, and generally these people roll up to either HR or the payroll department. I think of our Fusion SPM tools as a three-legged stool straddling an organization's Sales, Finance, and HR departments. So when you're prospecting for opportunities -- yes, people with a CRM perspective will be very interested -- but don't limit yourselves to that constituency. You might find stakeholders in accounting, revenue planning, or HR compensation teams. You just might discover, as I did at United Airlines, that the HR organization is spearheading the CRM project because incentive compensation is what they need ... and they're the ones with the budget. Jason Loh Global Solutions Manager, Fusion CRM Sales Planning Oracle Corporation

    Read the article

  • Coding With Windows Azure IaaS

    - by Hisham El-bereky
    This post focuses on some advanced programming topics concerned with IaaS (Infrastructure as a Service), which Windows Azure provides as virtual machines together with related resources such as virtual disks and virtual networks. Windows Azure started as a PaaS cloud platform, but because some business cases need full control over their virtual machines, Windows Azure has moved toward providing IaaS as well. Sometimes you will need to manage your cloud IaaS through code, for example when:
    - Working on a hyper-cloud system by providing a bursting connector to Windows Azure virtual machines
    - Providing a multi-tenant system which consumes Windows Azure virtual machines
    - Automating a process on your on-premises or cloud service which needs to utilize some virtual resources
    We are going to implement the following basic operations using C# code: List images, Create virtual machine, List virtual machines, Restart virtual machine, Delete virtual machine. Before implementing the above operations we need to prepare the client side and the Windows Azure subscription to communicate correctly by providing a management certificate (an X.509 v3 certificate) which permits client access to resources in your Windows Azure subscription; requests made using the Windows Azure Service Management REST API require authentication against a certificate that you provide to Windows Azure. More info about setting the management certificate is located here. To install the .cer on another client machine you will need the .pfx file; if it does not exist, export the .cer as a .pfx. Note: You will need to install .NET 4.5 on your machine to try the code. So let's start. This post builds on Michael Washam's post "Advanced Windows Azure IaaS – Demo Code"; I'm here to clarify some points and to add new operations which do not exist in Michael's demo. The basic C# class used here as the client to the Azure REST API for the IaaS service is HttpClient (it provides a base class for sending HTTP requests and receiving HTTP responses from a resource identified by a URI); this object must be initialized with the required data such as the certificate, headers, and content if required.
Also I'd like to refer here that the code is based on using Asynchronous programming with calls to azure which enhance the performance and gives us the ability to work with complex calls which depends on more than one sub-call to achieve some operation The following code explain how to get certificate and initializing HttpClient object with required data like headers and content HttpClient GetHttpClient() { X509Store certificateStore = null; X509Certificate2 certificate = null; try { certificateStore = new X509Store(StoreName.My, StoreLocation.CurrentUser); certificateStore.Open(OpenFlags.ReadOnly); string thumbprint = ConfigurationManager.AppSettings["CertThumbprint"]; var certificates = certificateStore.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false); if (certificates.Count > 0) { certificate = certificates[0]; } } finally { if (certificateStore != null) certificateStore.Close(); }   WebRequestHandler handler = new WebRequestHandler(); if (certificate!= null) { handler.ClientCertificates.Add(certificate); HttpClient httpClient = new HttpClient(handler); //And to set required headers lik x-ms-version httpClient.DefaultRequestHeaders.Add("x-ms-version", "2012-03-01"); httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/xml")); return httpClient; } return null; }  Let us keep the object httpClient as reference object used to call windows azure REST API IaaS service. For each request operation we need to define: Request URI HTTP Method Headers Content body (1) List images The List OS Images operation retrieves a list of the OS images from the image repository Request URI https://management.core.windows.net/<subscription-id>/services/images] Replace <subscription-id> with your windows Id HTTP Method GET (HTTP 1.1) Headers x-ms-version: 2012-03-01 Body None.  C# Code List<String> imageList = new List<String>(); //replace _subscriptionid with your WA subscription String uri = String.Format("https://management.core.windows.net/{0}/services/images", _subscriptionid);  HttpClient http = GetHttpClient(); Stream responseStream = await http.GetStreamAsync(uri);  if (responseStream != null) {      XDocument xml = XDocument.Load(responseStream);      var images = xml.Root.Descendants(ns + "OSImage").Where(i => i.Element(ns + "OS").Value == "Windows");      foreach (var image in images)      {      string img = image.Element(ns + "Name").Value;      imageList.Add(img);      } } More information about the REST call (Request/Response) located here on this link http://msdn.microsoft.com/en-us/library/windowsazure/jj157191.aspx (2) Create Virtual Machine Creating virtual machine required service and deployment to be created first, so creating VM should be done through three steps incase hosted service and deployment is not created yet Create hosted service, a container for service deployments in Windows Azure. A subscription may have zero or more hosted services Create deployment, a service that is running on Windows Azure. A deployment may be running in either the staging or production deployment environment. It may be managed either by referencing its deployment ID, or by referencing the deployment environment in which it's running. Create virtual machine, the previous two steps info required here in this step I suggest here to use the same name for service, deployment and service to make it easy to manage virtual machines Note: A name for the hosted service that is unique within Windows Azure. 
This name is the DNS prefix name and can be used to access the hosted service. For example: http://ServiceName.cloudapp.net// 2.1 Create service Request URI https://management.core.windows.net/<subscription-id>/services/hostedservices HTTP Method POST (HTTP 1.1) Header x-ms-version: 2012-03-01 Content-Type: application/xml Body More details about request body (and other information) are located here http://msdn.microsoft.com/en-us/library/windowsazure/gg441304.aspx C# code The following method show how to create hosted service async public Task<String> NewAzureCloudService(String ServiceName, String Location, String AffinityGroup, String subscriptionid) { String requestID = String.Empty;   String uri = String.Format("https://management.core.windows.net/{0}/services/hostedservices", subscriptionid); HttpClient http = GetHttpClient();   System.Text.ASCIIEncoding ae = new System.Text.ASCIIEncoding(); byte[] svcNameBytes = ae.GetBytes(ServiceName);   String locationEl = String.Empty; String locationVal = String.Empty;   if (String.IsNullOrEmpty(Location) == false) { locationEl = "Location"; locationVal = Location; } else { locationEl = "AffinityGroup"; locationVal = AffinityGroup; }   XElement srcTree = new XElement("CreateHostedService", new XAttribute(XNamespace.Xmlns + "i", ns1), new XElement("ServiceName", ServiceName), new XElement("Label", Convert.ToBase64String(svcNameBytes)), new XElement(locationEl, locationVal) ); ApplyNamespace(srcTree, ns);   XDocument CSXML = new XDocument(srcTree); HttpContent content = new StringContent(CSXML.ToString()); content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/xml");   HttpResponseMessage responseMsg = await http.PostAsync(uri, content); if (responseMsg != null) { requestID = responseMsg.Headers.GetValues("x-ms-request-id").FirstOrDefault(); } return requestID; } 2.2 Create Deployment Request URI https://management.core.windows.net/<subscription-id>/services/hostedservices/<service-name>/deploymentslots/<deployment-slot-name> <deployment-slot-name> with staging or production, depending on where you wish to deploy your service package <service-name> provided as input from the previous step HTTP Method POST (HTTP 1.1) Header x-ms-version: 2012-03-01 Content-Type: application/xml Body More details about request body (and other information) are located here http://msdn.microsoft.com/en-us/library/windowsazure/ee460813.aspx C# code The following method show how to create hosted service deployment async public Task<String> NewAzureVMDeployment(String ServiceName, String VMName, String VNETName, XDocument VMXML, XDocument DNSXML) { String requestID = String.Empty;     String uri = String.Format("https://management.core.windows.net/{0}/services/hostedservices/{1}/deployments", _subscriptionid, ServiceName); HttpClient http = GetHttpClient(); XElement srcTree = new XElement("Deployment", new XAttribute(XNamespace.Xmlns + "i", ns1), new XElement("Name", ServiceName), new XElement("DeploymentSlot", "Production"), new XElement("Label", ServiceName), new XElement("RoleList", null) );   if (String.IsNullOrEmpty(VNETName) == false) { srcTree.Add(new XElement("VirtualNetworkName", VNETName)); }   if(DNSXML != null) { srcTree.Add(new XElement("DNS", new XElement("DNSServers", DNSXML))); }   XDocument deploymentXML = new XDocument(srcTree); ApplyNamespace(srcTree, ns);   deploymentXML.Descendants(ns + "RoleList").FirstOrDefault().Add(VMXML.Root);     String fixedXML = deploymentXML.ToString().Replace(" xmlns=\"\"", ""); 
HttpContent content = new StringContent(fixedXML); content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/xml");   HttpResponseMessage responseMsg = await http.PostAsync(uri, content); if (responseMsg != null) { requestID = responseMsg.Headers.GetValues("x-ms-request-id").FirstOrDefault(); }   return requestID; } 2.3 Create Virtual Machine Request URI https://management.core.windows.net/<subscription-id>/services/hostedservices/<cloudservice-name>/deployments/<deployment-name>/roles <cloudservice-name> and <deployment-name> are provided as input from the previous steps Http Method POST (HTTP 1.1) Header x-ms-version: 2012-03-01 Content-Type: application/xml Body More details about request body (and other information) located here http://msdn.microsoft.com/en-us/library/windowsazure/jj157186.aspx C# code async public Task<String> NewAzureVM(String ServiceName, String VMName, XDocument VMXML) { String requestID = String.Empty;   String deployment = await GetAzureDeploymentName(ServiceName);   String uri = String.Format("https://management.core.windows.net/{0}/services/hostedservices/{1}/deployments/{2}/roles", _subscriptionid, ServiceName, deployment);   HttpClient http = GetHttpClient(); HttpContent content = new StringContent(VMXML.ToString()); content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/xml"); HttpResponseMessage responseMsg = await http.PostAsync(uri, content); if (responseMsg != null) { requestID = responseMsg.Headers.GetValues("x-ms-request-id").FirstOrDefault(); } return requestID; } (3) List Virtual Machines To list virtual machine hosted on windows azure subscription we have to loop over all hosted services to get its hosted virtual machines To do that we need to execute the following operations: listing hosted services listing hosted service Virtual machine 3.1 Listing Hosted Services Request URI https://management.core.windows.net/<subscription-id>/services/hostedservices HTTP Method GET (HTTP 1.1) Headers x-ms-version: 2012-03-01 Body None. More info about this HTTP request located here on this link http://msdn.microsoft.com/en-us/library/windowsazure/ee460781.aspx C# Code async private Task<List<XDocument>> GetAzureServices(String subscriptionid) { String uri = String.Format("https://management.core.windows.net/{0}/services/hostedservices ", subscriptionid); List<XDocument> services = new List<XDocument>();   HttpClient http = GetHttpClient();   Stream responseStream = await http.GetStreamAsync(uri);   if (responseStream != null) { XDocument xml = XDocument.Load(responseStream); var svcs = xml.Root.Descendants(ns + "HostedService"); foreach (XElement r in svcs) { XDocument vm = new XDocument(r); services.Add(vm); } }   return services; }  3.2 Listing Hosted Service Virtual Machines Request URI https://management.core.windows.net/<subscription-id>/services/hostedservices/<service-name>/deployments/<deployment-name>/roles/<role-name> HTTP Method GET (HTTP 1.1) Headers x-ms-version: 2012-03-01 Body None. 
More info about this HTTP request here http://msdn.microsoft.com/en-us/library/windowsazure/jj157193.aspx C# Code async public Task<XDocument> GetAzureVM(String ServiceName, String VMName, String subscriptionid) { String deployment = await GetAzureDeploymentName(ServiceName); XDocument vmXML = new XDocument();   String uri = String.Format("https://management.core.windows.net/{0}/services/hostedservices/{1}/deployments/{2}/roles/{3}", subscriptionid, ServiceName, deployment, VMName);   HttpClient http = GetHttpClient(); Stream responseStream = await http.GetStreamAsync(uri); if (responseStream != null) { vmXML = XDocument.Load(responseStream); }   return vmXML; }  So the final method which can be used to list all virtual machines is: async public Task<XDocument> GetAzureVMs() { List<XDocument> services = await GetAzureServices(); XDocument vms = new XDocument(); vms.Add(new XElement("VirtualMachines")); ApplyNamespace(vms.Root, ns); foreach (var svc in services) { string ServiceName = svc.Root.Element(ns + "ServiceName").Value;   String uri = String.Format("https://management.core.windows.net/{0}/services/hostedservices/{1}/deploymentslots/{2}", _subscriptionid, ServiceName, "Production");   try { HttpClient http = GetHttpClient(); Stream responseStream = await http.GetStreamAsync(uri);   if (responseStream != null) { XDocument xml = XDocument.Load(responseStream); var roles = xml.Root.Descendants(ns + "RoleInstance"); foreach (XElement r in roles) { XElement svcnameel = new XElement("ServiceName", ServiceName); ApplyNamespace(svcnameel, ns); r.Add(svcnameel); // not part of the roleinstance vms.Root.Add(r); } } } catch (HttpRequestException http) { // no vms with cloud service } } return vms; }  (4) Restart Virtual Machine Request URI https://management.core.windows.net/<subscription-id>/services/hostedservices/<service-name>/deployments/<deployment-name>/roles/<role-name>/Operations HTTP Method POST (HTTP 1.1) Headers x-ms-version: 2012-03-01 Content-Type: application/xml Body <RestartRoleOperation xmlns:i="http://www.w3.org/2001/XMLSchema-instance"> <OperationType>RestartRoleOperation</OperationType> </RestartRoleOperation>  More details about this http request here http://msdn.microsoft.com/en-us/library/windowsazure/jj157197.aspx  C# Code async public Task<String> RebootVM(String ServiceName, String RoleName) { String requestID = String.Empty;   String deployment = await GetAzureDeploymentName(ServiceName); String uri = String.Format("https://management.core.windows.net/{0}/services/hostedservices/{1}/deployments/{2}/roleInstances/{3}/Operations", _subscriptionid, ServiceName, deployment, RoleName);   HttpClient http = GetHttpClient();   XElement srcTree = new XElement("RestartRoleOperation", new XAttribute(XNamespace.Xmlns + "i", ns1), new XElement("OperationType", "RestartRoleOperation") ); ApplyNamespace(srcTree, ns);   XDocument CSXML = new XDocument(srcTree); HttpContent content = new StringContent(CSXML.ToString()); content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/xml");   HttpResponseMessage responseMsg = await http.PostAsync(uri, content); if (responseMsg != null) { requestID = responseMsg.Headers.GetValues("x-ms-request-id").FirstOrDefault(); } return requestID; }  (5) Delete Virtual Machine You can delete your hosted virtual machine by deleting its deployment, but I prefer to delete its hosted service also, so you can easily manage your virtual machines from code 5.1 Delete Deployment Request URI https://management.core.windows.net/< 
subscription-id >/services/hostedservices/< service-name >/deployments/<Deployment-Name> HTTP Method DELETE (HTTP 1.1) Headers x-ms-version: 2012-03-01 Body None. C# code async public Task<HttpResponseMessage> DeleteDeployment( string deploymentName) { string xml = string.Empty; String uri = String.Format("https://management.core.windows.net/{0}/services/hostedservices/{1}/deployments/{2}", _subscriptionid, deploymentName, deploymentName); HttpClient http = GetHttpClient(); HttpResponseMessage responseMessage = await http.DeleteAsync(uri); return responseMessage; }  5.2 Delete Hosted Service Request URI https://management.core.windows.net/<subscription-id>/services/hostedservices/<service-name> HTTP Method DELETE (HTTP 1.1) Headers x-ms-version: 2012-03-01 Body None. C# code async public Task<HttpResponseMessage> DeleteService(string serviceName) { string xml = string.Empty; String uri = String.Format("https://management.core.windows.net/{0}/services/hostedservices/{1}", _subscriptionid, serviceName); Log.Info("Windows Azure URI (http DELETE verb): " + uri, typeof(VMManager)); HttpClient http = GetHttpClient(); HttpResponseMessage responseMessage = await http.DeleteAsync(uri); return responseMessage; }  And the following is the method which can used to delete both of deployment and service async public Task<string> DeleteVM(string vmName) { string responseString = string.Empty;   // as a convention here in this post, a unified name used for service, deployment and VM instance to make it easy to manage VMs HttpClient http = GetHttpClient(); HttpResponseMessage responseMessage = await DeleteDeployment(vmName);   if (responseMessage != null) {   string requestID = responseMessage.Headers.GetValues("x-ms-request-id").FirstOrDefault(); OperationResult result = await PollGetOperationStatus(requestID, 5, 120); if (result.Status == OperationStatus.Succeeded) { responseString = result.Message; HttpResponseMessage sResponseMessage = await DeleteService(vmName); if (sResponseMessage != null) { OperationResult sResult = await PollGetOperationStatus(requestID, 5, 120); responseString += sResult.Message; } } else { responseString = result.Message; } } return responseString; }  Note: This article is subject to be updated Hisham  References Advanced Windows Azure IaaS – Demo Code Windows Azure Service Management REST API Reference Introduction to the Azure Platform Representational state transfer Asynchronous Programming with Async and Await (C# and Visual Basic) HttpClient Class
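    One helper used above but not shown in this excerpt is PollGetOperationStatus, which DeleteVM calls to wait for an asynchronous management operation to finish. As a hedged sketch only: it can be implemented against the Service Management "Get Operation Status" endpoint (https://management.core.windows.net/<subscription-id>/operations/<request-id>), polling until the returned Status leaves InProgress. The OperationResult and OperationStatus types below are assumptions introduced for illustration; _subscriptionid, GetHttpClient and ns are the same members used throughout the post.

      public enum OperationStatus { InProgress, Succeeded, Failed }

      public class OperationResult
      {
          public OperationStatus Status { get; set; }
          public string Message { get; set; }
      }

      async public Task<OperationResult> PollGetOperationStatus(String requestId, int pollIntervalSeconds, int timeoutSeconds)
      {
          var result = new OperationResult { Status = OperationStatus.InProgress };
          String uri = String.Format("https://management.core.windows.net/{0}/operations/{1}", _subscriptionid, requestId);
          HttpClient http = GetHttpClient();
          DateTime deadline = DateTime.UtcNow.AddSeconds(timeoutSeconds);

          while (DateTime.UtcNow < deadline)
          {
              Stream responseStream = await http.GetStreamAsync(uri);
              XDocument xml = XDocument.Load(responseStream);
              String status = xml.Root.Element(ns + "Status").Value;   // InProgress, Succeeded or Failed

              if (status == "Succeeded")
              {
                  result.Status = OperationStatus.Succeeded;
                  result.Message = "Operation " + requestId + " succeeded";
                  return result;
              }
              if (status == "Failed")
              {
                  result.Status = OperationStatus.Failed;
                  result.Message = xml.Root.ToString();   // includes the Error element returned by the API
                  return result;
              }
              await Task.Delay(TimeSpan.FromSeconds(pollIntervalSeconds));
          }
          result.Message = "Timed out waiting for operation " + requestId;
          return result;
      }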

    Read the article

  • More CPU cores may not always lead to better performance – MAXDOP and query memory distribution in spotlight

    - by sqlworkshops
    More hardware normally delivers better performance, but there are exceptions where it can hinder performance. Understanding these exceptions and working around them is a major part of SQL Server performance tuning. When a memory-allocating query executes in parallel, SQL Server distributes memory to each task that is executing part of the query in parallel. In our example the sort operator that executes in parallel divides the memory across all tasks, assuming an even distribution of rows. Common memory-allocating queries are those that perform Sort or Hash Match operations like Hash Join, Hash Aggregation, or Hash Union. In reality, how often are column values evenly distributed? Think about an example: are the employees working for your company distributed evenly across all Zip codes, or mainly concentrated in the headquarters? What happens when you sort a result set based on Zip codes? Do all products in the catalog sell equally, or are a few products the hot-selling items? One of my customers tested the below example on a 24 core server with various MAXDOP settings and here are the results:
    MAXDOP 1: CPU time = 1185 ms, elapsed time = 1188 ms
    MAXDOP 4: CPU time = 1981 ms, elapsed time = 1568 ms
    MAXDOP 8: CPU time = 1918 ms, elapsed time = 1619 ms
    MAXDOP 12: CPU time = 2367 ms, elapsed time = 2258 ms
    MAXDOP 16: CPU time = 2540 ms, elapsed time = 2579 ms
    MAXDOP 20: CPU time = 2470 ms, elapsed time = 2534 ms
    MAXDOP 0: CPU time = 2809 ms, elapsed time = 2721 ms (all 24 cores)
    In the above test, when the data was evenly distributed, the elapsed time of the parallel query was always lower than the serial query. Why does the query get slower and slower with more CPU cores / higher MAXDOP? Maybe you can answer this question after reading the article; let me know: [email protected]. Well, you get the point; let's see an example. The best way to learn is to practice. To create the below tables and reproduce the behavior, join the mailing list by using this link: www.sqlworkshops.com/ml and I will send you the table creation script. Let's update the Employees table with 49 out of 50 employees located in Zip code 2001. update Employees set Zip = EmployeeID / 400 + 1 where EmployeeID % 50 = 1 update Employees set Zip = 2001 where EmployeeID % 50 != 1 go update statistics Employees with fullscan go Let's create the temporary table #FireDrill with all possible Zip codes. drop table #FireDrill go create table #FireDrill (Zip int primary key) insert into #FireDrill select distinct Zip from Employees update statistics #FireDrill with fullscan go Let's execute the query serially with MAXDOP 1. --Example provided by www.sqlworkshops.com --Execute query with uneven Zip code distribution --First serially with MAXDOP 1 set statistics time on go declare @EmployeeID int, @EmployeeName varchar(48),@zip int select @EmployeeName = e.EmployeeName, @zip = e.Zip from Employees e inner join #FireDrill fd on (e.Zip = fd.Zip) order by e.Zip option (maxdop 1) go
    The query took 1011 ms to complete. The execution plan shows that 77816 KB of memory was granted while the estimated rows were 799624. No Sort Warnings in SQL Server Profiler. Now let's execute the query in parallel with MAXDOP 0.
--Example provided by www.sqlworkshops.com --Execute query with uneven Zip code distribution --In parallel with MAXDOP 0 set statistics time on go declare @EmployeeID int, @EmployeeName varchar(48),@zip int select @EmployeeName = e.EmployeeName, @zip = e.Zip from Employees e       inner join #FireDrill fd on (e.Zip = fd.Zip)       order by e.Zip option (maxdop 0) go The query took 1912 ms to complete.  The execution plan shows the 79360 KB of memory was granted while the estimated rows were 799624.  The estimated number of rows between serial and parallel plan are the same. The parallel plan has slightly more memory granted due to additional overhead. Sort properties shows the rows are unevenly distributed over the 4 threads.   Sort Warnings in SQL Server Profiler.   Intermediate Summary: The reason for the higher duration with parallel plan was sort spill. This is due to uneven distribution of employees over Zip codes, especially concentration of 49 out of 50 employees in Zip code 2001. Now let’s update the Employees table and distribute employees evenly across all Zip codes.   update Employees set Zip = EmployeeID / 400 + 1 go update statistics Employees with fullscan go  Let’s execute the query serially with MAXDOP 1. --Example provided by www.sqlworkshops.com --Execute query with uneven Zip code distribution --Serially with MAXDOP 1 set statistics time on go declare @EmployeeID int, @EmployeeName varchar(48),@zip int select @EmployeeName = e.EmployeeName, @zip = e.Zip from Employees e       inner join #FireDrill fd on (e.Zip = fd.Zip)       order by e.Zip option (maxdop 1) go   The query took 751 ms to complete.  The execution plan shows the 77816 KB of memory was granted while the estimated rows were 784707.  No Sort Warnings in SQL Server Profiler.   Now let’s execute the query in parallel with MAXDOP 0. --Example provided by www.sqlworkshops.com --Execute query with uneven Zip code distribution --In parallel with MAXDOP 0 set statistics time on go declare @EmployeeID int, @EmployeeName varchar(48),@zip int select @EmployeeName = e.EmployeeName, @zip = e.Zip from Employees e       inner join #FireDrill fd on (e.Zip = fd.Zip)       order by e.Zip option (maxdop 0) go The query took 661 ms to complete.  The execution plan shows the 79360 KB of memory was granted while the estimated rows were 784707.  Sort properties shows the rows are evenly distributed over the 4 threads. No Sort Warnings in SQL Server Profiler.    Intermediate Summary: When employees were distributed unevenly, concentrated on 1 Zip code, parallel sort spilled while serial sort performed well without spilling to tempdb. When the employees were distributed evenly across all Zip codes, parallel sort and serial sort did not spill to tempdb. This shows uneven data distribution may affect the performance of some parallel queries negatively. For detailed discussion of memory allocation, refer to webcasts available at www.sqlworkshops.com/webcasts.     Some of you might conclude from the above execution times that parallel query is not faster even when there is no spill. Below you can see when we are joining limited amount of Zip codes, parallel query will be fasted since it can use Bitmap Filtering.   Let’s update the Employees table with 49 out of 50 employees located in Zip code 2001. 
update Employees set Zip = EmployeeID / 400 + 1 where EmployeeID % 50 = 1 update Employees set Zip = 2001 where EmployeeID % 50 != 1 go update statistics Employees with fullscan go  Let’s create the temporary table #FireDrill with limited Zip codes. drop table #FireDrill go create table #FireDrill (Zip int primary key) insert into #FireDrill select distinct Zip       from Employees where Zip between 1800 and 2001 update statistics #FireDrill with fullscan go  Let’s execute the query serially with MAXDOP 1. --Example provided by www.sqlworkshops.com --Execute query with uneven Zip code distribution --Serially with MAXDOP 1 set statistics time on go declare @EmployeeID int, @EmployeeName varchar(48),@zip int select @EmployeeName = e.EmployeeName, @zip = e.Zip from Employees e       inner join #FireDrill fd on (e.Zip = fd.Zip)       order by e.Zip option (maxdop 1) go The query took 989 ms to complete.  The execution plan shows the 77816 KB of memory was granted while the estimated rows were 785594. No Sort Warnings in SQL Server Profiler.  Now let’s execute the query in parallel with MAXDOP 0. --Example provided by www.sqlworkshops.com --Execute query with uneven Zip code distribution --In parallel with MAXDOP 0 set statistics time on go declare @EmployeeID int, @EmployeeName varchar(48),@zip int select @EmployeeName = e.EmployeeName, @zip = e.Zip from Employees e       inner join #FireDrill fd on (e.Zip = fd.Zip)       order by e.Zip option (maxdop 0) go The query took 1799 ms to complete.  The execution plan shows the 79360 KB of memory was granted while the estimated rows were 785594.  Sort Warnings in SQL Server Profiler.    The estimated number of rows between serial and parallel plan are the same. The parallel plan has slightly more memory granted due to additional overhead.  Intermediate Summary: The reason for the higher duration with parallel plan even with limited amount of Zip codes was sort spill. This is due to uneven distribution of employees over Zip codes, especially concentration of 49 out of 50 employees in Zip code 2001.   Now let’s update the Employees table and distribute employees evenly across all Zip codes. update Employees set Zip = EmployeeID / 400 + 1 go update statistics Employees with fullscan go Let’s execute the query serially with MAXDOP 1. --Example provided by www.sqlworkshops.com --Execute query with uneven Zip code distribution --Serially with MAXDOP 1 set statistics time on go declare @EmployeeID int, @EmployeeName varchar(48),@zip int select @EmployeeName = e.EmployeeName, @zip = e.Zip from Employees e       inner join #FireDrill fd on (e.Zip = fd.Zip)       order by e.Zip option (maxdop 1) go The query took 250  ms to complete.  The execution plan shows the 9016 KB of memory was granted while the estimated rows were 79973.8.  No Sort Warnings in SQL Server Profiler.  Now let’s execute the query in parallel with MAXDOP 0.  --Example provided by www.sqlworkshops.com --Execute query with uneven Zip code distribution --In parallel with MAXDOP 0 set statistics time on go declare @EmployeeID int, @EmployeeName varchar(48),@zip int select @EmployeeName = e.EmployeeName, @zip = e.Zip from Employees e       inner join #FireDrill fd on (e.Zip = fd.Zip)       order by e.Zip option (maxdop 0) go The query took 85 ms to complete.  The execution plan shows the 13152 KB of memory was granted while the estimated rows were 784707.  No Sort Warnings in SQL Server Profiler.    
Here you see, parallel query is much faster than serial query since SQL Server is using Bitmap Filtering to eliminate rows before the hash join.   Parallel queries are very good for performance, but in some cases it can hinder performance. If one identifies the reason for these hindrances, then it is possible to get the best out of parallelism. I covered many aspects of monitoring and tuning parallel queries in webcasts (www.sqlworkshops.com/webcasts) and articles (www.sqlworkshops.com/articles). I suggest you to watch the webcasts and read the articles to better understand how to identify and tune parallel query performance issues.   Summary: One has to avoid sort spill over tempdb and the chances of spills are higher when a query executes in parallel with uneven data distribution. Parallel query brings its own advantage, reduced elapsed time and reduced work with Bitmap Filtering. So it is important to understand how to avoid spills over tempdb and when to execute a query in parallel.   I explain these concepts with detailed examples in my webcasts (www.sqlworkshops.com/webcasts), I recommend you to watch them. The best way to learn is to practice. To create the above tables and reproduce the behavior, join the mailing list at www.sqlworkshops.com/ml and I will send you the relevant SQL Scripts.   Register for the upcoming 3 Day Level 400 Microsoft SQL Server 2008 and SQL Server 2005 Performance Monitoring & Tuning Hands-on Workshop in London, United Kingdom during March 15-17, 2011, click here to register / Microsoft UK TechNet.These are hands-on workshops with a maximum of 12 participants and not lectures. For consulting engagements click here.   Disclaimer and copyright information:This article refers to organizations and products that may be the trademarks or registered trademarks of their various owners. Copyright of this article belongs to R Meyyappan / www.sqlworkshops.com. You may freely use the ideas and concepts discussed in this article with acknowledgement (www.sqlworkshops.com), but you may not claim any of it as your own work. This article is for informational purposes only; you use any of the suggestions given here entirely at your own risk.   Register for the upcoming 3 Day Level 400 Microsoft SQL Server 2008 and SQL Server 2005 Performance Monitoring & Tuning Hands-on Workshop in London, United Kingdom during March 15-17, 2011, click here to register / Microsoft UK TechNet.These are hands-on workshops with a maximum of 12 participants and not lectures. For consulting engagements click here.   R Meyyappan [email protected] LinkedIn: http://at.linkedin.com/in/rmeyyappan  

    Read the article

  • Partner Webcast - Innovations in Products Program

    - by Richard Lefebvre
    We are pleased to invite you to join the Innovations in Products –webcast. Innovations in Products will present Oracle Applications' Product's new functions and features including sales positioning. The key objectives of these webcasts are to inspire System Integrator's implementation personnel to conduct successful after sales in their Customer projects. Innovations in Products will be presented on the 1st Monday of each quarter after the billable day (4:00 to 5:00 PM CET). The webcast is intended for System Integrator's Implementation Certified Specialists but Innovations in Products is open for other interested Oracle Applications system Integrator's personnel as well. At first, two Oracle representatives will discuss Oracle's contribution to Partners. Then you will see product breakout session followed by Q&A with Oracle Experts. Each session will last for maximum 1 hour. A Q&A document covering all questions and answers will be made available after the webcast. What are the Benefits for partners? Find out how Innovations in Products helps you to improve your after sales Discover new functions and features so you can enrich your Customers's solution Learn more about Oracle Applications products, especially sales positioning Hear crucial questions raised by colleague alike, learn from their interest Engage and present your questions to subject experts Be inspired of the richness of Oracle Application portfolio – for your and your customer’s benefit Note: Should you already be familiar with a specific Product, then choose another one. Doing so you would expand your knowledge of the overall Applications portfolio. Some presentations contain product demonstration, although these presentations are not intended to be extremely detailed technical presentations. Note: At the latter part of this email you have also 17 links into the recent Applications Products presentations and 6 links into the Public Sector Value Proposition presentations that were presented in Innovations in Industries -program. Product breakout sessions: Topics Speaker To Register Fusion Applications Technology and Extensibility: A next-generation platform that adapts to client needs. Matthew Johnson, Sr. Director, SCM Product Development, EMEA CLICK HERE Fusion Applications - Transforming your Back-Office Accounting Function: Changing how people work in back office functions to drive value add Liam Nolan, Director, ERP Product Development, EMEA CLICK HERE Fusion HCM & Talent Overview & Extensibility: A more in-depth look into a personalized HCM solution Synco Jonkeren, Vice-President HCM Product Development & Management, EMEA CLICK HERE Fusion HCM Compensation Planning: Compensate To Compete Rosie Warner, Director, HCM Sales Development CLICK HERE Enterprise PLM for the Product Value Chain: Oracle Enterprise PLM offers Industry specific solutions that cover the Product Value Chain Ulf Köster, Sales Development Leader Enterprise PLM, Oracle Western Europe CLICK HERE Oracle's Asset Management and Maintenance Solution: What you need to know to successfully implement Oracle Asset Management solutions within Oracle Installed Base Philip Carey, Asset Management and Maintenance Solution Specialist CLICK HERE For more details please visit Innovations in Products and other breakout sessions on OPN page. Delivery Format Innovations in Products –program is a series of FREE prerecorded Applications product presentations followed by Q&A. It will be delivered over the Web. 
Participants have the opportunity to submit questions during the webcast via chat, and subject matter experts will provide verbal answers live. Innovations in Products consists of several parallel prerecorded product breakout sessions, each lasting a maximum of 1 hour. First, two Oracle representatives will discuss Oracle's contribution to Partners; then you'll see the product breakout sessions, followed by Q&A with Oracle experts. A Q&A document covering all questions and answers will be made available after the webcast. You can also watch Innovations in Products afterwards, as its content will remain available online for the next 6-12 months.

The next Innovations in Products webcasts will be presented as follows:
- July 2nd 2012
- October 1st 2012
- January 14th 2013
- April 8th 2013

Note: depending on local network bandwidth, please allow a few seconds for the presentations to download. You might want to refresh your screen by pressing F5. Duration: maximum 1 hour. For further information please contact me, Markku Rouhiainen.

Recent Innovations in Products presentations

Applications Products, presented on April the 2nd, 2012:
- Fusion CRM: Effective, Efficient and Easy. Speaker: James Penfold, Senior Director, Applications Product Development and Product Management. To register: CLICK HERE
- Fusion HCM: Talent management overview (performance, goals, talent review). Speaker: Jaime Losantos Viñolas, Director, HCM Sales Development. To register: CLICK HERE
- Distributed Order Management - Fusion SCM Solution. Speaker: Vikram K Singla, Business Development Director, Supply Chain Management Applications, UK. To register: CLICK HERE
- Oracle Transportation Management. Speaker: Dominic Regan, Senior Director Oracle Transportation Management EMEA. To register: CLICK HERE
- Oracle Value Chain Planning: Demantra Sales & Operation Planning and Demantra Demand Management. Speaker: Lionel Albert, Senior Director Value Chain Planning, EMEA. To register: CLICK HERE
- Oracle CX (Customer Experience) - formerly CEM: Powering Great Customer Experiences. Speaker: Maria Ramirez, CRM Presales Consultant, EPC. To register: CLICK HERE
- EPM 11.1.2.2 Overview. Speaker: Nicholas Cox, EMEA Sales Development Director - Enterprise Performance Management. To register: CLICK HERE
- Oracle Hyperion Profitability and Cost Management, 11.1.2.1. Speaker: Daniela Lazar, Senior EPM Sales Consultant, EPC. To register: CLICK HERE

January the 16th 2012:
- CRM / ATG: Best-in-Class CRM & Commerce. Speaker: Maria Ramirez, Associate CRM Presales Consultant, EPC. To register: CLICK HERE
- CRM / Automate Business Rules for Maximum Efficiency with OPA (Oracle Policy Automation). Speaker: Marco Nilo, Associate CRM Presales Consultant, EPC. To register: CLICK HERE
- CRM / InQuira. Speaker: Toby Baker, Principal Sales Consultant, CRM Product Specialist Team. To register: CLICK HERE
- EPM / Business Intelligence Foundation Suite - Sales and Product Updates. Speaker: Liviu Nitescu, Senior BI Sales Consultant, EPC. To register: CLICK HERE
- EPM / Hyperion Planning 11.1.2.1 - Sales & Product Updates. Speaker: Andreea Voinea, EPM Sales Consultant, EPC. To register: CLICK HERE
- ERP / JDE EnterpriseOne Fulfillment Management Overview. Speaker: Mirela Andreea Nasta, ERP Presales Consultant, EPC. To register: CLICK HERE
- ERP / Spotlights on iExpenses. Speaker: Elena Nita, ERP Presales Consultant, EPC. To register: CLICK HERE
- MDM / Master Data Management. Speaker: Martin Boyd, Senior Director Product Strategy. To register: CLICK HERE
- Product breakout session - Fusion Applications Human Capital Management. Speaker: Rosie Warner, Director, HCM Sales Development. To register: CLICK HERE

Recent Innovations in Industries Value Proposition presentations

January the 16th 2012:
- Process Modernisation. Speaker: Iemke Idsingh, Public Sector Solutions Director. To register: CLICK HERE
- Shared Services. Speaker: Ann Smith, Business Development Director, Shared Services. To register: CLICK HERE
- Strengthening Financial Discipline Whilst Delivering Cashable Savings. Speaker: Philippa Headley, UK Sales Development Director, Public Sector - EPM Solutions. To register: CLICK HERE
- Social Welfare Industry Solutions. Speaker: Christian Wernberg-Tougaard, Industry Director - Social Welfare. To register: CLICK HERE
- Police Industry Solutions. Speaker: Jeff Penrose, Solution Sales Director. To register: CLICK HERE
- Tax and Revenue Management Industry Solutions. Speaker: Andre van der Post, Global Director - Tax Solutions and Strategy. To register: CLICK HERE

    Read the article

  • Is there a setting in Exchange Server 2007 that we can set to make these headers propagate and be received by a POP/IMAP client?

    - by Ruruboy
    When sending email via Exchange Server 2007 with the EWS Managed API, I noticed that MAPI clients like MS Outlook display all custom headers, but when I use POP3/IMAP clients like MS Outlook Express, those custom headers do not appear in the opened message. Is there a setting in Exchange Server 2007 that makes these custom headers propagate so a POP/IMAP client receives them? Also, why do the custom headers in the example below show up in lower case in MAPI clients like MS Outlook? Surprisingly, if we use the SmtpClient class to send the email instead, the headers arrive with their original casing preserved. Example of headers received by a MAPI client like MS Outlook via Exchange Server 2007:

        Received: from EXMAILVS1.blabla.com ([192.168.191.136]) by cashtp02.blabla.com ([XXX.XXX.XX.XXX]) with mapi; Mon, 20 Dec 2010 12:17:05 -0800
        Content-Type: application/ms-tnef; name="winmail.dat"
        Content-Transfer-Encoding: binary
        From: asfsdf <[email protected]>
        To: asdsdf <[email protected]>
        Date: Mon, 20 Dec 2010 12:17:04 -0800
        Subject: Please send me this header
        Thread-Topic: Please send me this header
        Thread-Index: AQHLoILek7g5cFgHQU6lHHfiKkdUMg==
        Message-ID: <[email protected]>
        Accept-Language: en-US
        Content-Language: en-US
        X-MS-Has-Attach:
        X-MS-Exchange-Organization-SCL: -1
        X-MS-TNEF-Correlator: <[email protected]>
        customheader1: hello ali
        customheader2: hello Jace
        MIME-Version: 1.0
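    A quick way to check whether the custom headers are being stripped in transit or merely hidden by Outlook Express is to pull the raw message over IMAP and print every header it actually contains. This is a minimal sketch using Python's standard imaplib/email modules; the host, account and password are placeholders, not values from the question.

        import email
        import imaplib

        HOST, USER, PASSWORD = "mail.example.com", "user@example.com", "secret"  # placeholders

        with imaplib.IMAP4_SSL(HOST) as imap:
            imap.login(USER, PASSWORD)
            imap.select("INBOX")
            _, data = imap.search(None, "ALL")
            latest = data[0].split()[-1]            # newest message in the mailbox
            _, msg_data = imap.fetch(latest, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            for name, value in msg.items():         # prints the custom headers too, if they survived
                print(name + ":", value)

    If the custom headers appear in this raw dump but not in Outlook Express, the issue is client display rather than Exchange removing them.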

    Read the article

  • Cannot access certain URL on my wireless

    - by dehmann
    Problem: On my wireless network at home, there is one URL that I just cannot access with my browser: http://research.microsoft.com/ I have no problems with the Internet connection otherwise, but on that address I just get "The connection was reset - the connection to the server was reset while the page was loading." from Firefox. I am using a DSL modem (Westell) and a Linksys wireless router (using DHCP). When I use my neighbor's wireless connection I can access the Microsoft site without a problem. Additional technical details: with my connection, here is what I get from nslookup. It is weird: at first it cannot find the address, but after I look up another address it can find it:

        $ nslookup research.microsoft.com
        ;; connection timed out; no servers could be reached
        $ nslookup google.com
        Non-authoritative answer:
        Name: google.com
        Address: 72.14.204.104
        Name: google.com
        Address: 72.14.204.147
        Name: google.com
        Address: 72.14.204.99
        Name: google.com
        Address: 72.14.204.103
        $ nslookup research.microsoft.com
        Non-authoritative answer:
        Name: research.microsoft.com
        Address: 131.107.65.14

    But even after nslookup finds it, Firefox still cannot access it. Here is what traceroute says:

        $ traceroute http://research.microsoft.com/
        traceroute: Warning: http://research.microsoft.com/ has multiple addresses; using 8.15.7.117
        traceroute to http://research.microsoft.com/ (8.15.7.117), 64 hops max, 40 byte packets
        1 dslrouter.westell.com (1XX.XXX.X.X) 4.515 ms 2.760 ms 3.072 ms
        2 * * *

    Traceroute just to the IP:

        $ traceroute 131.107.65.14
        traceroute to 131.107.65.14 (131.107.65.14), 64 hops max, 40 byte packets
        1 dslrouter.westell.com (1XX.XXX.X.X) 11.912 ms 2.684 ms 2.808 ms
        2 * * *

    Comparison: traceroute to a google.com IP:

        $ traceroute 72.14.204.99
        traceroute to 72.14.204.99 (72.14.204.99), 64 hops max, 40 byte packets
        1 dslrouter.westell.com (1XX.XXX.X.X) 6.428 ms 6.981 ms 117.099 ms
        2 * * *

    Any comments / help?
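    If it helps to narrow things down, the two failure modes (name resolution failing vs. the TCP connection being reset) can be separated with a short script. This is a minimal diagnostic sketch using only the Python standard library; the 10-second timeout is an arbitrary choice, not something from the question.

        import socket
        import urllib.request

        host = "research.microsoft.com"

        # Step 1: DNS resolution only.
        try:
            addresses = sorted({info[4][0] for info in socket.getaddrinfo(host, 80)})
            print("resolved to:", addresses)
        except socket.gaierror as exc:
            print("DNS lookup failed:", exc)
            raise SystemExit(1)

        # Step 2: plain HTTP request with a short timeout.
        try:
            with urllib.request.urlopen("http://" + host + "/", timeout=10) as response:
                print("HTTP status:", response.status)
        except OSError as exc:  # urllib errors, connection resets and timeouts all derive from OSError
            print("HTTP request failed:", exc)

    A failure only in step 2 would point at the path or the router (for example an MTU problem) rather than at DNS.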

    Read the article

  • VPN - local and remote networks IP collision

    - by Guido García
    I have created a VPN connection in Windows using the New Network Connection wizard that comes with Windows. It works without problems in most places, but there is one particular place where, although the connection to the remote public IP works fine, it is not able to validate the login/password and establish the VPN connection. In this place the local network is 10.0.0.x (the same range I use in other places where I am able to connect). The remote network is 192.168.x.x, so I suspect there is some kind of IP collision, because before connecting, a traceroute to e.g. 192.168.0.40 does not fail:

        1    4 ms    1 ms    1 ms  LINKSYS [10.0.0.1]
        2    5 ms    1 ms    1 ms  172.26.27.1
        3    4 ms    5 ms    3 ms  192.168.1.100
        ... (more)

    I can't modify the local network beyond the first router (10.0.0.1). That is the only difference I've found so far. Any idea about how to solve it? Thank you.
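    The traceroute reaching 192.168.1.100 locally does suggest that part of the remote 192.168.x.x range is already routed on the local side before the tunnel comes up. A minimal way to check two ranges for this kind of collision is Python's ipaddress module; the two prefixes below are assumptions inferred from the addresses shown in the question, not values read from the routers.

        import ipaddress

        # Assumed prefixes: the 192.168.1.x hop seen in the local traceroute,
        # and the 192.168.x.x range used by the remote VPN network.
        local_route = ipaddress.ip_network("192.168.1.0/24")
        vpn_remote = ipaddress.ip_network("192.168.0.0/16")

        if local_route.overlaps(vpn_remote):
            print("Collision: the local side already routes part of the VPN address range.")
        else:
            print("No overlap between the two ranges.")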

    Read the article

  • IPv6 works only after ping to routing box

    - by Ficik
    Situation: there is an IPv4-only router in the network and every computer is connected to it (Wi-Fi or cable). A server with both IPv4 and IPv6 is connected to this router as well; the server has a tunnelbroker 6to4 tunnel and radvd configured. Clients in the network get the right prefix and can ping each other, but they can't ping the IPv6 internet until they first ping the server (the one with the tunnel). I found somewhere that it's an ICMP problem, but I couldn't find a solution. Is the IPv4-only router the problem? The server and clients run Linux; the router runs DD-WRT without IPv6 support :( Ping attempt:

        standa@standa-laptop:~$ ping6 ipv6.google.com
        PING ipv6.google.com(2a00:1450:8007::69) 56 data bytes
        ^C
        --- ipv6.google.com ping statistics ---
        29 packets transmitted, 0 received, 100% packet loss, time 28223ms

        standa@standa-laptop:~$ ping6 2001:470:XXXX:XXXX:21c:c0ff:fe2b:6478
        PING 2001:470:XXXX:XXXX:21c:c0ff:fe2b:6478(2001:470:XXXX:XXXX:21c:c0ff:fe2b:6478) 56 data bytes
        64 bytes from 2001:470:XXXX:XXXX:21c:c0ff:fe2b:6478: icmp_seq=1 ttl=64 time=3.55 ms
        64 bytes from 2001:470:XXXX:XXXX:21c:c0ff:fe2b:6478: icmp_seq=2 ttl=64 time=0.311 ms
        64 bytes from 2001:470:XXXX:XXXX:21c:c0ff:fe2b:6478: icmp_seq=3 ttl=64 time=0.269 ms
        64 bytes from 2001:470:XXXX:XXXX:21c:c0ff:fe2b:6478: icmp_seq=4 ttl=64 time=0.292 ms
        ^C
        --- 2001:470:XXXX:XXXX:21c:c0ff:fe2b:6478 ping statistics ---
        4 packets transmitted, 4 received, 0% packet loss, time 3000ms
        rtt min/avg/max/mdev = 0.269/1.107/3.559/1.415 ms

        standa@standa-laptop:~$ ping6 ipv6.google.com
        PING ipv6.google.com(2a00:1450:8007::69) 56 data bytes
        64 bytes from 2a00:1450:8007::69: icmp_seq=1 ttl=57 time=20.7 ms
        64 bytes from 2a00:1450:8007::69: icmp_seq=2 ttl=57 time=20.2 ms
        64 bytes from 2a00:1450:8007::69: icmp_seq=3 ttl=57 time=23.4 ms
        ^C
        --- ipv6.google.com ping statistics ---
        3 packets transmitted, 3 received, 0% packet loss, time 2001ms
        rtt min/avg/max/mdev = 20.267/21.479/23.413/1.392 ms
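    Not a diagnosis, but for comparison: a tunnel box in this role generally needs two pieces in place, IPv6 forwarding enabled and router advertisements on the LAN-facing interface. The fragments below are a generic configuration sketch only; the interface name and prefix are placeholders rather than values from this network.

        # /etc/sysctl.conf on the tunnel server: enable IPv6 forwarding
        net.ipv6.conf.all.forwarding = 1

        # /etc/radvd.conf: advertise the routed /64 on the LAN-facing interface
        interface eth0
        {
            AdvSendAdvert on;
            prefix 2001:470:xxxx:xxxx::/64
            {
                AdvOnLink on;
                AdvAutonomous on;
            };
        };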

    Read the article

  • Problems setting up VLC server/client streaming

    - by Ayos
    I'm trying to set up a Linux machine as the server and a Windows XP machine as the client. Both machines are connected to the same local network via a Wi-Fi router. I set up the stream with the following properties: HTTP stream, port 8080, play locally, and not much else. There is no firewall on the Windows client (Windows Firewall is disabled). When I try to open the network stream from the client machine (using VLC or Windows Media Player) I get the following errors:

        Media Player error code: 0xC00D11B3: Encountered a network problem.

        VLC console:
        main warning: connection timed out
        access_mms error: cannot connect to 192.168.1.3:8080
        main debug: no access module matching "http" could be loaded
        main debug: TIMER module_need() : 12625.810 ms - Total 12625.810 ms / 1 intvls (Avg 12625.809 ms)
        main error: open of `http://192.168.1.3:8080' failed
        main debug: dead input
        main debug: repeating item
        main debug: starting playback of the new playlist item
        main debug: resyncing on http://192.168.1.3:8080
        main debug: http://192.168.1.3:8080 is at 0
        main debug: creating new input thread
        main debug: Creating an input for 'http://192.168.1.3:8080'
        main debug: using timeshift granularity of 50 MiB, in path 'C:\DOCUME~1\Accer\LOCALS~1\Temp'
        main debug: `http://192.168.1.3:8080' gives access `http' demux `' path `192.168.1.3:8080'
        main debug: creating demux: access='http' demux='' location='192.168.1.3:8080' file='\\192.168.1.3:8080'
        main debug: looking for access_demux module: 0 candidates
        main debug: no access_demux module matched "http"
        main debug: TIMER module_need() : 0.461 ms - Total 0.461 ms / 1 intvls (Avg 0.461 ms)
        main debug: creating access 'http' location='192.168.1.3:8080', path='\\192.168.1.3:8080'
        main debug: looking for access module: 2 candidates
        access_http debug: http: server='192.168.1.3' port=8080 file=''
        main debug: net: connecting to 192.168.1.3 port 8080
        qt4 debug: IM: Deleting the input
        main debug: TIMER input launching for 'http://192.168.1.3:8080' : 13397.979 ms - Total 13397.979 ms / 1 intvls (Avg 13397.978 ms)
        qt4 debug: IM: Setting an input

    Need help. Thanks in advance.

    Read the article

  • How to add Windows Defender (MS Essentials) to the Windows Explorer right-click menu to scan a particular drive/folder on demand?

    - by avirk
    Windows 8 now has a built-in antivirus in Windows Defender; I call it an antivirus because it includes the Microsoft Security Essentials functionality as well. However, there is no option to scan a particular drive on demand by right-clicking it in Windows Explorer, as we had in Windows 7 with MS Essentials or with other antiviruses. I know we can run a custom scan for a particular drive or folder, but that process is lengthy and time-consuming. This guide explains how to add Windows Defender to the desktop right-click menu, so I'm curious whether there is a way to add it to the Windows Explorer right-click menu to launch a scan whenever I need to.
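    One approach worth noting, as a sketch rather than a tested recipe: Windows Defender ships with a command-line scanner, MpCmdRun.exe, and Explorer right-click entries for drives can be added as shell verbs in the registry. In the fragment below the verb name ScanWithDefender is arbitrary and the path assumes the default Windows Defender install location; back up the keys before experimenting.

        Windows Registry Editor Version 5.00

        ; Hypothetical verb "ScanWithDefender" added to the right-click menu of drives.
        ; A similar key under HKEY_CLASSES_ROOT\Directory\shell would cover folders.
        [HKEY_CLASSES_ROOT\Drive\shell\ScanWithDefender]
        @="Scan with Windows Defender"

        [HKEY_CLASSES_ROOT\Drive\shell\ScanWithDefender\command]
        @="\"C:\\Program Files\\Windows Defender\\MpCmdRun.exe\" -Scan -ScanType 3 -File \"%1\""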

    Read the article

  • Is MS Forefront Add-in for Exchange server detecting HTML/Redirector.C incorrectly?

    - by rhart
    Users of a website hosted by our organization occasionally send complaints that our registration confirmation emails are infected with HTML/Redirector.C. They are always using an MS Exchange Server with the MS Forefront for Exchange AV add-in. The thing is, I don't think the detection is legitimate. I think the issue is that the link in the email we send causes a redirect. I should point out that this is done for a legitimate purpose. :) Has anybody run into this before? Naturally, Microsoft provides absolutely no good information on this one: http://www.microsoft.com/security/portal/Threat/Encyclopedia/Entry.aspx?Name=Trojan%3aHTML%2fRedirector.C&ThreatID=-2147358338 I can't find any other explanation of HTML/Redirector.C on the Internet either. If anyone knows of a real description for this virus that would be greatly appreciated as well.

    Read the article
