Search Results

Search found 23347 results on 934 pages for 'salesforce service cloud'.


  • ArchBeat Link-o-Rama for November 15, 2012

    - by Bob Rhubart
    WLST Starting and Stopping a WebLogic Environment | Rene van Wijk
    Oracle ACE Rene van Wijk explores how to start a server with as little input as possible.

    Developing and Enforcing a BYOD Policy | Darin Pendergraft
    Darin Pendergraft's post includes links to a recent Mobile Access Policy Survey by SANS, as well as registration information for a Nov 15 webcast featuring security expert Tony DeLaGrange from Secure Ideas; SANS instructor, attorney and technology law expert Ben Wright; and Oracle IDM product manager Lee Howarth.

    Cloud Integration White Paper Now Available | Bruce Tierney
    Bruce Tierney shares an overview of "Cloud Integration - A Comprehensive Solution," a new white paper he co-authored with David Baum, Rajesh Raheja, and Vijay Pawar.

    My iPad & This Cloud Thing | Floyd Teter
    Oracle ACE Director Floyd Teter explains why the Cloud is making it possible for him to use his iPad for tasks previously relegated to his laptop, and why this same scenario is likely to play out for a great many people.

    3 steps to a cloud database strategy that works | InfoWorld
    "Every day, cloud-based databases add more features, decrease in cost, and become better at handling prime-time business," says InfoWorld blogger David Linthicum. "However, enterprise IT is reluctant to move data to public clouds, citing the tried-and-true excuses of security, privacy, and compliance. Although some have valid points, their reasons often boil down to 'I don't wanna.'"

    Oracle VM Templates for EBS 12.1.3 for Exalogic Now Available | Elke Phelps
    "The templates contain all the required elements to create an Oracle E-Business Suite R12 demonstration system on an Exalogic server," says Elke Phelps. "You can use these templates to quickly build an EBS 12.1.3 demonstration environment, bypassing the operating system and the software install (via the EBS Rapid Install)."

    Thought for the Day
    "A good plan executed today always beats a perfect plan executed tomorrow." — George S. Patton (November 11, 1885 - December 21, 1945)
    Source: SoftwareQuotes.com
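    On that first item: one standard way to make a WebLogic server start with no input at all is to pre-create a boot.properties file, so the start script never prompts for credentials. A minimal sketch, assuming a default domain layout (the domain path, server name, and credentials below are placeholders, not values from the article):

        # Drop a boot-identity file into the server's security directory
        mkdir -p $DOMAIN_HOME/servers/AdminServer/security
        printf 'username=weblogic\npassword=welcome1\n' > $DOMAIN_HOME/servers/AdminServer/security/boot.properties
        # WebLogic encrypts the file on first use; the server now boots unattended
        $DOMAIN_HOME/bin/startWebLogic.sh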

    Read the article

  • ArchBeat Link-o-Rama for 11/18/2011

    - by Bob Rhubart
    IT executives taking lead role with both private and public cloud projects: survey | Joe McKendrick
    "The survey, conducted among members of the Independent Oracle Users Group, found that both private and public cloud adoption are up—30% of respondents report having limited-to-large-scale private clouds, up from 24% only a year ago. Another 25% are either piloting or considering private cloud projects. Public cloud services are also being adopted for their enterprises by more than one out of five respondents." - Joe McKendrick

    SOA all the Time; Architects in AZ; Clearing Info Integration Hurdles
    This week on the Architect Home Page on OTN.

    OIM 11g OID (LDAP) Groups Request-Based Provisioning with custom approval – Part I | Alex Lopez
    In part one of a two-part blog post, Alex Lopez illustrates "an implementation of a Custom Approval process and a Custom UI based on ADF to request entitlements for users which in turn will be converted to Group memberships in OID."

    ArchBeat Podcast Information Integration - Part 3/3
    "Oracle Information Integration, Migration, and Consolidation" author Jason Williamson, co-author Tom Laszewski, and book contributor Marc Hebert talk about upcoming projects and about what they've learned in writing their book.

    InfoQ: Enterprise Shared Services and the Cloud | Ganesh Prasad
    As an industry, we have converged on a standard three-layered service model (IaaS, PaaS, SaaS) to describe cloud computing, with each layer defined in terms of the operational control capabilities it offers. This is unlike enterprise shared services, which have unique characteristics around ownership, funding and operations, and which span the SaaS and PaaS layers. Ganesh Prasad explores the differences.

    Stress Testing Oracle ADF BC Applications - Do Connection Pooling and TXN Disconnect Level
    Oracle ACE Director Andrejus Baranovskis describes "how jbo.doconnectionpooling = true and jbo.txn.disconnect_level = 1 properties affect ADF application performance."

    Exploring TCP throughput with DTrace | Alan Maguire
    "According to the theory," says Maguire, "when the number of unacknowledged bytes for the connection is less than the receive window of the peer, the path bandwidth is the limiting factor for throughput."

    Read the article

  • ArchBeat Link-o-Rama for 2012-06-21

    - by Bob Rhubart
    Software Architects Need Not Apply | Dustin Marx
    "I think there is a place for software architecture," says Dustin Marx, "but a portion of our fellow software architects have harmed the reputation of the discipline." For another angle on this subject, check out "Out of the Tower, Into the Trenches" from the Nov/Dec edition of Oracle Magazine.

    Oracle Data Integrator 11g - Faster Files | David Allan
    David Allan illustrates "a big step for regular file processing on the way to super-charging big data files using Hadoop."

    2012 Oracle Fusion Middleware Innovation Awards - Win a FREE Pass to Oracle OpenWorld 2012 in SF
    Share your use of Oracle Fusion Middleware solutions and how they help your organization drive business innovation. You just might win a free pass to Oracle OpenWorld 2012 in San Francisco. Deadline for submissions is July 17, 2012.

    WLST Domain creation using dry-run | Michel Schildmeijer
    What to do "if you want to browse through your domain to check if settings you want to apply satisfy your requirements."

    Cloud opens up new vistas for service orientation at Netflix | Joe McKendrick
    "Many see service oriented architecture as laying the groundwork for cloud. But at one well-known company, cloud has instigated the move to SOA."

    How to avoid the Portlet Skin mismatch | Martin Deh
    Detailed how-to from WebCenter A-Team blogger Martin Deh.

    Internationalize WebCenter Portal - Content Presenter | Stefan Krantz
    Stefan Krantz explains "how to get Content Presenter and its editorials to comply with the current selected locale for the WebCenter Portal session."

    Oracle Public Cloud Architecture | Tyler Jewell
    Tyler Jewell discusses the multi-tenancy model and elasticity solution implemented by Oracle Cloud in this QCon presentation.

    A Distributed Access Control Architecture for Cloud Computing
    The authors of this InfoQ article discuss a distributed architecture based on principles from security management and software engineering.

    Thought for the Day
    "Let us change our traditional attitude to the construction of programs. Instead of imagining that our main task is to instruct a computer what to do, let us concentrate rather on explaining to human beings what we want a computer to do." — Donald Knuth
    Source: Quotes for Software Engineers

    Read the article

  • Additional new content SOA Partner Community

    - by JuergenKress
    Oracle Reference Architecture: Application Infrastructure Foundation
    One of the earliest additions to the IT Strategies from Oracle library, this paper describes the concepts and capabilities of the application infrastructure and defines the platform on which solutions are built. Read it.

    Scaling Service Oriented Architecture
    What is scaling, and what does it mean to a service oriented architecture? Author Philip Wik explores those issues and proposes Oracle-based solutions to SOA scaling and a SOA scaling roadmap. Read it.

    SOA, Cloud, and Service Technologies: A Conversation with Thomas Erl
    Thomas Erl, the world's best-selling SOA author, is joined by Oracle SOA experts Tim Hall and Demed L'Her for a wide-ranging four-part conversation on the evolution of SOA and the emergence of the architect in the era of cloud computing. Listen to the Podcast & Read a Transcript.

    Cloud e-book
    Invite your customers to download this Cloud e-book, packed with multi-media resources on the value of Oracle Cloud computing.

    Assessment: Are you Leading or Lagging when it comes to SOA and BPM?
    Take the online SOA Assessment and BPM Assessment.

    New Collateral:
    - Whitepaper Series: The Promise of BPM Technology for Financial Services Institutions - Resource Kit
    - Whitepaper: Reaching Process Excellence with Process Accelerators - PDF
    - Demystifying Cloud Integration: Whitepapers, webcasts, and customer case studies - Resource Kit
    - Whitepaper: Leveraging Governance to sustain Enterprise Architecture - PDF
    - Article: Rethink SOA: A Recipe for Business Transformation - Article
    - Oracle SOA Resource Kit
    - Oracle SOA Governance Resource Kit
    - Oracle BPM Resource Kit

    SOA & BPM Partner Community
    For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community. For registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.
    Technorati Tags: SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • Why Wouldn't Root Be Able to Change a Zone's IP Address in Oracle Solaris 11?

    - by rickramsey
    You might assume that if you have root access to an Oracle Solaris zone, you'd be able to change the zone's IP address. If so, you'd proceed along these lines...

    First, you'd log in:
    root@global_zone:~# zlogin user-zone

    Then you'd remove the IP interface:
    root@user-zone:~# ipadm delete-ip vnic0

    Next, you'd create a new IP interface:
    root@user-zone:~# ipadm create-ip vnic0

    Then you'd assign the IP interface a new IP address (10.0.0.10):
    root@user-zone:~# ipadm create-addr -a local=10.0.0.10/24 vnic0/v4
    ipadm: cannot create address: Permission denied

    Why would that happen? Here are some potential reasons:
    - You're in the wrong zone.
    - Nobody bothered to tell you that you were fired last week.
    - The sysadmin for the global zone (probably your ex-girlfriend) enabled link protection mode on the zone with this sweet little command:
    root@global_zone:~# dladm set-linkprop -p protection=mac-nospoof,restricted,ip-nospoof vnic0

    How'd your ex-girlfriend learn to do that? By reading this article: Securing a Cloud-Based Data Center with Oracle Solaris 11, by Orgad Kimchi, Ron Larson, and Richard Friedman. When you build a private cloud, you need to protect sensitive data not only while it's in storage, but also during transmission between servers and clients, and when it's being used by an application. When a project is completed, the cloud must securely delete sensitive data and make sure the original data is kept secure. These are just some of the many security precautions a sysadmin needs to take to secure data in a cloud infrastructure. Orgad, Ron, and Richard explain the rest and show you how to employ the security features in Oracle Solaris 11 to protect your cloud infrastructure. It's Part 2 of a three-part article on cloud deployments that use the Oracle Solaris Remote Lab as a case study.

    About the Photograph
    That's the fence separating a small group of tourist cabins from a pasture in the small town of Tropic, Utah.
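    A side note for the global-zone administrator: a minimal sketch of how to inspect that protection setting and, if appropriate, put it back to its default (assuming the VNIC is named vnic0, as above):

        # Show the current link-protection values on the zone's VNIC
        root@global_zone:~# dladm show-linkprop -p protection vnic0
        # Reset the property to its default, removing the spoofing protections
        root@global_zone:~# dladm reset-linkprop -p protection vnic0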

    Read the article

  • Can't Miss Event: Oracle Coherence 12c Launch Webcast

    - by jeckels
    We're super-excited around here about the impending launch of Oracle Coherence 12c as part of the Cloud Application Foundation launch this month! We want you to join us for the Cloud Application Foundation launch event to learn more about Coherence's ability to deliver applications with a mission-critical cloud platform, enhance deployment options for high availability, and simplify operations with integrated products and management. Scale your applications to meet mobile and cloud demands!

    Oracle Cloud Application Foundation Launch
    Including Oracle WebLogic Server, Oracle Coherence, Oracle Enterprise Manager and Oracle Development Tools
    July 31st, 2013, 10am Pacific Time
    >> Register now! (of course, it's free)

    This will be the first release of Coherence we're making available at the same time as an Oracle WebLogic Server release - and that's not a coincidence. One of the main focus areas of this launch is the operational simplicity that we want you to enjoy, and that includes a tight integration not only with WebLogic Server itself, but also with cloud management tools (Enterprise Manager) and developer technologies - like JDeveloper, Eclipse tools, ADF Mobile and more - to ensure you can be productive out of the box on day one. The word is, there are even some heavy-duty capabilities Coherence will be delivering around real-time data processing, elastic scalability, developer technology friendliness and even some deep integration with Oracle Database 12c, which is launching on July 10th. But, we're already giving away too much. We look forward to seeing you there!

    Read the article

  • How to document requirements for an API systematically?

    - by Heinrich
    I am currently working on a project where I have to analyze the requirements that two given IT systems, both of which use cloud computing, have for a Cloud API. In other words, I have to analyze what requirements these systems have for a Cloud API, such that they would be able to switch to it while still being able to accomplish their current goals. Let me give you an example of some informal requirements for Project A:

    - When starting virtual machines in the cloud through the API, it must be possible to specify the memory size, CPU type, operating system and an SSH key for the root user.
    - It must be possible to monitor the inbound and outbound network traffic per hour per virtual machine.
    - The API must support the assignment of public IPs to a virtual machine and the retrieval of those public IPs.
    - ...

    In a later stage of the project I will analyze some cloud computing standards that standardize cloud APIs, to find out where possible shortcomings in the current standards lie. A finding could (and probably will) be that a certain standard does not support monitoring resource usage and thus is not currently usable. I am currently trying to find a way to systematically write down and classify my requirements. I feel that the way I currently have them written down (like the three points above) is too informal. I have read a couple of requirements engineering and software architecture books, but they all focus too much on details and implementation. I really only care about the functionality provided through the API/interface, and I don't think UML diagrams etc. are the right choice for me. I think the requirements I have collected so far can be described as user stories, but is that already enough for a sophisticated requirements analysis? Probably I should go "one level deeper"... Any advice/learning resources for me?

    Read the article

  • ArchBeat Link-o-Rama for December 12, 2012

    - by Bob Rhubart
    "Cloud Integration in Minutes" – True or False? | Bruce Tierney
    The answer is "True, but..." according to Bruce Tierney. "Connecting on-premise and cloud applications 'in minutes' is true…provided you only consider the connectivity subset of integration and have a small number of cloud integration touch points." Get the rest of the story in Bruce's detailed post.

    Tech World Discovers New Species: The Cloud Architect | Wired Enterprise | Wired.com
    This Wired article by Cade Metz boils down to one essential conclusion: cloud computing is a significant departure from "data center designs of the past," and the demand for the specialized skills of the cloud architect will only increase. But you already knew that, right?

    Oracle B2B - Synchronous Request Reply | A-Team - SOA
    "Beginning with Oracle SOA Suite PS5 (11.1.1.6), B2B supports synchronous request reply over http using the b2b/syncreceiver servlet," says C. D. Wright of the Fusion Middleware A-Team. His post includes a demo and everything you need to run it.

    Thought for the Day
    "Don't worry about what anybody else is going to do… The best way to predict the future is to invent it." — Alan Kay
    Source: SoftwareQuotes.com

    Read the article

  • Failed to install ntp via apt-get in Debian

    - by Petah
    When trying to install ntp (because my server clock is wrong), it just pukes this massive error. Any idea how to fix this?

    root@pan-prodweb01:~# apt-get install ntp
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    ntp is already the newest version.
    0 upgraded, 0 newly installed, 0 to remove and 75 not upgraded.
    1 not fully installed or removed.
    After this operation, 0 B of additional disk space will be used.
    Do you want to continue [Y/n]? Y
    Setting up ntp (1:4.2.6.p2+dfsg-1+b1) ...
    insserv: warning: script 'S99obmaua' missing LSB tags and overrides
    insserv: warning: script 'S99obmscheduler' missing LSB tags and overrides
    insserv: warning: script 'obmscheduler' missing LSB tags and overrides
    insserv: warning: script 'obmaua' missing LSB tags and overrides
    insserv: There is a loop between service stop-bootlogd and mountnfs if started
    insserv: loop involving service mountnfs at depth 8
    insserv: loop involving service nfs-common at depth 7
    insserv: Starting obmaua depends on stop-bootlogd and therefore on system facility `$all' which can not be true!
    insserv: Starting obmscheduler depends on stop-bootlogd and therefore on system facility `$all' which can not be true!
    [the two lines above repeat several dozen times; one repetition is interleaved with a stray "configured to not write apport reports" message]
    insserv: Max recursions depth 99 reached
    insserv: loop involving service tomcat6 at depth 9
    insserv: There is a loop between service stop-bootlogd and mountall if started
    insserv: loop involving service mountall at depth 4
    insserv: loop involving service checkfs at depth 3
    insserv: loop involving service mountnfs-bootclean at depth 10
    insserv: loop involving service networking at depth 6
    insserv: There is a loop between service stop-bootlogd and checkroot if started
    insserv: loop involving service checkroot at depth 5
    insserv: loop involving service hostname at depth 4
    insserv: loop involving service kbd at depth 12
    insserv: loop involving service module-init-tools at depth 6
    insserv: There is a loop between service stop-bootlogd and mountoverflowtmp if started
    insserv: loop involving service mountoverflowtmp at depth 9
    insserv: loop involving service mountall-bootclean at depth 8
    insserv: There is a loop at service obmaua if started
    insserv: There is a loop between service obmaua and ifupdown-clean if started
    insserv: loop involving service ifupdown-clean at depth 6
    insserv: There is a loop at service stop-bootlogd if started
    insserv: loop involving service obmaua at depth 1
    insserv: loop involving service mtab at depth 7
    insserv: exiting now without changing boot order!
    update-rc.d: error: insserv rejected the script header
    dpkg: error processing ntp (--configure):
     subprocess installed post-installation script returned error exit status 1
    Errors were encountered while processing:
     ntp
    localepurge: Disk space freed in /usr/share/locale: 0 KiB
    localepurge: Disk space freed in /usr/share/man: 0 KiB
    Total disk space freed by localepurge: 0 KiB
    E: Sub-process /usr/bin/dpkg returned an error code (1)
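    For what it's worth, the warnings point at the third-party obmaua/obmscheduler init scripts: they carry no LSB header, so insserv has to guess their dependencies and ends up detecting loops, which is what aborts the ntp postinst. A sketch of the header block insserv expects at the top of a script such as /etc/init.d/obmaua (the Provides name matches the script; the dependency and description values are assumptions that would need to match what the script actually does):

        ### BEGIN INIT INFO
        # Provides:          obmaua
        # Required-Start:    $remote_fs $syslog
        # Required-Stop:     $remote_fs $syslog
        # Default-Start:     2 3 4 5
        # Default-Stop:      0 1 6
        # Short-Description: obmaua service (third-party agent; description assumed)
        ### END INIT INFO

    Alternatively, if the scripts aren't needed, removing them from the boot order (update-rc.d -f obmaua remove, and likewise for obmscheduler) should let apt-get configure ntp cleanly.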

    Read the article

  • Using jQuery and OData to Insert a Database Record

    - by Stephen Walther
    In my previous blog entry, I explored two ways of inserting a database record using jQuery. We added a new Movie to the Movie database table by using a generic handler and by using a WCF service. In this blog entry, I want to take a brief look at how you can insert a database record using OData.

    Introduction to OData
    The Open Data Protocol (OData) was developed by Microsoft to be an open standard for communicating data across the Internet. Because the protocol is compatible with standards such as REST and JSON, it is particularly well suited for Ajax. OData has undergone several name changes: it was previously referred to as Astoria and ADO.NET Data Services. OData is used by SharePoint Server 2010, Azure Storage Services, Excel 2010, SQL Server 2008, and project code name "Dallas." Because OData is being adopted as the public interface of so many important Microsoft technologies, it is a good protocol to learn. You can learn more about OData by visiting the following websites:
    http://www.odata.org
    http://msdn.microsoft.com/en-us/data/bb931106.aspx
    When using the .NET framework, you can easily expose database data through the OData protocol by creating a WCF Data Service. In this blog entry, I will create a WCF Data Service that exposes the Movie database table.

    Create the Database and Data Model
    The MoviesDB database is a simple database that contains a single Movies table. You need to create a data model to represent the MoviesDB database. In this blog entry, I use the ADO.NET Entity Framework to create my data model. However, WCF Data Services and OData are not tied to any particular OR/M framework such as the ADO.NET Entity Framework. For details on creating the Entity Framework data model for the MoviesDB database, see the previous blog entry.

    Create a WCF Data Service
    You create a new WCF Data Service by selecting the menu option Project, Add New Item and selecting the WCF Data Service item template (Figure 1 – Adding a WCF Data Service). Name the new WCF Data Service MovieService.svc. Listing 1 contains the default code that you get when you create a new WCF Data Service. There are two things that you need to modify.

    Listing 1 – New WCF Data Service File

    using System;
    using System.Collections.Generic;
    using System.Data.Services;
    using System.Data.Services.Common;
    using System.Linq;
    using System.ServiceModel.Web;
    using System.Web;

    namespace WebApplication1
    {
        public class MovieService : DataService< /* TODO: put your data source class name here */ >
        {
            // This method is called only once to initialize service-wide policies.
            public static void InitializeService(DataServiceConfiguration config)
            {
                // TODO: set rules to indicate which entity sets and service operations are visible, updatable, etc.
                // Examples:
                // config.SetEntitySetAccessRule("MyEntityset", EntitySetRights.AllRead);
                // config.SetServiceOperationAccessRule("MyServiceOperation", ServiceOperationRights.All);
                config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
            }
        }
    }

    First, you need to replace the comment /* TODO: put your data source class name here */ with a class that represents the data that you want to expose from the service. In our case, we need to replace the comment with a reference to the MoviesDBEntities class generated by the Entity Framework. Next, you need to configure the security for the WCF Data Service. By default, you cannot query or modify the movie data. We need to update the Entity Set Access Rule to enable us to insert a new database record.

    The updated MovieService.svc is contained in Listing 2:

    Listing 2 – MovieService.svc

    using System.Data.Services;
    using System.Data.Services.Common;

    namespace WebApplication1
    {
        public class MovieService : DataService<MoviesDBEntities>
        {
            public static void InitializeService(DataServiceConfiguration config)
            {
                config.SetEntitySetAccessRule("Movies", EntitySetRights.AllWrite);
                config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
            }
        }
    }

    That's all we have to do. We can now insert a new Movie into the Movies database table by posting a new Movie to the following URL: /MovieService.svc/Movies. The request must be a POST request, and the Movie must be represented as JSON.

    Using jQuery with OData
    The HTML page in Listing 3 illustrates how you can use jQuery to insert a new Movie into the Movies database table using the OData protocol.

    Listing 3 – Default.htm

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head>
        <title>jQuery OData Insert</title>
        <script src="http://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.js" type="text/javascript"></script>
        <script src="Scripts/json2.js" type="text/javascript"></script>
    </head>
    <body>
        <form>
            <label>Title:</label>
            <input id="title" />
            <br />
            <label>Director:</label>
            <input id="director" />
        </form>
        <button id="btnAdd">Add Movie</button>

        <script type="text/javascript">
            $("#btnAdd").click(function () {
                // Convert the form into an object
                var data = {
                    Title: $("#title").val(),
                    Director: $("#director").val()
                };
                // JSONify the data
                var data = JSON.stringify(data);
                // Post it
                $.ajax({
                    type: "POST",
                    contentType: "application/json; charset=utf-8",
                    url: "MovieService.svc/Movies",
                    data: data,
                    dataType: "json",
                    success: insertCallback
                });
            });

            function insertCallback(result) {
                // unwrap result
                var newMovie = result["d"];
                // Show primary key
                alert("Movie added with primary key " + newMovie.Id);
            }
        </script>
    </body>
    </html>

    jQuery does not include a JSON serializer. Therefore, we need to include the JSON2 library to serialize the new Movie that we wish to create. The Movie is serialized by calling the JSON.stringify() method. You can download the JSON2 library from the following website: http://www.json.org/js.html

    The jQuery ajax() method is called to insert the new Movie. Notice that both the contentType and dataType are set to use JSON. The ajax() method performs a POST operation against the URL MovieService.svc/Movies. Because the POST payload contains a JSON representation of a new Movie, a new Movie is added to the database table of Movies. When the POST completes successfully, the insertCallback() method is called and the new Movie is passed to it. The method simply displays the primary key of the new Movie.

    Summary
    The OData protocol (and its enabling technology named WCF Data Services) works very nicely with Ajax. By creating a WCF Data Service, you can quickly expose your database data to an Ajax application by taking advantage of open standards such as REST, JSON, and OData. In the next blog entry, I want to take a closer look at how the OData protocol supports different methods of querying data.
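    To make the wire format concrete, here is the same insert expressed as a raw HTTP POST with curl (the host and the sample Title/Director values are assumptions; the service path matches Listing 2):

        curl -X POST "http://localhost/MovieService.svc/Movies" \
          -H "Content-Type: application/json" \
          -H "Accept: application/json" \
          -d '{"Title": "Movie Title", "Director": "Director Name"}'

    When the client asks for JSON, as both this call and the jQuery call do, the response wraps the new entity in a "d" object, which is exactly what the insertCallback() function above unwraps.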

    Read the article

  • Using JCA Adapter with OSB 11.1.1.3

    - by James Taylor
    In OSB 10g, to use the JCA adapters you were required to use JDeveloper to create the necessary WSDLs, XSDs, etc. using the associated adapter wizard. These files were imported into Oracle Workshop (Eclipse) and used to create the business service as you would any other web service. In 11g, unfortunately, JDeveloper is still required, but the process has changed slightly, as described below. As an example I have used the JCA DB adapter.

    1. Start JDeveloper 11.1.1.3.
    2. Create a new SOA Application.
    3. Create a new SOA Project and call it DBAdapters.
    4. Choose the Empty Composite template.
    5. Drag a Database Adapter component to the External References panel on the composite and provide a service name.
    6. Create a new database connection, or use an existing one.
    7. Take note of the JNDI name, e.g. eis/DB/MyConnection. This will be used to configure the DB connection in the WebLogic Console.
    8. In my example I use a stored procedure, but you can use whatever operation you require. Please refer to the User's Guide for Technology Adapters for other options.
    9. Select a schema and stored procedure. Once the procedure has been selected, accept the defaults and finish.
    10. Start up your OEPE version of Eclipse.
    11. Create a new Oracle Service Bus Configuration Project (you can use an existing project if you have one).
    12. Create a new Oracle Service Bus Project in the configuration project created above. Instead of importing the WSDL and XSD files, you import the jca file created in JDeveloper.
    13. In Eclipse, right-click the Oracle Service Bus Project and select Import –> Import. Choose File System and browse to the directory where JDeveloper stores its project.
    14. Select the jca, wsdl, and xsd files based on the service you created in step 5. Also check the 'Create selected folders only' radio button.
    15. When you import, you may see a little red x indicating the files are invalid. This is due to the location of the files: open the invalid files and fix the path in relation to where you store your files in the OSB project.
    16. Once the files are all valid, right-click the jca file and select Oracle Service Bus –> Generate Service. This will create a new Business Service.
    17. In the WebLogic Console, configure the JNDI name defined in step 7.
    18. You can now deploy your project and test.

    Read the article

  • Multiple vulnerabilities in Adobe Flashplayer

    - by chandan
    CVE ID          Description                             CVSSv2 Base Score
    CVE-2012-0724   Denial of Service (DoS) vulnerability   10.0
    CVE-2012-0725   Denial of Service (DoS) vulnerability   10.0
    CVE-2012-0768   Denial of Service (DoS) vulnerability   10.0
    CVE-2012-0769   Information disclosure vulnerability    5.0
    CVE-2012-0772   Denial of Service (DoS) vulnerability   10.0
    CVE-2012-0773   Denial of Service (DoS) vulnerability   10.0

    Component: Adobe Flashplayer
    Product and Resolution: Solaris 10 (SPARC: 125332-24, X86: 125333-23)

    This notification describes vulnerabilities fixed in third-party components that are included in Oracle's product distributions. Information about vulnerabilities affecting Oracle products can be found on the Oracle Critical Patch Updates and Security Alerts page.
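    A quick way to check whether the relevant fix is already applied on a Solaris 10 host (patch IDs taken from the table above):

        # Lists applied patches; no output means the SPARC patch is absent
        # (use 125333 instead of 125332 on x86)
        showrev -p | grep 125332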

    Read the article

  • Common SOA Problems by C2B2

    - by JuergenKress
    SOA stands for Service Oriented Architecture and has only really come together as a concrete approach in the last 15 years or so, although the concepts involved have been around for longer. Oracle SOA Suite is based around the Service Component Architecture (SCA), devised by the Open SOA collaboration of companies including Oracle and IBM. SCA, as used in SOA Suite, is designed as a way to crystallise the concepts of SOA into a standard which ensures that SOA principles, like the separation of application and business logic, are maintained.

    Orchestration or Integration?
    A common thing to see with many people who are beginning either to build a new SOA-based infrastructure or to move an old system to be service oriented is confusion about the purpose of SOA technologies like BPEL and enterprise service buses. For a lot of problems, orchestration tools like BPEL or integration tools like an ESB will both do the job and achieve the right objectives; however, it's important to remember that, although a hammer can be used to drive a screw into wood, that doesn't mean it's the best way to do it. Service integration is the act of connecting components together at a low level, which usually results in a single external endpoint for you to expose to your customers or other teams within your organisation – a simple product ordering system, for example, might integrate a stock checking service and a payment processing service. Process orchestration, however, is generally a higher-level approach whereby the (often externally exposed) service endpoints are brought together to track an end-to-end business process. This might take the earlier example of a product ordering service and couple it with a business rules service and a human task to handle edge cases. A good (but not exhaustive) rule of thumb is that integrations performed by an ESB will usually be real-time, whereas process orchestration in a SOA composite might comprise processes which take a certain amount of time to complete, or have to wait pending manual intervention.

    BPEL vs BPMN
    For some, with pre-existing SOA or business process projects, this decision is effectively already made. For those embarking on new projects it's certainly an important consideration for those using Oracle SOA software since, due to the components included in SOA Suite and BPM Suite, the choice of which to buy is determined by what they offer. Oracle SOA Suite has no BPMN engine, whereas BPM Suite has both a BPMN and a BPEL engine. SOA Suite has the ESB component "Mediator", whereas BPM Suite has none. Decisions must be made, therefore, on whether just one or both process modelling languages are to be used. The wrong decision could be costly further down the line. Design for performance: Read the complete article here.

    SOA & BPM Partner Community
    For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community. For registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.
    Technorati Tags: C2B2,SOA best practice,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • Constant CMS Session Expiry On 1&1 Cloud Server?

    - by leen3o
    I have a couple of 1&1's 'Dynamic Cloud Servers' and running Win2008R2 and they are setup as web servers, I have a number of Umbraco CMS installs on them and they have been running fine for over a year. On Saturday on BOTH servers, a very strange thing happened - As soon as I login to the CMS/Umbraco admin I am logged out with about 5 seconds? It's as if my session expires the moment I login? I have checked everything I can as I'm not really a server admin, and everything seems to be exactly as it was last week? Like I say this has happened EXACTLY the same time (Saturday) on TWO different servers? I'm just looking for ideas of what I should be looking for? Also the front end of the sites seem fine... Its only the backend when I login. I have gone to 1&1 about this, and as usual they have washed their hands saying its nothing to do with them - When I am certain it is. How can this happen on two different servers, and affect the same sites in exactly the same way? Any help, tips, things to try would be greatly appreciated.

    Read the article

  • Looking for suggestions for hosting Windows 2000 Server in the cloud / VPS / etc?

    - by JohnyD
    I have a Windows 2000 Server, currently virtualized in Hyper-V, that I would like to get running off-site as a backup (cloud, VPS, etc). You can't run your own virtualization in EC2, and I'm fairly certain there are no Server 2000 AMIs floating about (correct me if I'm wrong!). If anyone has a recommendation on how I can get a virtualized Windows 2000 Server running in a secure, remote environment, I would be grateful. As far as locations go, I'd be interested in North America as well as Australia and Europe. In a nutshell, we're ploughing our way out of a legacy codebase, and this server is the last that remains of the legacy apps. However, it is still very much used by our clients. Everything is backed up each night (data, images, etc) to tape, which is then taken offsite. However, in the event of a fire I would love to have a backup legacy server to point DNS records to, so that while I am rebuilding from the ashes our services would already be available. It would save a lot of time and make my managers all the more happy (and that's what it's all about, riighhtt? :D) Thank you all for your suggestions. Please let me know if I've left out any important information. Additional info: the legacy codebase does not function properly in Server 2003.

    Read the article

  • What's the best cloud backup solution for a small-scale server environment?

    - by nbv4
    I have a server that runs a postgres database containing about 200MB of data. Currently I have a cron job set up on my home computer which:

    - ssh's into my server
    - runs a remote script which makes a backup of the database
    - scp's that dump over to my local hard drive for storage (each dump is 20MiB)
    - does this every six hours (one month of backups is roughly 2GiB)

    The problem with this setup is that if my local machine goes down for whatever reason, no backups will be made. Also, I can't have the cron run from the server, because I can't have it scp'd to my local machine from my server (firewalls and all that crap). My local machine is running Ubuntu 10.04, and my server is Ubuntu 9.10 server edition. I looked into Ubuntu One, but currently it's GUI-only. I also looked into Dropbox, but it's a pain in the ass to get set up in Linux without GUI support. Amazon S3 looks good, but it's not free (though dirt cheap). Is there any other alternative that I should look into? I'd prefer something where I can just have my script dump the database into a directory and have the backup service 'watch' that folder and sync accordingly. I can maybe also have my local machine sync to the cloud backup so I have even more redundancy, plus easy access to my backups for use in testing. Edit: My server is a VPS, so whatever solution I end up using has to be 100% software based.
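    Not an answer from the thread, just a sketch of the "dump to a directory and let something sync it" idea, run entirely server-side with S3 as the off-site copy (the database name, user, bucket, and paths are assumptions; s3cmd must be installed and configured with your AWS keys):

        #!/bin/sh
        # Run from the server's own crontab, e.g.: 0 */6 * * * /usr/local/bin/db-backup.sh
        STAMP=$(date +%Y%m%d-%H%M)
        DUMP=/var/backups/mydb-$STAMP.sql.gz
        # Dump and compress the database (each dump is ~20MiB, per the question)
        pg_dump -U dbuser mydb | gzip > "$DUMP"
        # Push the dump off-site
        s3cmd put "$DUMP" s3://my-backup-bucket/
        # Keep roughly a month of local copies
        find /var/backups -name 'mydb-*.sql.gz' -mtime +30 -delete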

    Read the article

  • How to fix “The requested service, ‘net.pipe://localhost/SecurityTokenServiceApplication/appsts.svc’ could not be activated.”

    - by ybbest
    Problem: When I try to publish a SharePoint 2013 workflow, I receive the error: The requested service, 'net.pipe://localhost/SecurityTokenServiceApplication/appsts.svc' could not be activated. After that, my workflow stopped working, and every time I start a workflow I receive the following error message:

    System.ApplicationException: PreconditionFailed ---> System.ApplicationException: Error in the application.
    --- End of inner exception stack trace ---
    at System.Activities.Statements.Throw.Execute(CodeActivityContext context)
    at System.Activities.CodeActivity.InternalExecute(ActivityInstance instance, ActivityExecutor executor, BookmarkManager bookmarkManager)
    at System.Activities.Runtime.ActivityExecutor.ExecuteActivityWorkItem.ExecuteBody(ActivityExecutor executor, BookmarkManager bookmarkManager, Location resultLocation)

    Analysis: After some analysis, I found the error by visiting http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc; the error shown on that page appears in the screenshot (omitted here).

    Solution: The solution is basically to get more memory on the server. For a development environment, you can restart noderunner.exe or some other services to release some memory. To verify you have enough memory, browse to http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc; it should return the normal service information page (screenshot omitted). Then you can republish your workflow and it will work like a charm.

    Read the article

  • Why don't I have a "Web Service References" menu item in excel/VBA?

    - by Draemon
    I'm trying to consume a SOAP web service from Excel. According to this article (and confirmed by other articles and MSDN), if I do the following:

    - Install the Web Services Toolkit (I've installed v2.01)
    - Install SOAP Toolkit 3.0
    - Add a reference to the Microsoft Soap Type Library (I've tried v3.0 and an older one)

    I should get a "Web Service References" menu item in the Tools menu, but I don't. I've also tried adding every reference that seemed to have anything to do with SOAP or XML, but it hasn't helped. Any ideas?

    Read the article

  • SOA Suite 11g (11.1.1.3): Creating a single server domain with SOA Suite and Oracle Service Bus functionality

    - by Chris Tomkins
    One of the major benefits of the latest version of the Oracle SOA products, SOA Suite 11gR1 PS2 (11.1.1.3) and Oracle Service Bus 11gR1 (11.1.1.3), is that both products depend on the same version of WebLogic Server 11g (10.3.3) and so can be installed in the same domain. If you are a developer building artifacts for both, on a laptop/desktop with limited resources, this is particularly useful: instead of the default domain layout (an admin server plus multiple managed servers), you can create a single server domain which incorporates both sets of functionality. The viewlet below shows you how to go about creating such a single server domain using the configuration wizard. Just click the Play button on the viewlet menu to start it. Note: as you have added all this functionality into a single server, it will take a few minutes to start up. As this is my first foray into the world of viewlets, I'd be interested in your opinion as to whether you have found this useful, and any tips/techniques for improving any future viewlets I may create. Technorati Tags: soa suite,osb,viewlet,weblogic,domain
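    For reference, the wizard shown in the viewlet is the standard Fusion Middleware Configuration Wizard; a sketch of launching it on Linux (the Oracle home path is an assumption and varies by install):

        # From the SOA Suite Oracle home (a similar launcher ships under the OSB home)
        $MW_HOME/Oracle_SOA1/common/bin/config.sh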

    Read the article

  • Is the Ubuntu Advantage Service right for my non-profit?

    - by Robert
    My small 5-computer office currently runs on Ubuntu. Two of the desktops run Windows 7 in Sun VirtualBox software and are used for QuickBooks. I am going off to college, and I am looking for a paid tech support solution to replace my IT position. I have an approx. $300/mth budget, and I am willing to discuss higher rates. Everyone in the office is currently comfortable with regular desktop usage, but I am handling all of the software installation and updates. I was hoping to get a total support package for all of their tech-related questions, but I cannot find any services which will support Linux. Is the Ubuntu Advantage Service something which can take my place? They would mostly need network help, printer help, and an occasional software compatibility troubleshooting session. If this is not a solution, does anyone know of a tech support forum/hotline which would cover all of this? Thank you for reading.

    Read the article
