Search Results

Search found 9326 results on 374 pages for 'enterprise integration'.


  • Distributing a custom command line tool to enterprise servers

    - by Jeremy Baker
    I've been tasked with building a command line tool that we will be providing to our enterprise customers so that they can use the API to upload data to our platform. The API works with standard cURL requests, so I can do most of the basic functionality with simple bash scripting. However, I would like to provide something solid that really makes it easy for them to use, and I don't know what I don't know. It's been a good 8 years since I've really done any serious sysadmin work. Most of the good tools I use these days are written in Ruby or Python and have a standard distribution process (Gems, for example). However, I know RHEL and other platforms have their own package managers. Finally, the question: In today's day and age, what language / distribution method should I consider in order to cover the widest range of platforms without having to build completely different versions for each platform? I'd also love any general feedback you have about building similar projects, or links to projects that you think do a good job of this now and have open source code that I could read. Thanks in advance!
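    Since the API already speaks plain HTTP, the tool itself can stay small in almost any language. As a rough sketch of the kind of thing being described (the endpoint URL, auth header, and field names below are placeholders, not the actual platform API), here is a minimal Python client:

        #!/usr/bin/env python
        # Minimal upload-client sketch -- the endpoint and auth scheme are hypothetical.
        import argparse
        import sys

        import requests  # widely packaged as python-requests on RHEL/Debian

        API_URL = "https://api.example.com/v1/uploads"  # placeholder endpoint

        def upload(path, token):
            # Stream the file so large uploads are not read into memory.
            with open(path, "rb") as fh:
                resp = requests.post(
                    API_URL,
                    headers={"Authorization": "Bearer " + token},  # assumed auth scheme
                    data=fh,
                )
            resp.raise_for_status()
            return resp.text

        def main():
            parser = argparse.ArgumentParser(description="Upload a data file to the platform.")
            parser.add_argument("file")
            parser.add_argument("--token", required=True, help="API token")
            args = parser.parse_args()
            try:
                print(upload(args.file, args.token))
            except (IOError, requests.RequestException) as exc:
                sys.exit("upload failed: %s" % exc)

        if __name__ == "__main__":
            main()

    On the distribution side, a single-file script like this can be packaged as both an RPM and a .deb from the same source tree (tools such as fpm automate exactly that), which keeps one codebase covering the RHEL and Debian families.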

    Read the article

  • Oracle WebCenter: Common User Experience Architecture

    - by kellsey.ruppel(at)oracle.com
    You may remember that the key goals of the new release of WebCenter are providing a modern user experience, unparalleled application integration, convergence of the best of the existing portal platforms into WebCenter, and delivery of a Common User Experience Architecture. In previous weeks we've provided an overview of Oracle WebCenter and discussed some of the other key goals; this week, we'll focus on how the new release of Oracle WebCenter delivers a Common User Experience Architecture.

    When Oracle talks about a Common User Experience Architecture, it really focuses on a core set of areas. First, the way that information is accessed needs to be consistent and extensible so that as requirements change, the applications don't need to be rewritten for every change. Second, this information access layer needs to be securely accessible to any application, site, or other channel that needs to leverage this information. Third, there needs to be a consistent presentation layout (Oracle calls it a UI shell) so that all resources can fit together in a usable, productive way. Fourth, there needs to be a common set of design patterns for how different menus, features, and services fit into this UI shell for broad and productive usability. Fifth, there needs to be a set of design patterns for the individual services that plug into this UI shell so that end users can move from one module of the application to another without new learning. Finally, all of these layers need to be customizable in an easy way that insulates IT from patching and upgrading problems and gives the business owners the agility to change quickly with market conditions.

    As Oracle has already announced, we will release our next generation of enterprise applications, called Oracle Fusion Applications. We have thousands of developers building these applications, all with different programming tool experience and UI design experience. We've educated over 6,000 developers building Oracle Fusion Applications to leverage these Common User Experience Architecture patterns, speeding their learning curve on the new Java standards as well as SOA principles, to deliver a revolutionary new set of applications. You can imagine the big challenge of getting all these developers, with different backgrounds and different UI design skills, to deliver a completely integrated application user experience. This is why Oracle invested heavily in designing this Common User Experience Architecture, based on Oracle WebCenter and the Oracle Application Development Framework (ADF). It pulls together the best practices and design patterns that Oracle development required in order to bring Fusion Applications to market, and Oracle WebCenter is the user experience layer through which all of this is surfaced. In this way, customers can quickly brand a deployment for new partnerships without having to redevelop a new site. Or they can quickly add new options to the UI shell to enable their line-of-business managers to adapt quickly to a new competitive product. And with the core integration of activities to produce a Business Activity Stream, customers are able to stay on top of all their key business actions as they happen; more importantly, the system can recommend actions or resources to help act on these activities.

    We've authored this whole set of design patterns for Oracle development to take advantage of in delivering Fusion Applications. We're also applying these design patterns to our existing E-Business Suite, PeopleSoft, Siebel, and JD Edwards applications so that they can tie in exactly the same way that Fusion Applications has been brought together. This will provide customers with a complete Common User Experience Architecture for their entire ecosystem of applications within their enterprise, whether they are from Oracle, another vendor, or custom built. And this is all provided in the new release of Oracle WebCenter. These design patterns cover elements around delivering a complete, aggregated menu of all the capabilities that a user's role allows, independent of which application they are trying to access. It means that as users move from one application to another, they will have a consistent user experience. And if they are using an Oracle application, any customizations that are made to the application are preserved and managed through upgrades and patches.

    Be sure to check back this week as we share more information and resources on Oracle's Common User Experience Architecture.

    Read the article

  • Setting up Red Hat Enterprise Linux Server as a mail exchange server

    - by Syedur
    I am a Unix/Linux/Windows Server noob, so keep that in mind before you throw your stones at my glass house. :P I have a Windows Server 2008 R2 machine that's acting as the domain controller, Server A. It's also running a DNS server. I have a Red Hat Enterprise Linux Server 5.3 machine, Server B, that is intended to be the mail server. In order for mail delivery to happen, I understand that I have to set an MX record on Server A and point it to Server B. Well, I did. I manually added a host name on Server A and pointed it to Server B's IP address. Then I added an MX record and pointed it to the host name. That didn't do the trick. After taking the above steps, I used the "dig" command on Server B to look up the MX record coming back from Server A, and it wasn't what I was expecting. What am I doing wrong here? I have noticed that my Windows machines that are joined to the domain (Server A) are listed under the host names, while the machines that are not joined to the domain are not listed. This is fine; I am not worried about this. What does concern me: do I have to join Server B to the domain in order for Server A to recognize it as a valid host and serve the MX record properly? If so, some simple steps on how to join Server B to the domain would also help.
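    As a side note, the same MX check can be scripted for monitoring; a small sketch using the third-party dnspython package, querying Server A directly (the server IP and zone name below are placeholders):

        # Sketch: verify the MX record that Server A serves, using dnspython.
        import dns.resolver  # third-party package: pip install dnspython

        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = ["192.168.1.10"]  # placeholder: Server A's IP

        # resolve() is the dnspython 2.x name; older releases call it query().
        for rr in resolver.resolve("example.local", "MX"):
            print(rr.preference, rr.exchange)  # should print Server B's host name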

    Read the article

  • Network policy + WPA Enterprise (TKIP) on Windows 2008 R2

    - by Aceth
    Hi, I've attempted the following guide and I'm in a bit of a pickle: http://techblog.mirabito.net.au/?p=87 My main goal is to have username/password-based wireless authentication with Active Directory integration. I keep getting this error:

      Network Policy Server denied access to a user. Contact the Network Policy Server administrator for more information.

      User:
        Security ID: domain\rhysbeta
        Account Name: rhysbeta
        Account Domain: domain
        Fully Qualified Account Name: domain\rhysbeta

      Client Machine:
        Security ID: NULL SID
        Account Name: -
        Fully Qualified Account Name: -
        OS-Version: -
        Called Station Identifier: 00-12-BF-00-71-3C:wirelessname
        Calling Station Identifier: 00-23-76-5D-1E-31

      NAS:
        NAS IPv4 Address: 0.0.0.0
        NAS IPv6 Address: -
        NAS Identifier: -
        NAS Port-Type: Wireless - IEEE 802.11
        NAS Port: 2

      RADIUS Client:
        Client Friendly Name: Belkin54g
        Client IP Address: x.x.x.10

      Authentication Details:
        Connection Request Policy Name: Secure Wireless Connections
        Network Policy Name: Secure Wireless Connections
        Authentication Provider: Windows
        Authentication Server: srvr.example.com
        Authentication Type: EAP
        EAP Type: -
        Account Session Identifier: -

      Logging Results: Accounting information was written to the local log file.
      Reason Code: 22
      Reason: The client could not be authenticated because the Extensible Authentication Protocol (EAP) Type cannot be processed by the server.

    I would love to have it so that non-domain devices

    Read the article

  • Login with Enterprise Principal Name using sssd AD backend in Ubuntu 14.04 LTS

    - by Vinícius Ferrão
    I'm running sssd version 1.11 with the AD backend on Ubuntu 14.04 LTS (1.11.5-1ubuntu3) to authenticate users from Active Directory running on Windows Server 2012 R2, and I'm trying to achieve logins with the User Principal Name for all users of the domain. But the UPNs are always Enterprise Principal Names. Let me illustrate the problem with my user account:

      Domain: local.example.com
      sAMAccountName: ferrao
      UPN: [email protected] (there's no "local" in the UPN)

    I can successfully log in with the sAMAccountName attribute, which is fine, but I can't log in with [email protected], which is my UPN. The optimum solution for me is to allow logins from both the sAMAccountName and the UPN (User Principal Name). If that's not possible, the UPN should be the right way, instead of the sAMAccountName. Another annoyance is the homedir pattern with these options in sssd.conf:

      default_shell = /bin/bash
      fallback_homedir = /home/%d/%u

    What I would like to achieve is separate home directories derived from the EPN. For example:

      /home/example.com/user
      /home/whatever.example.com/user

    But with this pattern I can't map it the way I would like to. I've looked through the man pages and was unable to find any answers for these issues. Thanks,
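    For anyone comparing notes, these are the sssd.conf knobs in play; a sketch of the relevant fragment (the domain name is a placeholder, and whether krb5_use_enterprise_principal resolves this particular 1.11 issue is not confirmed here; it is the option that tells sssd to try enterprise/UPN-style principals against the KDC):

        # /etc/sssd/sssd.conf -- illustrative fragment only
        [domain/local.example.com]
        id_provider = ad
        access_provider = ad
        default_shell = /bin/bash
        # %d expands to the domain, %u to the user: /home/<domain>/<user>
        fallback_homedir = /home/%d/%u
        # Ask the KDC to accept enterprise (UPN-style) principals
        krb5_use_enterprise_principal = True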

    Read the article

  • Anonymous access to SMB share hosted on Server 2008 R2 Enterprise

    - by bwerks
    Hi all, First off, I have read through this post and a whole slew of non-SF posts which seem to address the same or a similar problem, but I was still unable to fix my problem. I've got three machines in this situation:

      - a domain-joined server running Server 2008 R2 Enterprise ("share server")
      - a domain-joined workstation running XP Pro SP3 ("workstation")
      - a domain-unjoined test server running Server 2003 R2 SP2 ("test server")

    The share server exposes a share on the network that the test server must access; it's a Source/Symbol Server share for our debugging purposes. I believe Visual Studio simply accesses the share with its own credentials in this case, meaning that the share must be accessible anonymously, since the test server isn't joined to the domain and there's no opportunity to supply domain authentication. I've attempted a lot of things to avoid the authentication window when accessing the share:

      - I've enabled the Guest account on the share server and given Guest full sharing/NTFS permissions for the share.
      - I've given ANONYMOUS LOGON full sharing/NTFS permissions for the share.
      - I've added my share to "Network access: Shares that can be accessed anonymously" in the local security policy (LSP).
      - I've disabled "Network access: Restrict anonymous access to Named Pipes and Shares" in LSP.
      - I've enabled "Network access: Let Everyone permissions apply to anonymous users" in LSP.
      - I've added ANONYMOUS LOGON to "Access this computer from the network" in LSP.
      - I've added the Guest account to "Access this computer from the network" in LSP.
      - I've attempted to provision the share using the Share and Storage Management MMC snap-in.

    Unfortunately, when I attempt to access the share from the test server, I still see the prompt and I'm forced to enter "Guest" manually. I also tried this workflow using the local administrator account on the workstation, and the same thing happens both with and without XP Simple File Sharing enabled. Any idea why I'm getting these results, or what I should have done differently?

    Read the article

  • Dell Driver Support for Latitude E6320 Windows 7 Enterprise

    - by IamPolaris
    I recently did a reinstall of Windows 7 Enterprise on a Dell Latitude E6320, which is a 64-bit system. After the install process, and doing the typical Windows Update stuff, I looked at my Device Manager and found several devices that were missing drivers. After going to the Dell support site, looking at the files, and doing some sleuthing, I found the following support document: http://downloads.dell.com/utility/Latitude%20E-Family%20%20Mobile%20Precision%20Re-Image%20How-To%20Guide%20-%20A03%20Rev%203%200.pdf This document hints in appendix C that the Broadcom USH is the Control Point Security device and the unknown device is the micro free-fall sensor. The network controller is my wireless, as I cannot connect wirelessly, and the final missing driver I am not sure about. Attempting to install the Control Point Security exe from the support page will not work: after downloading, I am given the message that I am attempting to install a 32-bit driver on a 64-bit machine, EVEN THOUGH I selected the Win7 64-bit option on the support page. Beyond that, some of the drivers (which are confusing to read and hard to understand) and the system utilities which are supposedly supposed to make this process simpler will either a) not run because they are 32-bit exes, or b) fail because the support page cannot find the file it attempted to download. Is there anything I can do to get (at the very least) my wireless running, but ideally all of my drivers? A solution which assumes Dell is completely incompetent would be ideal. :P Some forums have said that I should download the chipset driver; others say to get the system utility file (DSS_UTIL_WIN_R282536.EXE). I have had no luck as of yet...

    Read the article

  • SharePoint 2010 Enterprise wiki - [New page] missing

    - by icelava
    I am trying to ramp up my knowledge of SharePoint deployment and usage (I've never done either before), due to a direction to use SharePoint 2010 as a repository platform (wiki format) for our customer's infrastructure documentation. In my test virtual server, a new site was set up from the Enterprise Wiki template. I went into Site Actions > Manage Site Features to activate Wiki Page Home Page. The default sub-web then went from /Pages to /SitePages and looks like the default Team template. The odd thing is that Site Actions is missing the New Page option. My colleague does not understand why this is the case, as it ought to be there. The original /Pages sub-web does have the option. What conditions are in play that influence the appearance of that option? UPDATE: Another phenomenon observed: in the Site Actions > View All Site Content view, the wiki document libraries listed in the grid have their hyperlink (e.g. "Site Pages") lead straight to the default page. It does not show its own table listing of pages under that document library, unlike the original Pages document library, which expectedly shows up as a listing. I wonder if this hints at any problems.

    Read the article

  • Ubiquitous BIP

    - by Tim Dexter
    The last number I heard from Mike and the PM team was that BIP is now embedded in more than 40 Oracle products. That's a lot of products to keep track of and to help out with new releases, etc. It's interesting to see how internal Oracle product groups have integrated BIP into their products. Just as you might when integrating BIP, they have had to make a choice about how to integrate.

    1. Library level - BIP is a pure Java app, and at the bottom of the architecture is a group of Java libraries that expose APIs you can use. They fall into three main areas: data extraction, template processing and formatting, and delivery. There are post-processing capabilities, but those APIs are embedded within the template processing libraries. Taking this integration route, you are going to need to manage templates, data extraction, and processing yourself. You'll have your own UI to allow users to control all of this for themselves. Ultimate control, but some effort to build and maintain. I have been trawling some of the products during a coffee break. I found a great post on the reporting capabilities provided by BIP in the records management product within WebCenter Content 11g. That integration falls into this first category: the content manager looks after the report artifacts itself and provides you the UI to manage and run the reports.

    2. Web service level - further up the stack is the web service layer. This sits on the BI Publisher server as a set of services; runReport and scheduleReport are the main protagonists. However, you can also manage the reports, the (locally managed) users on the server, and the catalog itself via the services layer. Taking this route, you still need to provide the user interface to choose reports and run them, but the creation and management of the reports is all handled by the Publisher server. I have worked with a few customers on this approach. The web services provide the ability to retrieve a list of reports the user can access, then the parameters and LOVs for the selected report, and finally a service to submit the report on the server.

    3. Embedded BIP server UI - the final level is not so well supported yet. You can currently embed a report and its various levels of surrounding 'chrome' inside another HTML-based application using a URL. Check the docs here. The look and feel can be customized but, again, it's neither easy nor documented. I have messed with running the server pages inside an IFRAME: not bad, but not great. Taking this path should present the least amount of effort on your part to get BIP integrated, but there are a few gotchas you need to get around.

    So, a reasonable set of choices with varying amounts of effort involved. There is another option coming soon for all you ADF developers out there: the ability to drop a BIP report into your application pages. But that's for another post.
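    To make the web service level (2) concrete, here is a rough sketch of calling runReport over SOAP. The endpoint path, XML namespace, and element names are recalled from memory and must be checked against your server's PublicReportService WSDL; treat everything below as illustrative:

        # Sketch: invoke BI Publisher's runReport web service with a raw SOAP call.
        # Endpoint, namespace, and element names are assumptions -- verify via the WSDL.
        import requests

        ENDPOINT = "http://bipserver:9704/xmlpserver/services/PublicReportService"  # placeholder

        ENVELOPE = """<soapenv:Envelope
            xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
            xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
          <soapenv:Body>
            <pub:runReport>
              <pub:reportRequest>
                <pub:reportAbsolutePath>/HR/Salary/SalaryReport.xdo</pub:reportAbsolutePath>
                <pub:attributeFormat>pdf</pub:attributeFormat>
              </pub:reportRequest>
              <pub:userID>weblogic</pub:userID>
              <pub:password>secret</pub:password>
            </pub:runReport>
          </soapenv:Body>
        </soapenv:Envelope>"""

        resp = requests.post(ENDPOINT, data=ENVELOPE,
                             headers={"Content-Type": "text/xml; charset=utf-8"})
        resp.raise_for_status()
        # The report bytes come back base64-encoded inside the response body.
        print(resp.text[:300])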

    Read the article

  • Virtual Machine Network Services causes networking problems in Vista Enterprise 64 bit install

    - by Bill
    I have a quad-core/8GB Vista Enterprise 64-bit (SP2) installation on which I installed Virtual PC 2007. I have a problem that is the opposite of everything I found searching around the Internet: everybody else has problems making network connections from their guest VM. When Virtual Machine Network Services is enabled in the protocol stack for my network card across a reboot, it causes access problems to the network. The time to log in using a domain-credentialed account is upwards of 3 minutes, and after reaching the desktop the Network and Sharing Center shows that my connection to the domain is unauthenticated. Disabling and re-enabling Virtual Machine Network Services (uncheck in network properties/apply/recheck/apply) fixes the problem. And as long as I have VMNS disabled when I shut down, the restart runs smoothly. I just have to remember to enable it after login and disable it before shutdown. I have uninstalled and re-installed Virtual PC 2007 multiple times, with restarts in between. The install consists of SP1 plus a KB patch for a guest resolution fix. Any help would be greatly appreciated. Some additional information... At one point during my hair-pulling and teeth-gnashing with this, I tried to ping my primary DC and observed some weird responses (our DC is 10.10.10.25, my dynamic IP was 10.10.10.203):

      Reply from 10.10.10.203: Destination host unreachable.
      Request timed out.
      Reply from 10.10.10.25: ...

    This is not consistently repeatable, but I thought it might strike a chord with someone.

    Read the article

  • Enterprise Redirection Services?

    - by Aaron Alton
    This is probably a case of "if I knew what it was called, I could google it in 5 minutes", but I don't know what it's called, so it's probably best to explain the requirement using an example. We have a number of services (VPN, OWA, etc.) which we host from one of our datacenters. We have a number of datacenters, and we technically have the infrastructure already in place to support these services at several of them. To provide access to these services, I would create an external DNS entry (e.g. VPN.MyCompany.com pointing at the gateway IP of one of my DCs), and clients would connect to it via the DNS entry. Since I have multiple datacenters that can support this service, I could theoretically offer a "highly available, geographically dispersed" solution if I could point this DNS entry to some sort of third party who offers highly available "redirection" services. If my primary site goes down, I could just make a change via some management console and configure the redirector to point to a different DC. Of course, it would be fairly straightforward to set this sort of thing up on one of our own servers, but that would kinda defeat the purpose of a highly available third party. Is anyone familiar with a service like this? I'm thinking something like DynDNS, but with enterprise availability guarantees.
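    For what it's worth, the "change via some management console" step is also scriptable if the third party exposes an API; a rough sketch (the provider URL, auth header, and payload shape are entirely hypothetical):

        # Rough failover sketch: probe the primary gateway and, if it is down,
        # repoint the service record at a secondary DC through a DNS provider API.
        # The provider URL, auth header, and payload shape are hypothetical.
        import socket

        import requests

        PRIMARY = ("vpn-dc1.mycompany.com", 443)
        SECONDARY_IP = "203.0.113.50"  # documentation-range placeholder

        def port_open(host, port, timeout=5):
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        if not port_open(*PRIMARY):
            requests.put(
                "https://api.dns-provider.example/records/vpn.mycompany.com",  # hypothetical
                json={"type": "A", "value": SECONDARY_IP, "ttl": 60},
                headers={"Authorization": "Bearer TOKEN"},
            )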

    Read the article

  • Enterprise level ticketing and inventory system recommendations [closed]

    - by TrackingSystem
    My company is sort of at a standstill when it comes to our technician ticketing and inventory system. We currently use Numara Track-It!, which isn't cutting it, to say the least. Dell recommended KACE, but it's web-based, which is what we would like to avoid. We need a good ticketing and inventory system with the following:

      - Server/client setup
      - Client supports XP/Windows 7 Ent.
      - Web-based as well as client is a plus
      - Technician ticketing
      - Active Directory integration
      - Inventory system (asset tag tracking, PO tracking, etc.)
      - Exchange integration: when tickets are made, you have the option to send them to the requester
      - Something that will scale well

    Please, if anyone is a systems admin or has knowledge regarding the use of a great ticketing system, please let me know. We are a large international corporation; price honestly isn't an issue. Keep in mind this will be mainly used for technicians to create tickets, enter inventory (track PCs), and possibly even track purchase orders. We want an enterprise-level ticketing system with these capabilities. Please help! Thank you.

    Read the article

  • PPTP VPN on Server 2008 Enterprise

    - by Mike K
    I asked this question on Server Fault and was told that was not allowed, so I'm moving it here. I am running Windows Server 2008 Enterprise in my HOME network, inside VMware Workstation. I am running this on my home network to set up a PPTP VPN connection at home. I have correctly set up everything I needed to make it work, including opening everything required: TCP port 1723 and GRE (IP protocol 47). I am able to connect just fine, but when I connect I don't have internet unless I uncheck "use remote gateway". The thing is, I want to use the remote gateway to route all my traffic through that connection. Can someone tell me why this isn't working and how to get it to work? When I have remote gateway checked and I do an ipconfig, I don't get a remote gateway for the VPN connection; it's 0.0.0.0, whereas I'd assume that if connected properly it should be 192.168.1.254 (my AT&T home router). Also, if I can't get the remote gateway issue to work and I have to uncheck that box to get internet, does this mean my VPN session is no longer encrypted? I am fully aware that PPTP is the weakest VPN encryption out there, but still, having that extra layer of security when I'm on an unsecured wifi connection makes me feel a bit better. Thank you for all your help in advance. Someone told me I need to have a gateway or router configured on the server. If that's the case, how do I go about telling the remote co

    Read the article

  • Authlogic OpenID integration

    - by Craig
    I'm having difficulty getting OpenID authentication working with Authlogic. It appears that the problem arose with changes to the open_id_authentication plugin. From what I've read so far, one needs to switch from using gems to using plugins. Here's what I've done thus far to get Authlogic-OpenID integration working:

      - Removed the relevant gems: authlogic, authlogic-oid, rack-openid, ruby-openid *
      - Installed, configured, and started the Authlogic sample application (http://github.com/binarylogic/authlogic_example); works as expected. This required installing the authlogic (2.1.3) gem ($ sudo gem install authlogic) and adding a dependency (config.gem "authlogic") to the environment.rb file.
      - Added a migration to add OpenID support to the User model; ran the migration; columns added as expected
      - Made changes to the UsersController and UserSessionsController to use blocks to save each
      - Made changes to the new user-sessions view to support OpenID (f.text_field :openid_identifier)
      - Installed the open_id_authentication plugin ($ script/plugin install git://github.com/rails/open_id_authentication.git)
      - Installed the authlogic-oid plugin ($ script/plugin install git://github.com/binarylogic/authlogic_openid.git)
      - Installed the ruby-openid plugin ($ script/plugin install git://github.com/glebm/ruby-openid.git)
      - Restarted Mongrel (CTRL-C; $ script/server)

    Mongrel failed to start, returning the following error:

      /Library/Ruby/Site/1.8/rubygems/custom_require.rb:31:in `gem_original_require': no such file to load -- rack/openid (MissingSourceFile)
        from /Library/Ruby/Site/1.8/rubygems/custom_require.rb:31:in `require'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/dependencies.rb:521:in `new_constants_in'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/plugins/open_id_authentication/lib/open_id_authentication.rb:3
        from /Library/Ruby/Site/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
        from /Library/Ruby/Site/1.8/rubygems/custom_require.rb:31:in `require'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/dependencies.rb:521:in `new_constants_in'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/plugins/open_id_authentication/init.rb:5:in `evaluate_init_rb'
        from ./script/../config/../vendor/rails/railties/lib/rails/plugin.rb:146:in `evaluate_init_rb'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/core_ext/kernel/reporting.rb:11:in `silence_warnings'
        from ./script/../config/../vendor/rails/railties/lib/rails/plugin.rb:142:in `evaluate_init_rb'
        from ./script/../config/../vendor/rails/railties/lib/rails/plugin.rb:48:in `load'
        from ./script/../config/../vendor/rails/railties/lib/rails/plugin/loader.rb:38:in `load_plugins'
        from ./script/../config/../vendor/rails/railties/lib/rails/plugin/loader.rb:37:in `each'
        from ./script/../config/../vendor/rails/railties/lib/rails/plugin/loader.rb:37:in `load_plugins'
        from ./script/../config/../vendor/rails/railties/lib/initializer.rb:348:in `load_plugins'
        from ./script/../config/../vendor/rails/railties/lib/initializer.rb:163:in `process'
        from ./script/../config/../vendor/rails/railties/lib/initializer.rb:113:in `send'
        from ./script/../config/../vendor/rails/railties/lib/initializer.rb:113:in `run'
        from /Users/craibuc/NetBeansProjects/authlogic_example/config/environment.rb:13
        from /Library/Ruby/Site/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
        from /Library/Ruby/Site/1.8/rubygems/custom_require.rb:31:in `require'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/dependencies.rb:521:in `new_constants_in'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/activesupport/lib/active_support/dependencies.rb:156:in `require'
        from /Users/craibuc/NetBeansProjects/authlogic_example/vendor/rails/railties/lib/commands/server.rb:84
        from /Library/Ruby/Site/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
        from /Library/Ruby/Site/1.8/rubygems/custom_require.rb:31:in `require'
        from script/server:3

    I suspect this is related to the rack-openid gem, but as it was dependent upon the ruby-openid gem, it was removed when the ruby-openid gem was removed. Perhaps this can be installed as a plugin. Any assistance with this matter is greatly appreciated; I'm just about to give up on OpenID integration.

    * ruby-openid (2.1.2) is installed at /System/Library/Frameworks/Ruby.framework/Versions/1.8/usr/lib/ruby/gems/1.8. I'm not certain if this is affecting anything. In any case, I'm not sure how to uninstall it or if I should.

    ** edit ** It appears that there are a number of gems in the /Library/Ruby/Gems/1.8/gems directory that may be causing an issue: authlogic-oid (1.0.4), rack-openid (1.0.3), ruby-openid (2.1.7). Questions:

      - Why doesn't the gem list command list these gems?
      - Why doesn't the gem uninstall command remove these gems?

    Read the article

  • Enterprise Tape Backup solutions

    - by Tom O'Connor
    I'm currently attempting to re-architect a backup solution where I'm working. We've got 2 NAS devices, one in the office and one in the datacentre. The servers in the DC back up to the DC NAS, which is then replicated to the office NAS. The office NAS exports shares as CIFS and NFS; this bit is fine. At some point I'll have to expand our storage capacity; currently we've got about 1.4TB of storage space, which is about 96% full. Previously, the tape backup was a script that ran tar a few times and squirted data onto a tape. It worked, but was by no means a perfect solution: restores are a bit of a pest, and adding new data to the backup requires editing the script as root. It's just all a bit non-ideal. I've been evaluating a number of "enterprise"-ready backup solutions, such as Yosemite Backup from Barracuda, Acronis Backup/Restore, and something from Arkeia. In the process of evaluating these, I've found 2 big problems:

      1. Not all of them allow backup of mounted devices (such as an NFS-mounted NAS).
      2. Many of these applications don't like our tape device.

    For the most part, (1) is essential. Our NAS has a feeble processor and can't run applications like backup agents. I suspect that the biggest problem is the tape device, which is an HP C7438A DAT72 connected via USB. Questions:

      - Has anyone else got a USB DAT72 device working with similar software?
      - Is there a better way to back up data from an "appliance" NAS device on which you can't run an agent?
      - Would I be totally out of my mind to specify a cheap HP or Dell server with a couple of 1TB hard disks and a SAS card to then talk to an HP Ultrium (or similar) device? The biggest drawback to this would be cost (400ish for the server, 200 for the SAS connectivity, and 1700 for an LTO4 device).

    Notes: I'd love to be able to say that I'd get rid of tapes entirely and use some form of hard disk backup. In a previous job, we had LaCie USB drives, which were decidedly unreliable.

    Read the article

  • Oracle Consulting North America is now live on PeopleSoft Services Procurement and PeopleSoft Resource Management

    - by Howard Shaw
    Last month, Oracle's own internal consulting group (OCS North America) went live on PeopleSoft Services Procurement and PeopleSoft Resource Management to manage all aspects of identifying, recruiting, and deploying billable subcontractors on North America Applications customer consulting projects. The primary goals were to enhance the subcontractor staffing process, improve operational and informational processes, and improve collaboration between the Oracle NA Consulting Subcontractor Program and subcontractor suppliers. Over 200 registered external suppliers access the tool, review open needs, and competitively bid their resources to work on NA Applications projects. This implementation highlights the usage of Oracle's own solutions to streamline and enhance business operations, as the PeopleSoft 9.1 applications (Services Procurement and Resource Management) were deployed using Sun hardware, Oracle Enterprise Linux, and Oracle Virtual Machines. For more information, please navigate to the following web pages:

      - PeopleSoft Services Procurement
      - PeopleSoft Resource Management

    Read the article

  • Catch Up on Your Reading

    - by [email protected]
    AutoVue 20.0 was a major release which included many new features and enhancements. We eagerly shared the news with members of the media, who in turn wrote about AutoVue enterprise visualization in various online articles. Here is a summary of the articles featuring AutoVue 20.0. Happy reading!

      - "Oracle Unveils AutoVue 20.0" (Desktop Engineering; April 5, 2010)
      - "Oracle Upgrades Document Visualization Tool" (Managing Automation; April 5, 2010)
      - "Oracle's AutoVue 20.0 Enhances Visual Document Collaboration" (CMS Wire; April 6, 2010)
      - "Oracle Turns Attention to Project and Document Management" (Channel Insider; April 7, 2010)
      - "Oracle Unveils AutoVue 20.0" (Database Trends and Applications; April 7, 2010)

    Read the article

  • Architecture for interfacing multiple applications

    - by Erwin
    Let's say you have a master database and a few external/internal applications that use web services to interface data. What would be your preferred architecture for interfacing data to and from those applications? Would you put some sort of Enterprise Service Bus in between, like BizTalk? Or something cheaper? We don't want to block applications while they are interfacing, but we do want to use the return codes from the interfaces to determine whether we need to take some actions in the originating application or not.
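    As a sketch of that non-blocking requirement (the endpoint URL and the follow-up action are placeholders), the originating application can hand the interface call to a worker pool and react to the return code whenever it arrives:

        # Sketch: call an interface without blocking the caller, then act on the
        # return code. The endpoint URL and the compensating action are placeholders.
        from concurrent.futures import ThreadPoolExecutor

        import requests

        executor = ThreadPoolExecutor(max_workers=4)

        def push_to_interface(record):
            resp = requests.post("https://esb.example.com/orders", json=record)  # placeholder
            return resp.status_code

        def on_done(future):
            # Runs in a worker thread once the interface call completes.
            if future.result() != 200:
                print("interface rejected the record; flag it for retry here")

        future = executor.submit(push_to_interface, {"id": 42})
        future.add_done_callback(on_done)  # the caller continues immediately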

    Read the article

  • Is Ubuntu workable as a laptop for an IT consultant?

    - by Eric Wilson
    I work as a consultant programmer, typically in large businesses. I use a Windows laptop, and many of my colleagues use a Mac. My personal preference would be to run Ubuntu if I could have complete control over my development environment, but I will have occasional need for Microsoft-specific products, especially IE. My colleagues who use a Mac often run Windows in a virtual machine for these situations. My question is: is Ubuntu a workable solution for the laptop of an enterprise programmer? For example, is it as easy to run Windows in a VM on Ubuntu as it is on a Mac? Has anyone out there tried this? Is there any particular reason why Ubuntu would not serve as well as a Mac for development in this environment? Note that I am not doing .NET development, so I am typically dealing with Java that is going to run on an Apache server and be used by clients running Windows.

    Read the article

  • AutoVue Success at Siemens Energy!

    - by prasenjit.niyogi(at)oracle.com
    Siemens Improves Review and Collaboration with Visually Enabled Engineering Platform

    Siemens Energy Incorporated offers products, solutions, and services for the entire energy conversion chain, from power generation and transmission to distribution. The organization primarily serves energy utilities and industrial companies. Siemens faced challenges in the form of:

      - Long design review cycles and potential field service delays that stemmed from users' inability to digitally access, view, and collaborate on design documents for energy-related projects stored in SAP
      - High costs and IT administration complexity caused by multiple design visualization tools

    Learn how the customized integration of Oracle's AutoVue with SAP, thanks to Oracle partner Lifecycle Technology, significantly streamlined design review processes, improved productivity, and eliminated paper-based collaboration for the field service technicians and engineers. Read the complete snapshot here

    Read the article

  • Fujitsu and Oracle Move Closer Together

    - by A&C Redaktion
    It has just been announced that Oracle and Fujitsu have agreed to an even closer collaboration. Most exciting for partners: under the agreement with the Oracle PartnerNetwork, Fujitsu will now also act as a systems integrator and solution provider for the entire product portfolio. Beyond that, as expected, development of the SPARC Enterprise "M-Series" servers will be pushed forward. The new chassis bearing both companies' logos was already presented in December. The partners are now aiming for a fifteenfold performance increase within the next five years. Joint testing is also intended to optimize the interplay of both companies' software and hardware, above all in business-critical environments. That sounds like fresh momentum in a partnership of more than 20 years. Oracle CEO Larry Ellison is visibly pleased: "The partnership between Oracle and Fujitsu has never been stronger!" Further details on the cooperation are available here, in the joint press release from both companies.

    Read the article

  • Pros and cons of developing modern services in Java

    - by r3mus
    I'm interested in the philosophical and architectural justification (or lack thereof) for using Java for development in today's modern world (excluding mobile/embedded platforms, of course). Why would one choose to develop (or not develop) a back-end in Java? Why would one choose to develop (or not develop) a front-end UI in Java? Why do large enterprises lean toward developing in Java rather than adopting more modern (and standardized) technologies? *disclaimer: I'm not a fan of Java in the enterprise; I'm simply curious what drives enterprises to continue the trend.

    Read the article

  • What are options for 3rd Party Centralized Software Settings Management?

    - by Jeff Martin
    I am an architect in an enterprise looking to build a SaaS solution. Our products are distributed over many different deployable containers: web services, web UIs, etc. I am looking for an open-source or 3rd-party software solution to manage the settings of our applications. These would be similar to the settings you might find in Word or Eclipse or Visual Studio. The settings would control various behaviors and features of the product (probably not settings like which database to connect to, but more like whether to show line numbers on the page by default). Ideally, we would be able to store values for different dimensions (by tenant, by user, by application environment...). Because we have so many different deployables, I am looking for a centralized solution that can provide a web service from which each of the deployables can get its individual settings. Does anyone know of a centralized service providing this sort of feature, or can anyone give me some help in searching for an alternative to rolling our own?
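    To make the "dimensions" idea concrete, here is a sketch of the lookup such a service would perform, with the most specific scope winning (scope names and example values are illustrative only):

        # Sketch: resolve a setting across dimensions, most specific scope first.
        # The scope names and example values are illustrative only.
        SETTINGS = {
            ("show_line_numbers", "user:jeff"):   False,  # per-user override
            ("show_line_numbers", "tenant:acme"): True,   # per-tenant default
            ("show_line_numbers", "default"):     True,   # product default
        }

        def resolve(key, user, tenant):
            # Precedence: user setting > tenant setting > product default.
            for scope in ("user:" + user, "tenant:" + tenant, "default"):
                if (key, scope) in SETTINGS:
                    return SETTINGS[(key, scope)]
            raise KeyError(key)

        print(resolve("show_line_numbers", user="jeff", tenant="acme"))  # -> False

    A deployable would make the same query over HTTP against the central service (e.g. GET /settings/show_line_numbers?tenant=acme&user=jeff) instead of consulting a local dict.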

    Read the article

  • What tools and technologies must a Technical Java Architect have? [on hold]

    - by vicky
    I have more than eight years of experience with different Java tools, technologies, and domains. Currently I am employed as a Technical Java Architect. I have worked on mobile, web, web service, database, backend, and desktop applications during this period. Everything was OK until I got a few very good offers for enterprise architect, solutions architect, and Java architect roles, but each one required a different tool set. One required experience with the whole Apache stack and its technologies. So help me: which tools and technologies should a real Java Technical Architect have, so that I can equip myself with them? Thanks.

    Read the article

  • Tuesday at Oracle OpenWorld 2012 - Must See Session: “Jump-starting Integration Projects with Oracle AIA Foundation Pack”

    - by Lionel Dubreuil
    Don't miss the "CON8769 - Jump-starting Integration Projects with Oracle AIA Foundation Pack" session:

      Date: Tuesday, Oct 2
      Time: 1:15 PM - 2:15 PM
      Location: Marriott Marquis - Salon 7
      Speakers: Robert Wunderlich - Principal Product Manager, Oracle; Munazza Bukhari - Group Manager, AIA FP Product Management, Oracle

    The Oracle Application Integration Architecture Foundation Pack development lifecycle prescribes the best-practice methodology for developing integrations between applications. The lifecycle is supported by a toolset that focuses on architects and developers. Attend this session to understand how Oracle AIA Foundation Pack can jump-start integration project development and boost developer productivity. It demonstrates what the product does today and showcases new features such as support for building direct integrations. Objectives for this session:

      - Understand how to boost developer productivity
      - Hear about support for direct integrations
      - Learn what's new in Oracle AIA Foundation Pack

    Read the article
