Search Results

Search found 3707 results on 149 pages for 'secure'.

  • Why Wouldn't Root Be Able to Change a Zone's IP Address in Oracle Solaris 11?

    - by rickramsey
    You might assume that if you have root access to an Oracle Solaris zone, you'd be able to change the zone's IP address. If so, you'd proceed along these lines. First, you'd log in:

        root@global_zone:~# zlogin user-zone

    Then you'd remove the IP interface:

        root@user-zone:~# ipadm delete-ip vnic0

    Next, you'd create a new IP interface:

        root@user-zone:~# ipadm create-ip vnic0

    Then you'd assign the IP interface a new IP address (10.0.0.10):

        root@user-zone:~# ipadm create-addr -a local=10.0.0.10/24 vnic0/v4
        ipadm: cannot create address: Permission denied

    Why would that happen? Here are some potential reasons: You're in the wrong zone. Nobody bothered to tell you that you were fired last week. Or the sysadmin for the global zone (probably your ex-girlfriend) enabled link protection mode on the zone with this sweet little command:

        root@global_zone:~# dladm set-linkprop -p \
        protection=mac-nospoof,restricted,ip-nospoof vnic0

    How'd your ex-girlfriend learn to do that? By reading this article: Securing a Cloud-Based Data Center with Oracle Solaris 11, by Orgad Kimchi, Ron Larson, and Richard Friedman. When you build a private cloud, you need to protect sensitive data not only while it's in storage, but also during transmission between servers and clients, and while it's being used by an application. When a project is completed, the cloud must securely delete sensitive data and make sure the original data is kept secure. These are just some of the many security precautions a sysadmin needs to take to secure data in a cloud infrastructure. Orgad, Ron, and Richard explain the rest and show you how to employ the security features in Oracle Solaris 11 to protect your cloud infrastructure. This is part 2 of a three-part article on cloud deployments that use the Oracle Solaris Remote Lab as a case study. About the photograph: that's the fence separating a small group of tourist cabins from a pasture in the small town of Tropic, Utah.
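    If link protection is the culprit, a quick check and reset from the global zone would look roughly like this (a sketch that assumes the vnic0 link name used above; if ip-nospoof is to stay enabled, the zone's new address also has to be added to the allowed-ips property rather than clearing protection entirely):

        root@global_zone:~# dladm show-linkprop -p protection vnic0
        root@global_zone:~# dladm set-linkprop -p allowed-ips=10.0.0.10 vnic0
        root@global_zone:~# dladm reset-linkprop -p protection vnic0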

    Read the article

  • Oracle at Information Security and Risk Management Conference (ISACA Conferences)

    - by Tanu Sood
    The North America Information Security and Risk Management (ISRM) Conference hosted by ISACA will be held this year from November 14 - 16 in Las Vegas, Nevada, and Oracle is a Platinum Sponsor. The ISRM / IT GRC event is not only designed to meet the exact needs of information security, governance, compliance and risk management professionals like you, but also gives you the tools you need to solve the issues you currently face. The event builds on and includes the key elements of information security, governance, compliance and risk management practices, and offers a fresh perspective on current and future trends. As a Platinum Sponsor, Oracle will have an opportunity not only to demonstrate but to talk through our strategic roadmap and support, helping organizations understand our key role in keeping corporate data and information safe. Join us at the Lunch and Learn to hear about the latest advances in Oracle Identity Management. Lunch and Learn Session: Trends in Identity Management. Speaker: Mike Neuenschwander, Senior Product Development Director, Oracle Identity Management. As enterprises embrace mobile and social applications, security and audit have moved into the foreground. The way we work and connect with our customers is changing dramatically, and this means re-thinking how we secure the interaction and enable the experience. Work is an activity, not a place - mobile access enables employees to work from any device, anywhere and anytime. Organizations are utilizing "flash teams" - instead of a dedicated group to solve problems, organizations utilize more cross-functional teams. Work is now social - email collaboration will be replaced by dynamic social media style interaction. In this session, we will examine these three secular trends and discuss how organizations can secure the work experience and adapt audit controls to address the "new work order". We also recommend you bookmark the following session: T1 Session 301: Gone in 60 Seconds: Mitigating Database Security Risk, Friday, November 16, 8:30 am – 9:30 am. And do be sure to stop by our booths, #100 and #102, to not only network with our Product Development Team but also get an onsite demonstration of Oracle Security Solutions. See you there? ISRM / IT GRC, November 14 – 16, 2012, Mirage Casino-Hotel, 3400 Las Vegas Boulevard South, Las Vegas, NV 89109

    Read the article

  • Today @ OOW: Identity Management for the SoMoClo world

    - by B Shashikumar
    Today at OpenWorld, we have a very interesting lineup of Identity Management sessions that discuss how to extend identity management securely to cloud, mobile and social ecosystems. Here are 3 of the can't-miss identity management sessions today: Identity Management and the Cloud: Security is regularly identified as the #1 barrier to cloud service adoption. Oracle Identity Management is designed to help customers extend and connect core identity services to SaaS applications and systems. This session explores how organizations are using Oracle Identity Management with cloud services and how some customers are offering identity management as a cloud service. Real-time External Authorization for Applications, Middleware and Databases: Externalization of authorization is key to manageability and audit. This session covers enterprise-wide authorization solution deployment best practices and real-world examples of using Oracle Entitlements Server—the one-stop standards-compliant authorization solution—for middleware, applications, and data. Delivering Secure WiFi on the Tube as an Olympics Legacy from London 2012: In this session, Virgin Media, the U.K.'s first combined provider of broadband, TV, mobile, and home phone services, shares how it is providing free secure Wi-Fi services to the London Underground, using Oracle Virtual Directory and Oracle Entitlements Server, leveraging back-end legacy systems that were never designed to be externalized. As an Olympics 2012 legacy, the Oracle architecture will form a platform to be consumed by other Virgin Media services such as video on demand. Here is the complete lineup of Identity Management sessions today at OOW.

    Read the article

  • Oracle Linux Tips and Tricks: Using SSH

    - by Robert Chase
    Out of all of the utilities available to systems administrators, ssh is probably the most useful of them all. Not only does it allow you to log into systems securely, but it can also be used to copy files, tunnel IP traffic and run remote commands on distant servers. It's truly the Swiss army knife of systems administration. Secure Shell, also known as ssh, was developed in 1995 by Tatu Ylönen after the Helsinki University of Technology in Finland suffered a password-sniffing attack. Back then it was common to use tools like rcp, rsh, ftp and telnet to connect to systems and move files across the network. The main problem with these tools is that they provide no security and transmit data in plain text, including sensitive login credentials. SSH provides this security by encrypting all traffic transmitted over the wire to protect from password sniffing attacks.

    One of the more common use cases involving SSH is found when using scp. Secure Copy (scp) transmits data between hosts using SSH and allows you to easily copy all types of files. The syntax for the scp command is:

        scp /pathlocal/filenamelocal remoteuser@remotehost:/pathremote/filenameremote

    In the following simple example, I move a file named myfile from the system test1 to the system test2. I am prompted to provide valid user credentials for the remote host before the transfer will proceed. If I were only using ftp, this information would be unencrypted as it went across the wire. However, because scp uses SSH, my user credentials and the file and its contents are confidential and remain secure throughout the transfer.

        [user1@test1 ~]# scp /home/user1/myfile user1@test2:/home/user1
        user1@test2's password:
        myfile                                    100%    0     0.0KB/s   00:00

    You can also use ssh to send network traffic and utilize the encryption built into ssh to protect traffic over the wire. This is known as an ssh tunnel. In order to utilize this feature, the server that you intend to connect to (the remote system) must have TCP forwarding enabled within the sshd configuration. To enable TCP forwarding on the remote system, make sure AllowTcpForwarding is set to yes in the /etc/ssh/sshd_config file:

        AllowTcpForwarding yes

    Once you have this configured, you can connect to the server and set up a local port to which you can direct traffic that will go over the secure tunnel. The following command will set up a tunnel on port 8989 on your local system. You can then redirect a web browser to use this local port, allowing the traffic to go through the encrypted tunnel to the remote system. It is important to select a local port that is not being used by a service and is not restricted by firewall rules. In the following example the -D specifies local dynamic application-level port forwarding and the -N specifies not to execute a remote command.

        ssh -D 8989 [email protected] -N

    You can also forward specific ports on both the local and remote host. The following example will set up a port forward on port 8080 and forward it to port 80 on the remote machine.

        ssh -L 8080:farwebserver.com:80 [email protected]

    You can even run remote commands via ssh, which is quite useful for scripting or remote system administration tasks. The following example shows how to log in remotely and execute the command ls -la in the home directory of the machine. Because ssh encrypts the traffic, the login credentials and output of the command are completely protected while they travel over the wire.

        [rchase@test1 ~]$ ssh rchase@test2 'ls -la'
        rchase@test2's password:
        total 24
        drwx------  2 rchase rchase 4096 Sep  6 15:17 .
        drwxr-xr-x. 3 root   root   4096 Sep  6 15:16 ..
        -rw-------  1 rchase rchase   12 Sep  6 15:17 .bash_history
        -rw-r--r--  1 rchase rchase   18 Dec 20  2012 .bash_logout
        -rw-r--r--  1 rchase rchase  176 Dec 20  2012 .bash_profile
        -rw-r--r--  1 rchase rchase  124 Dec 20  2012 .bashrc

    You can execute any command contained in the quotation marks as long as you have permission with the user account that you are using to log in. This can be very powerful and useful for collecting information for reports, remote controlling systems and performing systems administration tasks using shell scripts. To make your shell scripts even more useful and to automate logins, you can use ssh keys for running commands remotely and securely without the need to enter a password. You can accomplish this with key-based authentication. The first step in setting up key-based authentication is to generate a public key on the system that you wish to log in from. In the following example you are generating an ssh key on a test system. In case you are wondering, this key was generated on a test VM that was destroyed after this article.

        [rchase@test1 .ssh]$ ssh-keygen -t rsa
        Generating public/private rsa key pair.
        Enter file in which to save the key (/home/rchase/.ssh/id_rsa):
        Enter passphrase (empty for no passphrase):
        Enter same passphrase again:
        Your identification has been saved in /home/rchase/.ssh/id_rsa.
        Your public key has been saved in /home/rchase/.ssh/id_rsa.pub.
        The key fingerprint is:
        7a:8e:86:ef:59:70:ef:43:b7:ee:33:03:6e:6f:69:e8 rchase@test1
        The key's randomart image is:
        +--[ RSA 2048]----+
        |                 |
        |  . .            |
        |   o .           |
        |    . o o        |
        |   o o oS+       |
        |  +   o.= =      |
        |   o ..o.+ =     |
        |    . .+. =      |
        |     ...Eo       |
        +-----------------+

    Now that you have the key generated on the local system, you should copy it to the target server into a temporary location. The user's home directory is fine for this.

        [rchase@test1 .ssh]$ scp id_rsa.pub rchase@test2:/home/rchase
        rchase@test2's password:
        id_rsa.pub

    Now that the file has been copied to the server, you need to append it to the authorized_keys file. This should be appended to the end of the file in the event that there are other authorized keys on the system.

        [rchase@test2 ~]$ cat id_rsa.pub >> .ssh/authorized_keys

    Once the process is complete you are ready to log in. Since you are using key-based authentication, you are not prompted for a password when logging into the system.

        [rchase@test1 ~]$ ssh test2
        Last login: Fri Sep  6 17:42:02 2013 from test1

    This makes it much easier to run remote commands. Here's an example of the remote command from earlier. With no password it's almost as if the command ran locally.

        [rchase@test1 ~]$ ssh test2 'ls -la'
        total 32
        drwx------  3 rchase rchase 4096 Sep  6 17:40 .
        drwxr-xr-x. 3 root   root   4096 Sep  6 15:16 ..
        -rw-------  1 rchase rchase   12 Sep  6 15:17 .bash_history
        -rw-r--r--  1 rchase rchase   18 Dec 20  2012 .bash_logout
        -rw-r--r--  1 rchase rchase  176 Dec 20  2012 .bash_profile
        -rw-r--r--  1 rchase rchase  124 Dec 20  2012 .bashrc

    As a security consideration, it's important to note the permissions of .ssh and the authorized_keys file: .ssh should be 700 and authorized_keys should be set to 600. This prevents unauthorized access to ssh keys from other users on the system.

    An even easier way to move keys back and forth is to use ssh-copy-id. Instead of copying the file and appending it manually to the authorized_keys file, ssh-copy-id does both steps at once for you. Here's an example of moving the same key using ssh-copy-id. The -i in the example is so that we can specify the path to the id file, which in this case is /home/rchase/.ssh/id_rsa.pub.

        [rchase@test1]$ ssh-copy-id -i /home/rchase/.ssh/id_rsa.pub rchase@test2

    One of the last tips that I will cover is the ssh config file. By using the ssh config file you can set up host aliases to make logins to hosts with odd ports or long hostnames much easier and simpler to remember. Here's an example entry in our .ssh/config file.

        Host dev1
            Hostname somereallylonghostname.somereallylongdomain.com
            Port 28372
            User somereallylongusername12345678

    Let's compare the login process between the two. Which would you want to type and remember?

        ssh somereallylongusername12345678@somereallylonghostname.somereallylongdomain.com -p 28372
        ssh dev1

    I hope you find these tips useful. There are a number of tools used by system administrators to streamline processes and simplify workflows, and whether you are new to Linux or a longtime user, I'm sure you will agree that SSH offers useful features that can be used every day. Send me your comments and let us know the ways you use SSH with Linux. If you have other tools you would like to see covered in a similar post, send in your suggestions.
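    For reference, the permission hardening mentioned above can be applied on the target host with two quick commands (a sketch; adjust the path if the home directory differs):

        [rchase@test2 ~]$ chmod 700 ~/.ssh
        [rchase@test2 ~]$ chmod 600 ~/.ssh/authorized_keys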

    Read the article

  • How can I pass referrer header from my https domain to http domains?

    - by nutcracker
    My website is 100% https. I have links to other http domains. The referrer header is not set when linking from a https page to a http page. From http://en.wikipedia.org/wiki/HTTP_referrer: "If a website is accessed from a HTTP Secure (HTTPS) connection and a link points to anywhere except another secure location, then the referer field is not sent." I would prefer that other domains can see the referrer so that they know that traffic comes from my domain. Is there a way to force this header or is there another solution? Update: I've done some basic testing using a redirect:

        http page  -- link to http  --> 301 redirect --> http page = referrer intact
        https page -- link to https --> 301 redirect --> http page = referrer blank
        https page -- link to http  --> 301 redirect --> http page = referrer blank
        https page -- link to http  --> 302 redirect --> http page = referrer blank

    The referrer is lost when linking from a https page to a http redirect page on my own domain. So there is no referrer on the redirect.
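    One option worth noting (an editorial aside, hedged: browser support varies and this mechanism was not widely available when the question was asked) is to declare an explicit referrer policy on the HTTPS pages, which asks the browser to keep sending the header even to plain-HTTP destinations; unsafe-url sends the full URL, while origin sends only the scheme and host:

        Referrer-Policy: unsafe-url

    or, as a per-page alternative:

        <meta name="referrer" content="unsafe-url">

    Bear in mind that sending full HTTPS URLs to HTTP sites leaks path and query data in the clear, which is exactly what the default behaviour is protecting against.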

    Read the article

  • Huge Opportunity in Small Things

    - by Tori Wieldt
    Addressing the strong demand for Java in the embedded market, Oracle is hosting a new Java Embedded @ JavaOne event in San Francisco October 3-4. The event allows decision makers to attend the Java Embedded @ JavaOne business-focused program, while their IT/development staff can attend the technically-focused JavaOne conference. [Obligatory comment about suits & ties vs. jeans & T-shirts removed.] The two-day event includes keynotes, sessions and demonstrations. In his keynote this morning, Judson Althoff, Senior Vice President of Worldwide Alliances and Channels and Embedded Sales at Oracle, explained: Devices are all around us - on 24x7, connected all the time. The explosion of devices is the next IT revolution. Java is the right solution for this space. Java embedded solutions provide a framework to provision, manage, and secure devices. Java embedded solutions also provide the ability to aggregate, process and analyze a multitude of data. Java is one platform to program them all. Terrance Barr, Java Evangelist and Java ME expert, is enthusiastic about the huge opportunity: "It's the right time and right place for Java Embedded," he said. "Oracle is looking for partners who want to take advantage of this next wave in IT." The embedded space continues to heat up. Today, Cinterion launched the EHS5, an ultra-compact, high-speed M2M communication module providing secure wireless connectivity for a wide variety of industrial applications. Last week, Oracle announced Oracle Java ME Embedded 3.2, a complete client Java runtime optimized for resource-constrained, connected, embedded systems; Oracle Java Wireless Client 3.2; Oracle Java ME Software Development Kit (SDK) 3.2; and Oracle Java Embedded Suite 7.0 for larger embedded devices. There is a huge opportunity in small things.

    Read the article

  • OpenWorld 2011 Video Index

    - by Chris Kawalek
    We did quite a few virtualization videos this year at Oracle OpenWorld 2011. You can find all these and more on our YouTube channel.

    Virtualization Wrapup - Adam Hawley discusses the Oracle virtualization presence at Oracle OpenWorld 2011. http://www.youtube.com/oraclevirtualization#p/f/2/53_SQYljqN4
    Oracle Applications on iPad - Brad Lackey shows how you can access Oracle Applications on iPad. http://www.youtube.com/oraclevirtualization#p/f/9/3Ug5km3uxEQ
    Thinkquest.org and Oracle VM - Dan Herrup describes how Thinkquest.org is using Oracle VM to help kids learn how to solve real world problems with computer technology. http://www.youtube.com/oraclevirtualization#p/f/6/Bw-km5kqzEo
    Avaya and Oracle Virtualization - See Oracle desktop virtualization in action at Avaya's booth. http://www.youtube.com/oraclevirtualization#p/f/4/xIHRIijEPkM
    Eco-Features of Sun Ray Clients - Michael Dann shows off the Sun Ray 3 Plus and talks about the eco benefits of Oracle's extremely low power consumption client device for desktop virtualization. http://www.youtube.com/oraclevirtualization#p/f/3/ulArHGe1OmM
    Application and Desktop Access with Oracle Secure Global Desktop - Watch Jeff Harvey do a quick demo of Oracle Secure Global Desktop accessing Oracle Applications. http://www.youtube.com/oraclevirtualization#p/f/5/g_ikA7dwh0g
    Oracle VM VirtualBox for VDI - Andy Hall describes how enterprises leverage Oracle VM VirtualBox as part of their VDI deployments. http://www.youtube.com/oraclevirtualization#p/f/8/WmkeYlzgnZ8
    TechCast Live: The Coolest Virtualization Products - Interview with Andy Hall about the desktop virtualization portfolio. http://www.youtube.com/oraclevirtualization#p/f/7/VMkrAhZ83AA

    Read the article

  • HP Pavilion 15 with AMD dual graphics - Ubuntu live environment not starting

    - by creepus
    I've had this laptop for about a day now and have decided to try Ubuntu on it and determine if I want to install it. I created a USB, it booted (Secure Boot was on; I tried with Secure Boot off to no effect), and then the problem occurred. The screen turned off for a second, turned back on to a black screen, shut off again and turned back on with a dialogue box telling me that the system had to use low-graphics mode. I clicked OK, selected low-graphics mode from the menu and clicked OK. The screen switched to the boot messages and did not go any further than this. Ctrl+Alt+Del did start rebooting the laptop though. I tried booting again, but this time I edited the boot options in GRUB to add nomodeset. This time, the laptop only booted to a black screen. Ctrl+Alt+F2 took me to a prompt, and I tried startx from there, but X didn't start, complaining that it wanted kernel mode setting back. I cannot seem to find any option to disable one graphics chip or the other in the UEFI setup menus. Laptop: HP Pavilion 15-E004AU. CPU: AMD A6-4400M APU with Radeon(tm) HD Graphics. Graphics chip: AMD Radeon HD 7520G + 8670M Dual Graphics. Ubuntu version: 13.10, 64-bit. Thanks. EDIT: I tried 12.04.3 LTS, and it managed to bring the desktop up. There are severe graphics glitches after about two minutes though.
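    For anyone retracing these steps: boot parameters can be added to the live session by pressing e at the boot menu and appending them to the line that starts with linux. A hedged example is below; radeon.modeset=0 is just one parameter commonly tried with AMD dual-graphics machines of this era, not a confirmed fix for this particular model, and the kernel line shown is illustrative:

        linux /casper/vmlinuz file=/cdrom/preseed/ubuntu.seed boot=casper quiet splash nomodeset radeon.modeset=0 ---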

    Read the article

  • You are or will be a laid off programmer - what do you do a year ago, right now, tomorrow, and next week?

    - by Adam Davis
    Many programmers, software engineers, and other technology professionals are out of work, facing layoffs, or are unprepared for layoffs though they feel secure right now. What should every programmer do right now (even if secure in their current job) to prepare for layoffs down the road? If your boss came to your cubicle while you read this and laid you off: What would you do immediately after? What would you do tomorrow? What would you do next week? It's obvious that one should always have an up-to-date resume, always get recommendations from people when they see you at your best (not when you're looking for a new job), etc. What are the things, step by step, that every programmer should do (or should consider doing) long before they are laid off, when they're laid off, and shortly after being laid off? This is a question with many possible facets. While I want to encourage discussion to center around programming-career-based answers, please reconsider before downvoting someone because they're thinking in terms of how they're going to avoid going into debt. Bonus catch-22 type question: You can study a new language or technology while out of work, but most places want you to have more than 1-2 months experience in a working environment, not just from a learning exercise. Is it worthwhile to place a priority on new (ideally in-demand) skills, or should you instead hone existing skills?

    Read the article

  • Identity Management: The New Olympic Sport

    - by Naresh Persaud
    How Virgin Media Lit Up the London Tube for the Olympics with Oracle. If you are at OpenWorld and have an interest in Identity Management, this promises to be an exciting session. Wed, October 3rd, Session CON3957: Delivering Secure Wi-Fi on the Tube as an Olympics Legacy from London 2012. Session Time: 11:45am-12:45pm. Session Location: Moscone West L3, Room 3003. Speakers: Perry Banton - IT Architect, Virgin Media; Ben Bulpett - Director, aurionPro SENA. In this session, Virgin Media, the U.K.'s first combined provider of broadband, TV, mobile, and home phone services, shares how it is providing free secure Wi-Fi services to the London Underground, using Oracle Virtual Directory and Oracle Entitlements Server, leveraging back-end legacy systems that were never designed to be externalized. As an Olympics 2012 legacy, the Oracle architecture will form a platform to be consumed by other Virgin Media services such as video on demand. Click here for more information.

    Read the article

  • Configuring permissions with Bastille

    - by Lucio
    I was using Bastille to improve the security of my OS and I came across the following question, which I don't know whether to answer YES or NO. Question: Would you like to set more restrictive permissions on the administration utilities? Explanation: In general, the default file permissions set by most vendors are fairly secure. To make them more secure, though, you can remove non-root user access to some administrator functions. If you choose this option, you'll be changing the permissions on some common system administration utilities so that they're not readable or executable by users other than root. These utilities (which include linuxconf, fsck, ipconfig, runlevel and portmap) are ones that most users could never have a need to access. This option will increase your system security, but there's a chance it will inconvenience your users. My users: When I installed Ubuntu I created a user (admin); later I was able to create another user (people), but I cannot change the permissions of this second user. Questions: The user I am using as admin is not the root user, right? Will the effects of this option apply to both users (admin & people) or just to people?
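    For context, answering YES makes Bastille do roughly the following for each of the utilities it lists (a simplified sketch of the effect, not Bastille's exact commands; /sbin/ifconfig is just an example path):

        sudo chmod o-rx /sbin/ifconfig    # non-root users can no longer read or execute it

    Since the change is applied per binary rather than per user, it affects every non-root account on the machine in the same way.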

    Read the article

  • The Importance of Fully Specifying a Problem

    - by Alan
    I had a customer call this week where we were provided a forced crashdump and asked to determine why the system was hung. Normally when you are looking at a hung system, you will find a lot of threads blocked on various locks, and most likely very little actually running on the system (unless it's threads spinning on busy-wait type locks). This vmcore showed none of that. In fact, we were seeing hundreds of threads actively on cpu in the second before the dump was forced. This prompted the question back to the customer: What exactly were you seeing that made you believe that the system was hung? It took a few days to get a response, but the response that I got back was that they were not able to ssh into the system, and when they tried to login to the console, they got the login prompt, but after typing "root" and hitting return, the console was no longer responsive. This description puts a whole new light on the "hang". You immediately start thinking "name services". Looking at the crashdump, yes, the sshds are all in door calls to nscd, and nscd is idle waiting on responses from the network. Looking at the connections, I see a lot of connections to the secure ldap port in CLOSE_WAIT, but more interestingly I am seeing a few connections over the non-secure ldap port to a different LDAP server just sitting open. My feeling at this point is that we have either a non-responding LDAP server, or one that is responding slowly, the resolution being to investigate that server. Moral: when you log a service ticket for a "system hang", it's great to get the forced crashdump first up, but it's even better to get a description of what you observed that made you believe the system was hung.
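    On a live system, the same kind of evidence can be gathered without a crashdump; a rough sketch (port 636 is secure LDAP, 389 is plain LDAP):

        netstat -an | grep 636    # LDAPS connections and their states (look for CLOSE_WAIT)
        netstat -an | grep 389    # plain LDAP connections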

    Read the article

  • Secunia Personal Software Inspector (PSI) 2.0

    - by TATWORTH
    Secunia Personal Software Inspector is now available in an updated version that is free for personal use. The home page says: "The Secunia PSI is a FREE security tool designed to detect vulnerable and out-dated programs and plug-ins which expose your PC to attacks. Attacks exploiting vulnerable programs and plug-ins are rarely blocked by traditional anti-virus and are therefore increasingly "popular" among criminals. The only solution to block these kind of attacks is to apply security updates, commonly referred to as patches. Patches are offered free-of-charge by most software vendors, however, finding all these patches is a tedious and time consuming task. Secunia PSI automates this and alerts you when your programs and plug-ins require updating to stay secure. Download the Secunia PSI now and secure your PC today - free-of-charge." I have used this for some time on my home PC and have found it to be very useful in identifying required updates. I use Google Chrome, but I found that whenever a new version is issued, the old version is not uninstalled. Secunia PSI helps me to locate the old versions and get rid of them.

    Read the article

  • OAuth2 vs Public API

    - by Adam Tannon
    My understanding of OAuth (2.0) is that it's a software stack and protocol to allow 2+ web apps to share information about a single end user. User A is a member of Site B and Site C; Site B wants to fetch some data from Site C about User A, and this is where OAuth steps in. So first off, if this assessment is incorrect, please begin by clarifying this for me and correcting me! Assuming I'm on the right track, then I guess I'm not seeing the need for OAuth to begin with (!). I'm sure I'm just not seeing the "forest through the trees" here, but the way I see it, couldn't Site C just expose a public API that Site B could use to fetch the same data (sans OAuth)? If Site C required user credentials to access the data, couldn't this public API just use HTTPS for secure transport and require username/password as a part of each API call? Again, I'm sure I'm missing something, but I'm just not understanding why I would need OAuth when a secure, public API written and exposed by Site C seems more than capable of delivering what Site B needs regarding User A. In general, I'm looking for a set of guidelines to go by when deciding between using OAuth for my web apps and just writing my own web service (exposing a public API). Thanks in advance!
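    To make the comparison concrete, here is a hedged sketch of the two styles (site-c.example, the path and the credentials are all made up):

        # Site B calls Site C's API directly with User A's own credentials over HTTPS
        curl -u usera:password https://site-c.example/api/userdata

        # The OAuth 2.0 style: User A authorizes Site B once, and Site B then presents a token
        curl -H "Authorization: Bearer ACCESS_TOKEN" https://site-c.example/api/userdata

    The practical difference is that in the first form Site B has to collect and store User A's password for Site C, whereas the token in the second form can be scoped to specific data and revoked without changing the password.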

    Read the article

  • windows 8 + Ubuntu dual boot

    - by Jack Yuan
    I installed Ubuntu 13.04 alongside Windows 8. Yes, I can access both of them, but the process is kind of long. In the BIOS, EFI is for Windows 8 and legacy support is for Ubuntu. If I choose EFI first, the startup just goes straight to Win8 without offering me a choice. If I choose legacy first, the startup will offer me a choice between Win8 and Ubuntu, but I can only choose Ubuntu. If I choose Win8, I get an error (file missing under configuration). That is to say, every time I want to switch to the other OS, I have to go into the BIOS and change the priority settings. I heard something about secure boot possibly being the cause of this situation. But the thing is that there is not even an option called "secure boot" in my BIOS, which means I cannot disable it. All I want is an option menu that appears every time I turn on my computer so I can easily choose what OS I want for today. Can anyone help me please? Thank you very much!!
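    For anyone debugging a similar mixed EFI/legacy setup, the firmware's view of the EFI boot entries can be inspected from Ubuntu with something like the following (a hedged sketch; it only applies if Ubuntu itself was installed in EFI mode, and the entry numbers shown are hypothetical):

        sudo efibootmgr -v              # list EFI boot entries and the current boot order
        sudo efibootmgr -o 0001,0000    # example: put the entry numbered 0001 first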

    Read the article

  • Interesting fault attempting to install Ubuntu 12.0.4.3 OR 13.10 on MSI GS70 2OD-001US

    - by cjaredrun
    Attempting to install Ubuntu on an MSI GS70 2OD-001US notebook pre-installed with Windows 8. For the record, this is what I have attempted. All cases were attempted with Ubuntu 12.04.3 AND 13.10 images via USB drives.

    Case 1 - Attempt no changes and simply boot from USB. Result: It seems to want to work initially. The Ubuntu logo pops up and the dots start moving. After a few seconds the dots freeze and this screen is shown: http://i.imgur.com/C7pyMRk.jpg

    Case 2 - Start playing with the BIOS: in Windows, turning off fast boot; in the BIOS, turning off Secure Boot; in the BIOS, turning off UEFI and selecting LEGACY. Result: Brought to a black screen with a - blinking at me.

    Case 3 - Alternating BIOS settings: in Windows, turning off fast boot; in the BIOS, turning off Secure Boot; in the BIOS, turning off UEFI and selecting UEFI with CSM. Result: Brought to a black screen, no other indication of change.

    It's pretty frustrating to feel locked out of a laptop that I have paid good money for; I'm sure I am not alone in that. It does however appear that there has been some success, so I am hopeful that someone might be able to help me. FYI: Dual booting is not necessary for me, if that helps at all. I'm not sure if this is a reasonable option, but if I completely wipe the HDDs clean, with no particle of Windows at all, would this still be a problem? Also, would opening up the laptop and replacing the HDDs with brand new ones be a solution as well? Thanks for any input or suggestions.

    Read the article

  • Approach to Authenticate Clients to TCP Server

    - by dab
    I'm writing a Server/Client application where clients will connect to the server. What I want to do is make sure that the client connecting to the server is actually using my protocol and that I can "trust" the data being sent from the client to the server. What I thought about doing is creating a sort of hash on the client's machine that follows a particular algorithm. What I did in a previous version was to take their IP address, the client version, and a few other attributes of the client and send it as a calculated hash to the server, who then took their IP and the version of the protocol the client claimed to be using, and calculated that number to see if they matched. This works ok until you get clients that connect from within a router environment where their internal IP is different from their external IP. My fix for this was to pass the client's internal IP, used to calculate this hash, along with the authentication protocol. My fear is that this approach is not secure enough, since I'm passing the data used to create the "auth hash". Here's an example of what I'm talking about:

        Client IP: 192.168.1.10, Version: 2.4.5.2
        hash = 2*4*5*1 * (1+9+2) * (1+6+8) * (1) * (1+0)

    The client connects to the server and sends: auth hash, IP, version. The server calculates from that info, and accepts or denies the hash. Before I go and come up with another algorithm to prove a client can provide data to a server (or use this existing algorithm), I was wondering if there are any existing, proven, and secure systems out there for generating a hash that both sides can generate with general knowledge. The server won't know about the client until the very first connection is established. The protocol's intent is to manage a network of clients who will be contributing data to the server periodically. New clients will be added simply by connecting the client to the server and "registering" with the server. So a client connects to the server for the first time, and registers their info (mac address or some other kind of unique computer identifier), then when they connect again, the server will recognize that client as a previous person and associate them with their data in the database.
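    As a concrete illustration of the "proven" direction (a hedged sketch, not a complete protocol): a keyed HMAC over a server-issued random challenge lets both sides compute the same value from a shared secret, without that secret, or anything derived only from guessable fields like the IP, ever travelling on the wire:

        # server generates and sends a one-time random challenge (nonce)
        openssl rand -hex 16

        # client returns HMAC-SHA256 over the nonce plus its identifier, keyed with the shared secret
        echo -n "<nonce><client_id>" | openssl dgst -sha256 -hmac "<shared_secret>"

    The server performs the same computation and compares the results; because the nonce changes every time, a captured value cannot be replayed.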

    Read the article

  • Making HTML5 videos stored on AWS S3 **difficult** to download (because I cant make it impossible)

    - by Jimmery
    I am building a website that hosts videos stored on AWS's S3 service. The videos are played through an HTML5 player we have built. I've just been asked to make sure "nobody can steal our videos". Now, I know that if you really don't want something stolen, don't put it up on the internet. However, I just need to secure these videos as well as possible; at the very least, the videos need to resist someone going through the source code and trying to download them manually. One option available to me is to completely rebuild the video player in Flash. This is not ideal, for several reasons, notably because I would also then have to build an app for mobile devices to be able to view this site. So I am looking for other options. I have heard about using a token to make the file available only during certain times. I have heard of using a separate file to serve the videos that sits between the HTML5 page and the video file. I am also having a look at IAM, the secure AWS access control, in the hopes AWS can solve this problem for me. Can anyone here recommend any of these options? Or perhaps suggest other options available to me? Any help would be greatly appreciated.
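    The "token that makes the file available only during certain times" is what S3 calls a pre-signed URL. A minimal sketch with the AWS CLI (the bucket and key names are made up):

        aws s3 presign s3://my-video-bucket/videos/intro.mp4 --expires-in 600

    The bucket itself stays private; the page or player is handed a short-lived URL generated server-side, so a link copied out of the page source stops working once it expires. It raises the bar rather than making downloads impossible, which matches the stated goal.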

    Read the article

  • Oracle ADF Mobile

    - by rituchhibber
    We are happy to announce that Oracle ADF Mobile is now available for our customers. Oracle ADF Mobile enables developers to build applications that install and run on both iOS and Android devices from one source code. Development is done with JDeveloper and ADF and leverages Java and HTML5 technologies, while keeping the same visual and declarative approach ADF is known for. Please click here to read more about the Oracle ADF Mobile release and learn more on our OTN page. Feature highlights:

    Java - Oracle brings a Java VM embedded with each application so you can develop all your business logic in the platform-neutral language you know and love! (Yes, even on iOS!)
    JDBC - Since we give you Java, we also provide JDBC along with a SQLite driver and engine that also supports encryption out of the box.
    Multi-Platform - Truly develop your application only once and deploy to multiple platforms. iOS and Android platforms are supported for both phone and tablet.
    Flexible - You can decide how to implement the UI: use an existing server-based UI framework like JSF, use your own favorite HTML5 framework like jQuery, or use our declarative HTML5 component set provided with the framework.
    Device Feature Access - You can get access to device features from either Java or JavaScript to invoke features like camera, GPS, email, SMS, contacts, etc.
    Secure - ADF Mobile provides integrated security that works with your server back-end as well. Whether you're using remote URLs, local HTML or AMX, you can secure any/all of your features with a single consistent login page. Since we also give you SQLite encryption, we are assured that your data is safe.
    Rapid - Using the same development techniques that ADF developers are already used to, you can quickly create mobile applications without ever learning another language! ADF Mobile XML, or AMX for short, provides all the normal input and layout controls you expect, and we also add charts/maps/gauges along with it to provide a very comprehensive set of UI controls. You can also mix and match any of the three for ultimate flexibility!

    Read the article

  • The Top Ten Security Top Ten Lists

    - by Troy Kitch
    As a marketer, we're always putting together the top 3, or 5 best, or an assortment of top ten lists. So instead of going that route, I've put together my top ten security top ten lists. These are not only for security practitioners, but also for the average Joe/Jane, because who isn't concerned about security these days? Now, there might not be ten for each one of these lists, but the title works best that way. Starting with my number ten (in no particular order):

    10. Top 10 Most Influential Security-Related Movies. Amrit Williams pulls together a great collection of security-related movies. He asks for comments on which one made you want to get into the business. I would have to say that my most influential movie(s), that made me want to get into the business of "stopping the bad guys", would have to be the James Bond series. I grew up on James Bond movies: thwarting the bad guy and saving the world. I recall being both ecstatic and worried when Silicon Valley-themed "A View to A Kill" hit theaters: "An investigation of a horse-racing scam leads 007 to a mad industrialist who plans to create a worldwide microchip monopoly by destroying California's Silicon Valley." Yikes!

    9. Top Ten Security Careers. From movies that got you into the career, here's a top 10 list of security-related careers. It starts with number ten, Information Security Analyst, and ends with number one, Malware Analyst. They point out the significant growth in security careers and indicate that "according to the Bureau of Labor Statistics, the field is expected to experience growth rates of 22% between 2010-2020." If you are interested in getting into the field, Oracle has many great opportunities all around the world.

    8. Top 125 Network Security Tools. A bit outside the range of 10, the top 125 Network Security Tools is an important list because it includes a prioritized list of key security tools practitioners are using in the hacking community, regardless of whether they are vendor-supplied or open source. The exhaustive list provides ratings, reviews, searching, and sorting.

    7. Top 10 Security Practices. I have to give a shout out to my alma mater, Cal Poly, SLO: Go Mustangs! They have compiled their list of top 10 practices for students and faculty to follow. Educational institutions are a common target of web-based attacks and miscellaneous errors, according to the 2014 Verizon Data Breach Investigations Report.

    6. (ISC)2 Top 10 Safe and Secure Online Tips for Parents. This list is arguably the most important list on my list. The tips were "gathered from (ISC)2 member volunteers who participate in the organization's Safe and Secure Online program, a worldwide initiative that brings top cyber security experts into schools to teach children ages 11-14 how to protect themselves in a cyber-connected world…If you are a parent, educator or organization that would like the Safe and Secure Online presentation delivered at your local school, or would like more information about the program, please visit here."

    5. Top Ten Data Breaches of the Past 12 Months. This type of list is always changing, so it's nice to have a current one here from Techradar.com. They've compiled and commented on the top breaches. It is likely that most readers here were affected in some way or another.

    4. Top Ten Security Comic Books. Although these are mostly physical security controls, I threw this one in for fun. My vote for #1 (not on the list) would be Professor X. The guy can breach confidentiality, integrity, and availability just by messing with your thoughts.

    3. The IOUG Data Security Survey's Top 10+ Threats to Organizations. The Independent Oracle Users Group annual survey on enterprise data security, Leaders Vs. Laggards, highlights what Oracle Database users deem as the top 12 threats to their organization. You can find a nice graph on page 9; Figure 7: Greatest Threats to Data Security.

    2. The Ten Most Common Database Security Vulnerabilities. Though I don't necessarily agree with all of the vulnerabilities in this order... I like a list that focuses on where two-thirds of your sensitive and regulated data resides (Source: IDC).

    1. OWASP Top Ten Project. The Open Web Application Security Project puts together its annual list of the 10 most critical web application security risks that organizations should be including in their overall security, business risk and compliance plans. In particular, SQL injection risks continue to rear their ugly head each year. Oracle Audit Vault and Database Firewall can help prevent SQL injection attacks and monitor database and system activity as a detective security control.

    Did I miss any?

    Read the article

  • How to find domain registrar and DNS hosting with good DNSSEC support?

    - by rsp
    Simplified problem: I want to buy a domain and make a website that is fully secured with DNSSEC.

    Background: I've been hearing about the insecurity of DNS for years. I've watched all of the talks by Dan Kaminsky and others, from DNS exploits to The Future of DNS Security Panel. I knew that using DNS without security is a disaster waiting to happen. I followed the development of the DNSSEC standard. I celebrated the key signing ceremony. Everything was on the right track to finally have a secure DNS system in place. And now, more than 2 years later, I wanted to just do what everyone said I should do: use DNSSEC for a new domain. So I need a domain registrar and a DNS hosting service that supports DNSSEC. Surprisingly, it is not that easy to even find out who does support DNSSEC. It was actually much easier to find info on DNSSEC two years ago, when everyone was going to support DNSSEC Real Soon Now, but years have passed and I hardly see any progress. I just hope that I was looking in the wrong places and someone here will clear up all of the doubts. I hope that other people who want to have a secure website will also find this question useful.

    What is needed: a registrar and DNS servers with full DNSSEC support for .com domains.

    What is not needed: IPv6 support, web hosting, anything more.

    What I found out so far:

    Go Daddy offers a Premium DNS service for an additional $36 per year that lets you "Secure up to 5 domains with DNSSEC".
    easyDNS has DNSSEC available in beta across all service levels (you need to enable the "beta" flag in configuration), but it doesn't seem to be production-ready, and judging from the lack of updates it isn't a feature of highest priority (the last update from March 2011 on the easyDNS blog).
    Name.com - according to The Register (US domain registrar does IPv6, DNSSEC), it has had DNSSEC support since 2010, but right now (October 2012) I couldn't find anything related to DNSSEC on their website.
    Dynadot, which is very often recommended, doesn't support DNSSEC.
    Namecheap, which is also often recommended, doesn't support DNSSEC. The support answer from 2011 suggested that it was being added, but in 2012 still no ETA is given to customers.
    DynDNS was supposed to support DNSSEC; I found a link explaining DNSSEC support, but it gives a 404 Not Found page and offers a search box - when searching for DNSSEC I get "No results were found for your query."
    GKG was recommended online for DNSSEC support, but it's hard to find any information on the level of DNSSEC support - there is a brief explanation of what DNSSEC is and how to sign Delegation Signer records in their FAQ, but no information about the level of actual support can be found.
    Ask Slashdot: Which Registrars Support DNSSEC? from July 2011 - answers list Go Daddy, DynDNS, GKG and Name.com as registrars that support DNSSEC, but: see above.

    Related questions:

    1. How to find web hosting that meets my requirements?
    2. What is needed to add DNSSEC to my site?
    3. DNS hosting better managed by Domain provider or Hosting provider?
    4. Registrar with good security, DNS hosting, and DNSSEC and IPv6 resolvers?

    In no. 1, no one ever mentions DNS at all. In no. 2, answers only mention the .se TLD; there are very few answers and they seem very outdated. In no. 3, one answer says "On projects that demand higher security, I might look for a web host that supports DNSSEC", but no more information is provided. The only relevant answers are in no. 4, where easyDNS is recommended by someone who has never used them personally. Meanwhile, as of October 2012, the support of DNSSEC is described as "in beta" on the easyDNS feature list. Another answer recommends SiteGround, but searching their site for DNSSEC returns no results. Other answers recommend web hosting providers that don't meet the requirement of DNSSEC support. Also, the question mentioned above lists 9 very specific requirements other than only DNSSEC (e.g. HTTP-only login cookies, two-factor authentication, no DNS record limits, DNS statistics of queries/day, audit trails, etc.), which might have excluded many possible recommendations if one is only interested in DNSSEC support.

    Conclusions: I thought that by the end of 2012 the support of DNSSEC among domain registrars and DNS providers would be nearly universal. I am shocked that the support seems virtually nonexistent. Is this a result of some serious problems with DNSSEC adoption? Or is it just not a hot topic and no one bothers anymore? According to the DNSSEC Scoreboard, roughly 0.1% of .com domains support DNSSEC. Could that be caused by the lack of DNSSEC support among registrars and DNS providers, is the information too hard to find, or does no one care? There isn't even a "dnssec" tag here.

    Questions: The information is surprisingly hard to find. That is why I am asking for first-hand experience and personal recommendations. Has anyone here actually set up a website with DNSSEC, from the domain registration to the configuration of DNS servers? Can anyone recommend any of the registrars mentioned above? Can anyone recommend any registrar not mentioned above?
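    Whatever provider ends up in the mix, whether a given domain actually serves signed answers can be checked with dig (a quick sketch; look for RRSIG records in the answer and, when querying through a validating resolver, the ad flag in the header):

        dig +dnssec example.com SOA
        dig +dnssec +multi example.com DNSKEY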

    Read the article

  • Demystified - BI in SharePoint 2010

    - by Sahil Malik
    Ad:: SharePoint 2007 Training in .NET 3.5 technologies (more information). Frequently, my clients ask me if there is a good guide on deciphering the seemingly daunting choice of products from Microsoft when it comes to business intelligence offerings in a SharePoint 2010 world. These are all described in detail in my book, but here is a one (well maybe two) page executive overview. Microsoft Excel: Yes, Microsoft Excel! Your favorite and most commonly used in the world database. No it isn’t a database in technical pure definitions, but this is the most commonly used ‘database’ in the world. You will find many business users craft up very compelling excel sheets with tonnes of logic inside them. Good for: Quick Ad-Hoc reports. Excel 64 bit allows the possibility of very large datasheets (Also see 32 bit vs 64 bit Office, and PowerPivot Add-In below). Audience: End business user can build such solutions. Related technologies: PowerPivot, Excel Services Microsoft Excel with PowerPivot Add-In: The powerpivot add-in is an extension to Excel that adds support for large-scale data. Think of this as Excel with the ability to deal with very large amounts of data. It has an in-memory data store as an option for Analysis services. Good for: Ad-hoc reporting and logic with very large amounts of data. Audience: End business user can build such solutions. Related technologies: Excel, and Excel Services Excel Services: Excel Services is a Microsoft SharePoint Server 2010 shared service that brings the power of Excel to SharePoint Server by providing server-side calculation and browser-based rendering of Excel workbooks. Thus, excel sheets can be created by end users, and published to SharePoint server – which are then rendered right through the browser in read-only or parameterized-read-only modes. They can also be accessed by other software via SOAP or REST based APIs. Good for: Sharing excel sheets with a larger number of people, while maintaining control/version control etc. Sharing logic embedded in excel sheets with other software across the organization via REST/SOAP interfaces Audience: End business users can build such solutions once your tech staff has setup excel services on a SharePoint server instance. Programmers can write software consuming functionality/complex formulae contained in your sheets. Related technologies: PerformancePoint Services, Excel, and PowerPivot. Visio Services: Visio Services is a shared service on the Microsoft SharePoint Server 2010 platform that allows users to share and view Visio diagrams that may or may not have data connected to them. Connected data can update these diagrams allowing a visual/graphical view into the data. The diagrams are viewable through the browser. They are rendered in silverlight, but will automatically down-convert to .png formats. Good for: Showing data as diagrams, live updating. Comes with a developer story. Audience: End business users can build such solutions once your tech staff has setup visio services on a SharePoint server instance. Developers can enhance the visualizations Related Technologies: Visio Services can be used to render workflow visualizations in SP2010 Reporting Services: SQL Server reporting services can integrate with SharePoint, allowing you to store reports and data sources in SharePoint document libraries, and render these reports and associated functionality such as subscriptions through a SharePoint site. 
In SharePoint 2010, you can also write reports against SharePoint lists (access services uses this technique). Good for: Showing complex reports running in a industry standard data store, such as SQL server. Audience: This is definitely developer land. Don’t expect end users to craft up reports, unless a report model has previously been published. Related Technologies: PerformancePoint Services PerformancePoint Services: PerformancePoint Services in SharePoint 2010 is now fully integrated with SharePoint, and comes with features that can either be used in the BI center site definition, or on their own as activated features in existing site collections. PerformancePoint services allows you to build reports and dashboards that target a variety of back-end datasources including: SQL Server reporting services, SQL Server analysis services, SharePoint lists, excel services, simple tables, etc. Using these you have the ability to create dashboards, scorecards/kpis, and simple reports. You can also create reports targeting hierarchical multidimensional data sources. The visual decomposition tree is a new report type that lets you quickly breakdown multi-dimensional data. Good for: Mostly everything :), except your wallet – it’s not free! But this is the most comprehensive offering. If you have SharePoint server, forget everything and go with performance point. Audience: Developers need to setup the back-end sources, manageability story. DBAs need to setup datawarehouses with cubes. Moderately sophisticated business users, or developers can craft up reports using dashboard designer which is a click-once App that deploys with PerformancePoint Related Technologies: Excel services, reporting services, etc.   Other relevant technologies to know about: Business Connectivity Services: Allows for consumption of external data in SharePoint as columns or external lists. This can be paired with one or more of the above BI offerings allowing insight into such data. Access Services: Allows the representation/publishing of an access database as a SharePoint 2010 site, leveraging many SharePoint features. Reporting services is used by Access services. Secure Store Service: The SP2010 Secure store service is a replacement for the SP2007 single sign on feature. This acts as a credential policeman providing credentials to various applications running with SharePoint. BCS, PerformancePoint Services, Excel Services, and many other apps use the SSS (Secure Store Service) for credential control. Comment on the article ....

    Read the article

  • Oracle Delivers Latest Release of Oracle Enterprise Manager 12c

    - by Scott McNeil
    Richer Service Catalog for Database and Middleware as a Service; Enhanced Database and Middleware Management Help Drive Enterprise-Scale Private Cloud Adoption

    News Summary
    IT organizations are adopting private clouds as a stepping-stone to business-driven, self-service IT. Successful implementations hinge on the ability to efficiently deploy and manage cloud services at enterprise scale. Having a complete cloud management solution integrated with an enterprise-class technology stack is a fundamental requirement for IT. Oracle Enterprise Manager 12c Release 4 meets that requirement by helping businesses become more agile and responsive, while reducing cost, complexity, and risk.

    News Facts
    Oracle Enterprise Manager 12c Release 4, available today, lets organizations rapidly adopt Oracle-based, enterprise-scale private clouds. New capabilities provide advanced technology stack management, secure database administration, and enterprise service governance, enabling Oracle customers and partners to maximize database and application performance and drive innovation using self-service IT platforms. The enhancements have been driven by customers and the growing Oracle Enterprise Manager Ecosystem, composed of more than 750 Oracle PartnerNetwork (OPN) Specialized partners. Oracle and its partners and customers have built over 140 plug-ins and connectors for Oracle Enterprise Manager. Watch the video highlights.

    Automation for Broader Cloud Services
    Oracle Enterprise Manager 12c Release 4 allows for rapid enterprise-wide adoption of database, middleware, and infrastructure services in the private cloud, driven by an enhanced API-enabled service catalog. The release features "push button" style provisioning of complete environments such as SOA and Oracle Active Data Guard, and fast data cloning that enables rapid deployment and testing of enterprise applications. Out-of-the-box capabilities to detect data and configuration vulnerabilities provide enhanced cloud service governance, along with greater operational control through a flexible and extensible showback mechanism.

    Enhanced Database Management
    A new performance warehouse enables predictive database diagnostics and trend analysis and helps identify database problems before they occur. New enterprise data-governance capabilities enhance security by helping systematically discover and protect sensitive data. Step-by-step orchestration of upgrades, with the ability to roll back changes, enables faster adoption of Oracle Database 12c.

    Expanded Fusion Middleware Management
    A new consolidated view of Oracle Fusion Middleware 12c deployments with a guided management capability lets administrators apply best management practices to diverse middleware environments and identify performance issues quickly. A Java VM Diagnostics as a Service feature allows governed access to diagnostics data for IT workers across multiple disciplines, accelerating DevOps resolution of defects and performance optimization. New automated provisioning for SOA lets middleware administrators perform mass SOA provisioning with ease.

    Superior Enterprise-Grade Management
    Private roles and preferred credentials have been added to Oracle Enterprise Manager to provide additional fine-grained security for organizations with complex access control requirements. A new security console provides a single point of control for managing the security of Oracle Enterprise Manager environments.
    Support for the latest industry-standard SNMP v3 protocol, including encryption, enables more secure heterogeneous management (a minimal SNMPv3 query is sketched after this excerpt). "Smart monitoring" adapts to observed environmental changes and adds self-management capabilities to help Oracle Enterprise Manager run at peak performance while demanding less IT supervision.

    Supporting Quotes
    "Lawrence Livermore National Laboratory has a strong tradition of technology breakthroughs and leadership. As a member of Oracle's Customer Advisory Board for Oracle Enterprise Manager, we have consistently provided feedback and guidance in the areas of enterprise-scale cloud, self-diagnosability, and secure administration for the product," said Tim Frazier, CIO, NIF and Photon Sciences, Lawrence Livermore National Laboratory. "We intend to take advantage of the Release 4 features that support enterprise-scale availability and fine-grained security capabilities for private cloud deployments."

    "IDC's most recent CloudTrack survey shows that most enterprises plan to adopt hybrid cloud architectures over the next three years," said Mary Johnston Turner, Research Vice President, Enterprise System Management Software, IDC. "These organizations plan to deploy a wide range of workloads into cloud environments, including mission-critical database and middleware services that require high levels of fault tolerance and disaster recovery. Such capabilities were traditionally custom-configured for each application, but cloud offers the possibility to incorporate such properties within the service definition, enabling organizations to adopt cloud without compromise. With the latest release of Oracle Enterprise Manager 12c, Oracle is providing customers with an out-of-the-box experience for delivering highly resilient cloud services for databases and applications."

    "Since its inception, Oracle has been leading the way in innovative, scalable, and high-performance solutions for the enterprise. With this release of Oracle Enterprise Manager, we are extending this leadership by providing enterprise-scale capabilities for planning, delivering, and managing private clouds. We call this 'zero-to-cloud, accelerated.' These enhancements help our customers expedite their adoption of cloud computing and prepare them for the next generation of self-service IT," said Prakash Ramamurthy, senior vice president of Systems and Cloud Management at Oracle.

    Supporting Resources
    Oracle Enterprise Manager 12c
    Video: Cerner Delivers High Performance Private Cloud
    Video: BIAS Achieves Outstanding Results with Private Cloud
    Press Release
    Stay Connected: Twitter | Facebook | YouTube | Linkedin | Newsletter
    Download the Oracle Enterprise Manager 12c Mobile app
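
    For readers unfamiliar with what SNMP v3 adds over v1/v2c, the sketch below uses the open-source pysnmp library (not an Oracle tool) to issue an authenticated and encrypted "authPriv" query. The agent host name, user, and passphrases are hypothetical; this only illustrates the protocol capability the release refers to, not Oracle Enterprise Manager's own implementation.

        from pysnmp.hlapi import (SnmpEngine, UsmUserData, UdpTransportTarget,
                                  ContextData, ObjectType, ObjectIdentity, getCmd,
                                  usmHMACSHAAuthProtocol, usmAesCfb128Protocol)

        # SNMPv3 "authPriv": SHA-based authentication plus AES-128 encryption.
        # Host, user, and passphrases are placeholders for illustration only.
        error_indication, error_status, error_index, var_binds = next(getCmd(
            SnmpEngine(),
            UsmUserData('monitorUser', 'auth-passphrase', 'priv-passphrase',
                        authProtocol=usmHMACSHAAuthProtocol,
                        privProtocol=usmAesCfb128Protocol),
            UdpTransportTarget(('managed-host.example.com', 161)),
            ContextData(),
            ObjectType(ObjectIdentity('SNMPv2-MIB', 'sysDescr', 0))))

        if error_indication:
            print(error_indication)          # e.g. timeout or authentication failure
        else:
            for name, value in var_binds:
                print(name.prettyPrint(), '=', value.prettyPrint())

    Compared with the community strings of SNMP v1/v2c, the user-based credentials and encryption shown here are what make v3 suitable for managing heterogeneous systems over untrusted networks.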

    Read the article

< Previous Page | 38 39 40 41 42 43 44 45 46 47 48 49  | Next Page >