Search Results

Search found 2451 results on 99 pages for 'guest additions'.

Page 81 of 99

  • Mobile Apps: An Ongoing Revolution

    - by Steve Walker
    a guest post from Suhas Uliyar, VP Mobile Strategy, Product Management, Oracle

    The rise of smartphone apps has proved transformational for businesses, increasing the productivity of employees while simultaneously creating some seriously cool end user experiences. But this is a revolution that is only just beginning. Over the next few years, apps will change everything about the way enterprises work as well as overhauling the experiences of customers.

    The spark for this revolution is simplicity. Simplicity has already proved important for the front end of apps, which are now often as compelling and intuitive as consumer apps. Businesses will encourage this trend, both to further increase employee productivity and to attract ‘digital natives’ (as employees and customers). With the variety of front-end development tools available already, this should be a simple mission for developers to accomplish – but front-end simplicity alone is not enough for the enterprise mobile revolution.

    Without the right content even the most user-friendly app is useless. Yet when it comes to integrating apps with ‘back-end’ systems to enable this content, developers often face a complex, costly and time-consuming task. Then there is security: how can developers strike a balance between complying with enterprise security policies and keeping the user experience simple? Complexity has acted as a brake on innovation, with integration and security compliance swallowing enterprise resources. This is why the simplification of integration, security and scalability is so important: it frees time and money for revolutionary innovation.

    The key is to put in place a complete and unified SOA integration platform that runs across the entire enterprise and enables organizations to easily integrate and connect applications across IT environments. The platform must also be capable of abstracting apps from the underlying OS and enabling a ‘write-once, run-anywhere’ capability for mobile devices - essential for BYOD environments and integrating third-party apps. Mobile Back-end-as-a-Service can also be very important in streamlining back-end integration. Mobile services offered through the cloud can simplify mobile application development with a standard approach to dealing with complex server-side programming and integration issues. This allows the business to innovate at its own pace while providing developers with a choice of tools to speed development and integration.

    Finally, there is security, which must be done in a way that encourages users to make the most of their mobile devices and applications. As mobile users, we want convenience, and that is why we generally approve of businesses that adopt BYOD policies. Enterprises can safely encourage BYOD because they can separate, protect, and wipe corporate applications by installing a secure ‘container’ around corporate applications on any mobile device. BYOD management also means users’ personal applications and data can be kept separate from the enterprise information – giving them the confidence they need to embrace the use of their devices for corporate apps.

    Enterprises that place mobility at the heart of what they do will fundamentally transform their businesses and leap ahead of the competition. As businesses take to mobile platforms that simplify integration, security and scalability, we will see a blossoming of innovation that will drive new levels of user convenience and create new ways of working that we are only beginning to imagine.

    Read the article

  • A "First" at Oracle OpenWorld

    - by Kathryn Perry
    A guest post by Adam May, Director, Fusion CRM, Oracle Applications Development

    There are always firsts at OpenWorld. These firsts keep the conference fresh and are the reason people come back year after year. An important first this year is our Fusion CRM customers who are using the product and deriving real benefit from Fusion CRM. Everyone can learn from and interact with them -- including us! We love talking to customers, especially those who are using our solutions in unexpected ways, because they challenge us!

    At previous OpenWorlds, we presented our overall Fusion vision and our plans for Fusion CRM. Those presentations helped customers plan their strategies and map out their new release uptakes. Fast forward to March of this year when the first Fusion CRM customer went live. Since then we've watched the pace of go-lives accelerate every single month. Now we're at the threshold of another OpenWorld -- with over 45,000 attendees, 2,500 sessions and LOTS of other activities. To avoid having our customers curl into a ball with sensory overload, we designed a Focus On Document to outline the most important Fusion CRM activities. Here are some of the highlights:

    - Anthony Lye's "Oracle Fusion Customer Relationship Management: Overview/Strategy/Customer Experiences/Roadmap" on Monday at 3:15 p.m.
    - The CRM Pavilion, open in Moscone West from Monday through Wednesday; features our strategic Fusion CRM partners and provides live demonstrations of their capabilities
    - General Session: "Oracle Fusion CRM--Improving Sales Effectiveness, Efficiency, and Ease of Use" on Tuesday at 11:45 a.m.; features Anthony Lye and Deloitte
    - "Meet the Fusion CRM Experts" on Tuesday at 5:00 p.m.; this session gives customers the opportunity to interact one-on-one with Fusion experts divided into eight categories of expertise
    - CRM Social Reception on Tuesday from 6-8 p.m.; there's no better way to spend the early evening than discussing Fusion CRM with Oracle experts and strategic partners over appetizers and drinks
    - Wednesday night is Oracle's Customer Appreciation event; enjoy Pearl Jam, Kings of Leon, etc. beginning at 7:30 p.m. at Treasure Island
    - Be sure to drink plenty of water before sleeping Wednesday night and don't stay out too late, because we have lots of great content on Thursday; at the top of the list is "Oracle Fusion Social CRM Strategy and Roadmap: Future of Collaboration and Social Engagement" at 11:15 a.m.

    We hope you have a fantastic experience at OpenWorld 2012! And here's a little video treat to whet your appetite: http://www.youtube.com/user/FusionAppsAtOracle

    Read the article

  • Reading a ZFS USB drive with Mac OS X Mountain Lion

    - by Karim Berrah
    The problem: I'm using a MacBook, mainly with Solaris 11, but sometimes with Mac OS X (ML). The only missing thing is that Mac OS X can't read my external ZFS based USB drive, where I store all my data. So, I decided to look for a solution.

    Possible solution: I decided to use VirtualBox with a Solaris 11 VM as a passthrough to my data. Here are the required steps:

    Installing a Solaris 11 VM
    - Install VirtualBox on your Mac OS X and add the extension pack (needed for USB).
    - Plug your ZFS based USB drive into your Mac; ignore it when asked to initialize it.
    - Create a VM for Solaris (bridged network), and before installing it, create a USB filter (in the settings of your VirtualBox VM, go to Ports, then USB, then add a new USB filter from the attached device -- the grey usb-connector logo with the green plus sign).
    - Install a Solaris 11 VM, boot it, and install the Guest Additions.
    - Check the IP address of your Solaris VM with "ifconfig -a".

    Creating a path to your ZFS USB drive
    - In Mac OS X, use the "Disk Utility" to unmount the USB attached drive, and unplug the USB device.
    - Switch back to VirtualBox and select the top of the window where your Solaris 11 VM is running.
    - Plug in your ZFS USB drive, and select "Ignore" if Mac OS invites you to initialize the disk.
    - In the VirtualBox VM menu, go to "Devices", then "USB Devices", and select your USB device from the drop-down menu.

    Connecting your Solaris VM to the USB drive
    - Inside Solaris, check that your device is accessible by using the "format" CLI command. If not, repeat the previous steps.
    - Now, with root privilege, force a "zpool import -f myusbdevicepoolname", because this pool was created on another system.
    - Check that you see your new pool with "zpool status".
    - Share your pool over NFS: share -F nfs /myusbdevicepoolname

    Accessing the USB ZFS drive from Mac OS X
    This is the easiest step: access an NFS share from Mac OS.
    - Create a "ZFSdrive" folder on your Mac OS desktop.
    - From a terminal under Mac OS: mount -t nfs IPaddressOfMySolarisVM:/myusbdevicepoolname /Users/yourusername/Desktop/ZFSdrive

    Et voilà! You can now access your data, on a ZFS USB drive, directly from your Mountain Lion Desktop. You can adjust the share rights in order to alter any read/write rights as needed, and you can enable compression or encryption inside the Solaris 11 VM ...
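    To make the flow concrete, here is a minimal command sketch of the import/share/mount sequence described above. The pool name and desktop path follow the post; the VM's IP address is a placeholder, and the -o resvport option is a common Mac OS X NFS workaround rather than something from the original write-up.

        # Inside the Solaris 11 VM (as root): import the pool that was created on another system
        zpool import -f myusbdevicepoolname
        zpool status myusbdevicepoolname
        # Share it over NFS
        share -F nfs /myusbdevicepoolname

        # On the Mac OS X host: mount the NFS export onto a desktop folder
        mkdir -p ~/Desktop/ZFSdrive
        sudo mount -t nfs -o resvport 192.168.1.50:/myusbdevicepoolname ~/Desktop/ZFSdrive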

    Read the article

  • Hadoop, NOSQL, and the Relational Model

    - by Phil Factor
    (Guest Editorial for the IT Pro/SysAdmin Newsletter)

    Whereas Relational Databases fit the world of commerce like a glove, it is useless to pretend that they are a perfect fit for all human endeavours. Although, with SQL Server, we’ve made great strides with indexing text, in processing spatial data and processing markup, there is still a problem in dealing efficiently with large volumes of ephemeral semi-structured data. Key-value stores such as Cassandra, Project Voldemort, and Riak are of great value for ephemeral data, and seem of equal value as a data-feed that provides aggregations to an RDBMS. However, the Document databases such as MongoDB and CouchDB are ideal for semi-structured data for which no fixed schema exists; analytics and logging are obvious examples.

    NoSQL products, such as MongoDB, tackle the semi-structured data problem with panache. MongoDB is designed with a simple document-oriented data model that scales horizontally across multiple servers. It doesn’t impose a schema, and relies on the application to enforce the data structure. This is another take on the old ‘EAV’ problem (where you don’t know in advance all the attributes of a particular entity). It uses a clever replica set design that allows automatic failover, and uses journaling for data durability. It allows indexing and ad-hoc querying.

    However, for SQL Server users, the obvious choice for handling semi-structured data is Apache Hadoop. There will soon be an ODBC Driver for Apache Hive and an Add-in for Excel. Additionally, there are now two Hadoop-based connectors for SQL Server: the Apache Hadoop connector for SQL Server 2008 R2, and the SQL Server Parallel Data Warehouse (PDW) connector. We can connect to Hadoop, process the semi-structured data, and then store it in SQL Server.

    For one steeped in the culture of Relational SQL Databases, I might be expected to throw up my hands in the air in a gesture of contempt for a technology that was, judging by the overblown journalism on the subject, about to make my own profession as archaic as the saggar maker's bottom knocker (a potter’s assistant who helped the saggar maker to make the bottom of the saggar by placing clay in a metal hoop and bashing it). However, on the contrary, I find that I'm delighted with the advances made by the NoSQL databases in the past few years. Having the flow of ideas from the NoSQL providers will knock any trace of complacency out of the providers of Relational Databases and inspire them into back-fitting some features, such as horizontal scaling with sharding and automatic failover, into SQL-based RDBMSs. It will do the breed a power of good to benefit from all this lateral thinking.
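    As a small illustration of the schemaless, application-enforced model described above, two differently shaped documents can sit in the same MongoDB collection and still be queried ad hoc. This is a hypothetical sketch: the database, collection, and field names are invented for the example, not taken from the editorial.

        # Two documents with different attributes in one collection (legacy mongo shell)
        mongo logs --eval 'db.events.insert({ type: "click",    page: "/home", ts: new Date() })'
        mongo logs --eval 'db.events.insert({ type: "purchase", amount: 25.0, currency: "USD" })'
        # Ad-hoc query over whichever fields happen to exist
        mongo logs --eval 'printjson(db.events.find({ type: "purchase" }).toArray())'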

    Read the article

  • Why would more CPU cores on virtual machine slow compile times?

    - by Sid
    [edit #2] If anyone from VMware can hit me up with a copy of VMware Fusion, I'd be more than happy to do the same as a VirtualBox vs VMware comparison. Somehow I suspect the VMware hypervisor will be better tuned for hyperthreading (see my answer too).

    I'm seeing something curious. As I increase the number of cores on my Windows 7 x64 virtual machine, the overall compile time increases instead of decreasing. Compiling is usually very well suited for parallel processing, as in the middle part (post dependency mapping) you can simply call a compiler instance on each of your .c/.cpp/.cs/whatever files to build partial objects for the linker to take over. So I would have imagined that compiling would actually scale very well with the number of cores. But what I'm seeing is:

    - 8 cores: 1.89 sec
    - 4 cores: 1.33 sec
    - 2 cores: 1.24 sec
    - 1 core: 1.15 sec

    Is this simply a design artifact due to a particular vendor's hypervisor implementation (type 2: VirtualBox in my case), or something more pervasive across other VMs that keeps hypervisor implementations simpler? With so many factors, I seem to be able to make arguments both for and against this behavior - so if someone knows more about this than me, I'd be curious to read your answer. Thanks, Sid

    [edit: addressing comments]
    - @MartinBeckett: Cold compiles were discarded.
    - @MonsterTruck: Couldn't find an open source project to compile directly. Would be great, but I can't screw up my dev env right now.
    - @Mr Lister, @philosodad: Have 8 hw threads, using VirtualBox, so it should be 1:1 mapping without emulation.
    - @Thorbjorn: I have 6.5 GB for the VM and a smallish VS2012 project - it's quite unlikely that I'm swapping in/out, thrashing the page file.
    - @All: If someone can point to an open source VS2010/VS2012 project, that might be a better community reference than my (proprietary) VS2012 project. Orchard and DNN seem to need environment tweaking to compile in VS2012. I really would like to see if someone with VMware Fusion also sees this (for VMware vs VirtualBox compartmentalization).

    Test details:
    - Hardware: MacBook Pro Retina; CPU: Core i7 @ 2.3 GHz (quad core, hyperthreaded = 8 cores in Windows Task Manager); Memory: 16 GB; Disk: 256 GB SSD
    - Host OS: Mac OS X 10.8
    - VM type: VirtualBox 4.1.18 (type 2 hypervisor)
    - Guest OS: Windows 7 x64 SP1
    - Compiler: VS2012 compiling a solution with 3 C# Azure projects
    - Compile times measured by a VS2012 plugin called 'VSCommands'
    - All tests run 5 times; the first 2 runs discarded, the last 3 averaged
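    For anyone wanting to reproduce the measurement, here is a rough sketch of varying the VM's core count from the Mac host with VirtualBox's CLI; the VM name and solution name are placeholders, and the build itself is rerun inside the guest.

        # On the OS X host: the VM must be powered off before changing its CPU count
        VBoxManage modifyvm "Win7-x64" --cpus 4
        VBoxManage startvm "Win7-x64"
        # Then, inside the Windows guest, rebuild the solution and note the time, e.g.:
        #   msbuild MySolution.sln /t:Rebuild /maxcpucount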

    Read the article

  • Parallel Port Problem in 12.04

    - by Frank Oberle
    I have a "dumb" printer attached to a parallel port in my machine which works fine under the "other" resident operating system (from Redmond) on the same machine. I recently added Ubuntu 12.04 as a dual boot on the machine, but Ubuntu doesn't seem to recognize the parallel port at all. All I need to set up a printer is a really plain-vanilla fixed pitch text-only generic driver, which is present, but no parallel ports show up. (The other printers, all on USB ports, seem to work just fine.)

    Following what appeared to me to be the most reasonable of the many conflicting pieces of advice on the web, here's what I did. I added the following lines to /etc/modules:

        parport_pc
        ppdev
        parport

    Then, after rebooting, I checked to see that the lines were still present, and they were. I ran dmesg | grep par and got the following references in the output that seemed like they might have to do with the parallel port:

        [ 14.169511] parport_pc 0000:03:07.0: PCI INT A -> GSI 21 (level, low) -> IRQ 21
        [ 14.169516] PCI parallel port detected: 9710:9805, I/O at 0xce00(0xcd00), IRQ 21
        [ 14.169577] parport0: PC-style at 0xce00 (0xcd00), irq 21, using FIFO [PCSPP,TRISTATE,COMPAT,ECP]
        [ 14.354254] lp0: using parport0 (interrupt-driven).
        [ 14.571358] ppdev: user-space parallel port driver
        [ 16.588304] type=1400 audit(1347226670.386:5): apparmor="STATUS" operation="profile_load" name="/usr/lib/cups/backend/cups-pdf" pid=964 comm="apparmor_parser"
        [ 16.588756] type=1400 audit(1347226670.386:6): apparmor="STATUS" operation="profile_load" name="/usr/sbin/cupsd" pid=964 comm="apparmor_parser"
        [ 16.673679] type=1400 audit(1347226670.470:7): apparmor="STATUS" operation="profile_load" name="/usr/lib/lightdm/lightdm/lightdm-guest-session-wrapper" pid=1010 comm="apparmor_parser"
        [ 16.675252] type=1400 audit(1347226670.470:8): apparmor="STATUS" operation="profile_load" name="/usr/lib/telepathy/mission-control-5" pid=1014 comm="apparmor_parser"
        [ 16.675716] type=1400 audit(1347226670.470:9): apparmor="STATUS" operation="profile_load" name="/usr/lib/telepathy/telepathy-*" pid=1014 comm="apparmor_parser"
        [ 16.676636] type=1400 audit(1347226670.474:10): apparmor="STATUS" operation="profile_replace" name="/usr/lib/cups/backend/cups-pdf" pid=1015 comm="apparmor_parser"
        [ 16.677124] type=1400 audit(1347226670.474:11): apparmor="STATUS" operation="profile_replace" name="/usr/sbin/cupsd" pid=1015 comm="apparmor_parser"
        [ 1545.725328] parport0: ppdev0 forgot to release port

    I have no idea what any of that means, but the line "parport0: ppdev0 forgot to release port" seems unusual. I was still unable to add a printer for my old clunker, so I tried the direct approach, typing

        echo "Hello" > /dev/lp0

    and received a Permission denied message. I then tried

        echo "Hello" > /dev/parport0

    which didn't give me any message at all, but still didn't print anything. Running the command sudo /usr/lib/cups/backend/parallel gives the following:

        direct parallel:/dev/lp0 "unknown" "LPT #1" "" ""

    Checking the permissions for /dev/parport0, Owner, Group, and Other are all set to read and write:

        crw-rw---- 1 root lp 6, 0 Sep 9 16:37 /dev/lp0
        crw-rw-rw- 1 root lp 99, 0 Sep 9 16:37 /dev/parport0

    The output of the command lpinfo -v includes the following line:

        direct parallel:/dev/lp0

    I've read several web postings that seem to suggest this has been a problem for several years, but the bug reports were closed because there wasn't enough information to address the issue (shades of Microsoft!). Any suggestions as to what I might be missing here?
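    One low-risk check worth noting (an editorial suggestion based on the permissions shown above, not something from the original post): /dev/lp0 is writable only by root and the lp group, so the user doing the printing needs to be in that group before the raw echo test can succeed.

        # Show current group membership
        groups $USER
        # Add the user to the lp group (takes effect after logging out and back in)
        sudo usermod -a -G lp $USER
        # Re-test writing straight to the line-printer device
        echo "Hello" | sudo tee /dev/lp0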

    Read the article

  • Managing Regulated Content in WebCenter: USDM and Oracle Offer a New Part 11 Compliant Solution for Life Sciences

    - by Michael Snow
    Guest post today provided by Oracle partner, USDM

    Regulated Content in WebCenter: USDM and Oracle offer a new Part 11 compliant solution for Life Sciences (White Paper)

    Life science customers now have the ability to take advantage of all of the benefits of Oracle’s WebCenter Content, a global leader in Enterprise Content Management. For the past year, USDM has been developing best practice compliance solutions to meet regulated content management requirements for 21 CFR Part 11 in WebCenter Content. USDM has been an expert in ECM for life sciences since 1999, and in 2011 it certified that WebCenter was a 21 CFR Part 11 compliant content management platform (White Paper). In addition, USDM has built Validation Accelerator Packs for WebCenter to enable life science organizations to quickly and cost effectively validate this world class solution.

    With the Part 11 certification, Oracle’s WebCenter now provides regulated life science organizations the ability to manage REGULATORY content in WebCenter, as well as the ability to take advantage of ALL of the additional functionality of WebCenter, including a complete, open, and integrated portfolio of portal, web experience management, content management and social networking technology. Here are a few screen shot examples of Part 11 functionality included in the product: E-Sign, E-Sign Rendor, Meta Data History, Audit Trail Report, and Access Reporting.

    Gone are the days that life science companies have to spend millions of dollars a year to implement, maintain, and validate ECM systems that no longer meet the ever changing business and regulatory requirements. Life science companies now have the ability to use WebCenter Content, an ECM system with a substantially lower cost of ownership and unsurpassed functionality. Oracle has been #1 in life sciences because of their ability to develop cost effective, easy-to-use, scalable solutions which help increase insight and efficiency to drive growth for their customers. Adding a world class ECM solution to this product portfolio allows life science organizations the chance to get rid of costly ECM systems that no longer meet their needs and use WebCenter, part of the Oracle Fusion Technology stack, with their other leading enterprise applications.

    USDM provides:
    - Expertise in Life Science ECM Business Processes
    - Prebuilt Life Science Configuration in WebCenter
    - Validation Accelerator Packs for WebCenter

    USDM is very proud to support Oracle’s expanding commitment to Life Sciences. For more information please contact: [email protected]

    Oracle will be exhibiting at DIA 2012 in Philadelphia on June 25-27. Stop by our booth (#2825) to learn more about the advantages of a centralized ECM strategy and see the Oracle WebCenter Content solution, our 21 CFR Part 11 compliant content management platform.

    Read the article

  • Oracle Virtualization Friday Spotlight - November 8, 2013

    - by Monica Kumar
    Hands-on Private Cloud Simulator in One Hour
    Submitted by: Doan Nguyen, Senior Principal Product Marketing Director

    My aeronautics instructor used to say, "you can’t appreciate flying until you take flight." To clarify, this is not about gearing up in a flying squirrel suit and hopping off a cliff (topic for another blog!) but rather about flying an airplane. The idea is to get hands-on with the controls in the cockpit and experience flight before you actually fly a real plane. After the initial 40 hours of flight time, the concept sank in and it really made sense.

    This concept is what inspired our technical experts to put together the hands-on lab for a private cloud deployment and management self-service model. Yes, we are comparing the lab to a flight simulator! Let’s look at the parallels:

    - To get trained to fly, starting in the simulator gets you off the ground quicker. There is no need to have a real plane to begin with. In a hands-on lab, there is no need for a real server with networking and real storage installed. All you need is your laptop.
    - The simulator is pre-configured, pre-flight check done. Similarly, in the hands-on lab, Oracle VM and Oracle Enterprise Manager are pre-configured and assembled using Oracle VM VirtualBox as the container. Software installations are not needed.
    - After time spent training at the controls, you can really appreciate the practical experience of flying. Along the same lines, the hands-on lab is a guided learning path, without the encumbrances of hardware and software installation, so you can learn about cloud deployment and management. However, unlike the simulator training, your time investment with the lab is only about an hour and not 40 hours!

    This hands-on lab takes you through private cloud deployment and management using Oracle VM and Oracle Enterprise Manager Cloud Control 12c in an Infrastructure as a Service (IaaS) model. You will first configure the IaaS cloud as the cloud administrator and then deploy guest virtual machines (VMs) as a self-service user. Then you are ready to take flight into the cloud! Why not step into the cockpit now!

    Read the article

  • Setting up Ubuntu on my mother's computer

    - by idealmachine
    Intended use
    My mother had an old Compaq desktop computer running Windows 98, which she used for occasional Web browsing and playing cards. The name of her card game is Hoyle Card Games 3. Although I had to repair it several times over the last 10 years, it worked fine until it finally died at the end of last year.

    Hardware specifications
    A relative brought up a newer computer soon afterward:
    - Operating system: Windows XP
    - Asus K8N motherboard (with broken on-board sound; getting a sound card)
    - Athlon 64? processor (don't remember the clock speed)
    - 512 MB RAM
    - Hope the graphics card works...
    - Replacement sound card will be one of: Ensoniq ES1370 AudioPCI, Diamond Monster Sound MX300 (Aureal chipset), or Sound Blaster Audigy 2 SE

    Peripherals
    - HP Scanjet 3400c scanner (USB connected)
    - HP LaserJet multi-function printer (parallel port connected, and printing works with a PCL driver)
    - Same serial mouse as old computer

    Question
    I had set up an SSH/VNC connection to allow for remotely working out problems. Or so I thought. A month later, the computer would not boot, rendering the SSH connection useless and an OS reinstall necessary. Unfortunately, I have neither the original Windows disc nor the product key. Unless I were to pay $200 for a full Windows 7 Home Premium license for my computer, I would not be able to re-install Windows XP on hers. I consider myself an advanced Linux user, having used Debian for years. So here are my questions. I have only one day to decide whether to use Ubuntu or buy Windows:
    - A quick search leads me to believe all the hardware listed above is supposed to work with Linux, but am I mistaken?
    - Would Ubuntu/Xubuntu suffice (specify which one if it matters), or would I be better off paying the $200 necessary for Windows XP?
    - Is the card game likely to run on Wine? I believe the minimum system requirement is Windows 95.
    - Failing Wine compatibility, will VirtualBox run fast enough on such a computer (Windows 98 as the guest OS)?
    - Are there any free card games just as good? She plays mainly Bridge, Poker, and Solitaire.
    - Is there any "Large Fonts" option for those with poor vision? The lack of it would be a big disadvantage.
    - BONUS: Although I would probably replace the old mouse upon a move to Ubuntu, is it even possible to get a serial mouse working?
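    If it helps the evaluation, here is a rough sketch of how the card-game question could be explored on a test install. The package names are from memory of that era's Ubuntu repositories and should be double-checked, and the installer filename is a placeholder.

        # Free card/solitaire games from the Ubuntu repositories
        sudo apt-get install gnome-games pysolfc
        # Install Wine and try the Hoyle installer from its CD
        sudo apt-get install wine
        wine /media/cdrom/setup.exe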

    Read the article

  • Your Cinnamon Roll & Morning Coffee: Powered by Oracle Enterprise Manager

    - by Ruma Sanyal
    Truth be told, as I was getting my morning coffee today, I was pondering the recent election results more than Oracle [there, I said it]. But then an email from Glen Hawkins from the Enterprise Management team hit my Inbox and I started viewing this video. It was about the world’s largest convenience store chain, 7-Eleven, focusing on creating the best Digital Guest Experience (DGE) for their customers. Turns out that Oracle Enterprise Manager (OEM) powers 7-Eleven’s DGE Middleware Platform as a Service solution that consists of Oracle SOA Suite, Exalogic, and Exadata. “We need to present a consistent view of 7-Eleven across all our endpoints: 10,000 stores & various digital entities like our websites and apps”, said Ronald Clanton, the DGE Program Director for 7-Eleven.

    As 7-Eleven was rolling out a loyalty program with mobile support across multiple geos, it had many complex business & technical requirements, including supporting a wide variety of different apps, 10M guests in NA alone, the ability to support high speed transactions, and very aggressive timelines. A key requirement was shortening the cycle for provisioning new environments. Whereas with other vendors this would take a few weeks, Oracle consulting showed them how with OEM provisioning new environments would take half a day, which was quite impressive. 7-Eleven has started to roll out this new program and is delighted to report that some provisioning cycles are as low as 10 minutes, which includes provisioning the full Oracle SOA Suite, Exalogic and more. They are delighted with OEM’s reporting capabilities and customization thereof. Watch the video to see for yourself.

    Read the article

  • Cloud Infrastructure has a new standard

    - by macoracle
    I have been working for more than two years now in the DMTF working group tasked with creating a Cloud Management standard. That work has culminated in the release today of the Cloud Infrastructure Management Interface (CIMI) version 1.0 by the DMTF. CIMI is a single interface that a cloud consumer can use to manage their cloud infrastructure in multiple clouds. As CIMI is adopted by the cloud vendors, no more will you need to adapt client code to each of the proprietary interfaces from these multiple vendors. Unlike a de facto standard, where typically one vendor has change control over the interface and everyone else has to reverse engineer the inner workings of it, CIMI is a de jure standard that is under change control of a standards body. One reason the standard took two years to create is that we factored in use cases, requirements and contributed APIs from multiple vendors. These vendors have products shipping today and as a result CIMI has a strong foundation in real world experience.

    What does CIMI allow? CIMI is both a model for the resources (computing, storage, networking) in the cloud as well as a RESTful protocol binding to HTTP. This means that to create a Machine (guest VM), for example, the client creates a “document” that represents the Machine resource and sends it to the server using HTTP. CIMI allows the resources to be encoded in either JavaScript Object Notation (JSON) or the eXtensible Markup Language (XML). CIMI provides a model for the resources that can be mapped to any existing cloud infrastructure offering on the market. There are some features in CIMI that may not be supported by every cloud, but CIMI also supports the discovery of which features are implemented. This means that you can still have a client that works across multiple clouds and is able to take full advantage of the features in each of them.

    Isn’t it too early for a standard? A key feature of a successful standard is that it allows for compatible extensions to occur within the core framework of the interface itself. CIMI’s feature discovery (through metadata) is used to convey to the client that additional features that may be vendor specific have been implemented. As multiple vendors implement such features, they become candidates to add to future versions of CIMI. Thus innovation can continue in the cloud space without being slowed down by a lowest common denominator type of specification. Since CIMI was developed in the open by dozens of stakeholders who are already implementing infrastructure clouds, I expect to see CIMI adopted by these same companies and others over the next year or two. Cloud customers who can see the benefit of this standard should start to ask their cloud vendors to show a CIMI implementation in their roadmap.

    For more information on CIMI and the DMTF's other cloud efforts, go to: http://dmtf.org/cloud
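    To make the "create a Machine by sending a document" idea concrete, here is a rough, hypothetical sketch of what a CIMI-style JSON request might look like. The endpoint URL, resource references, and field values are illustrative placeholders, not taken from the specification.

        # Hypothetical CIMI-style machine creation over HTTP (all values are placeholders)
        curl -X POST https://cloud.example.com/cimi/machines \
             -H "Content-Type: application/json" \
             -d '{
                   "resourceURI": "http://schemas.dmtf.org/cimi/1/MachineCreate",
                   "name": "web-server-01",
                   "description": "Guest VM created through the CIMI interface",
                   "machineTemplate": {
                     "machineConfig": { "href": "https://cloud.example.com/cimi/machine_configs/small" },
                     "machineImage":  { "href": "https://cloud.example.com/cimi/machine_images/ubuntu" }
                   }
                 }'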

    Read the article

  • The Connected Company: WebCenter Portal Activity Streams

    - by Michael Snow
    Guest post by Mitchell Palski, Oracle Staff Sales Consultant

    Social media is sure to have made its way into your company or government organization. Whether it's discussion threads, blog posts, Facebook-style profile pages, or just a simple Instant Messenger application, in one way or another your employees are connected. What are the objectives of leveraging social media in your organization?
    - Facilitating knowledge transfer
    - More effectively organizing team events
    - Generating inter-community discussions to solve problems
    - Improving resource management
    - Increasing organizational awareness
    - Creating an environment of accountability

    Do any of the business objectives above stand out to you as needs? If so, consider leveraging the WebCenter Portal Activity Stream as part of your solution. In WebCenter Portal, the Activity Stream feature provides a streaming view of the activities of your connections, actions taken in portals, and business activities that looks a lot like a combined Facebook and Twitter newsfeed. Activity Stream can note when a user:
    - Posts feedback (comments)
    - Uploads a document
    - Creates a new blog, page, event, or announcement
    - Starts a new discussion
    - Streams messages and attachments entered through WebCenter Publisher (similar to Twitter)

    Through Activity Stream Preferences, you can select which of these activities to show or hide from your personal Activity Stream. Here’s what you get:
    - A real-time stream of activities within a Portal or sub-Portal increases awareness across your organization or within a working group.
    - A complete list of user actions reduces the time-to-find for users that need to interact with the latest activities in your portal.
    - Users can publish to their groups when tasks are finished, for complete group traceability and accountability as well as improved resource management.
    - Project discussions and shared documents that require the expertise of someone outside of a working group now get increased visibility across your organization.

    There’s a reason that commercial social media tools like Facebook and Twitter have been so successful – they spread information in an aesthetically appealing and easy to read format. Strategically placing an Activity Feed within your Portal is analogous to sending your employees a daily newsletter, events calendar, recent documents report, and list of announcements – BUT ALL IN ONE!

    Read the article

  • "Expecting A Different Result?" (2 of 3 in 'No Customer Left Behind' Series)

    - by Kathryn Perry
    A guest post by David Vap, Group Vice President, Oracle Applications Product Development

    Many companies already have some type of customer experience initiative in process or one that could be framed as such. The challenge is that the initiatives too often are started in a department silo, don't have the right level of executive sponsorship, or have been initiated without the necessary insight and strategic business alignment. You can't keep doing the same things, give it a customer experience name, and expect a different result. You can't continue to just compete on price or features - that is not sustainable in commoditized markets. And ultimately, investing in technology alone doesn't solve customer experience problems; it just adds to the complexity of them.

    You need a customer experience strategy and approach on how to execute a customer-centric worldview within your business. To develop this, you must take an outside-in journey on how your customers are interacting with your business to establish a benchmark of your customers' experiences. Then you must get cross-functional alignment on what you are trying to achieve, near, mid, and long term. Your execution of that strategy should be based on a customer experience approach:

    - Understand your customer: You need to capture the insights across interactions, channels (including social), and personas to better understand whom to serve, how to serve them, and when to serve them. Not all experiences or customers are equal, so leverage this insight to understand the strategic business objectives you need to address. Then determine which experiences can be improved immediately and which over time to get the result you need.
    - Empower your ecosystem: You need to align your front-line employees with your strategy and give them the power, insight, and tools that allow them to cultivate a culture around strengthening the relationships with your customers. You also need to provide the transparency, access, and collaboration that enable your customers and partners to self serve and self solve and to share with ease.
    - Adapt your business: You need to enable the discipline of agility within your organization and infrastructure so that you can innovate, tailor, and personalize experiences. This needs to be done both reactively from insight and proactively in real time so you can stay ahead of shifting market trends and evolving consumer behaviors.

    No longer will the old approaches provide the same returns. To compete, differentiate, and win in a world where the customer has the power, you must execute a strategy that is sure to deliver a better brand experience for your customers.

    Note: This is Part 2 in a three-part series. Part 1 is here. Stop back for Part 3 on November 28.

    Read the article

  • Understanding When Social Interactions Should Be Resolved in Another Channel

    - by Christina McKeon
    Guest Blogger: Aphrodite Brinsmead, Senior Analyst at Ovum

    Agents need to respond to customers’ social comments and questions quickly and in the right tone. But more importantly, they need to offer resolutions. Customers care most about how long it takes to find information rather than which channel they are using. They choose to use social media because they are comfortable with the channel and it offers a convenient way to communicate. Ideally agents will resolve questions within social media, but they need guidance as to how and when to escalate interactions to a more private channel.

    First, businesses should assess the way in which customers are using social media to communicate with them and categorize posts into groups: complaints, feedback, technical queries or more general support questions. They should then consider the types of interactions that can easily be handled within social media and those that need to be followed up in another channel. This will be very dependent on the industry. Examples of queries that can be resolved in social media include:
    - Shipping pricing and timeframes
    - Outage updates and resolution plans
    - Flight status information
    - Product stock check
    - Technical support videos or forum posts
    - Availability of facilities

    Both customers and agents need to be educated about the types of questions they can expect to resolve within social media. As the channel matures as a customer service tool, it needs to have value other than just as a forum for complaints.

    Social customer service agents need the power to start a web chat or phone call
    Any questions where customers need to divulge personal details in order to get a resolution will need to be addressed in a private channel: a private social message, web chat, email or phone call. Customers should never disclose their date of birth, social security, credit card number, or healthcare records in a public forum. Flight issues, changes to a booking, billing queries or account updates will all need to be completed via a private interaction. Agents responding to questions on social media need the ability to start a web chat or phone call with the customer. The customer doesn’t want to have to repeat their question, and the agent should be empowered to connect customer records and access account or billing information. These agents will need to be trained across different channels and should be able to view all customer communications in one application. They also need to follow up questions that began on a public forum in the initial channel to make it clear that the issue was addressed. In order to make this possible, social media needs to be integrated as part of a broader customer service strategy.

    Irrespective of how many channels are used to complete an interaction, businesses should prioritize customer satisfaction and issue resolution. They need a clear strategy and trained agents that can handle and respond to social interactions. Follow me on Twitter @diteb.

    Read the article

  • password incorrect 3 times + suspected failed update

    - by Cheese
    I have been lurking your site for the past few hours, and have found myself in a bit of a pickle. Visiting my parents, I discovered that neither the computer nor the laptop works. Long story short, I've got the laptop working, but have completely fudged up the computer. I am a n00b, but I was at least willing to give it a go.

    The comp originally had Ubuntu 11.10 installed, later updated to 12.04. We have CDs for both. I do not understand what the initial problem was for my parents, but somehow when I turned on the computer, it worked for me. Soon after, I was nagged to install the latest updates. So, I spent the next half an hour wondering why the updates kept on asking for 11.04 cdroms, until I realised that you could turn off the cdrom requirement. After doing this via the console, I installed some of the smaller updates, before being told to do a partial update. This failed a few times, and ended up freezing whilst reinstalling drivers. After a hard restart I continued to type whatever I could find on the forum into the console.

    At some point, the console started saying that I had 3 incorrect password inputs, and sudo commands stopped altogether. I found another thread discussing this, but people kept on suggesting changing passwords (which I did, to no avail) or other things that made use of sudo (which I am locked out of, although I am technically the admin). I found myself somehow on the Ctrl+Alt+F1 console, and after being utterly confused (and Ctrl+Alt+F5 failing for me), another hard reset occurred. Somewhere along the way I created a USB startup disk for 14.04 (but this does not seem to work). Now I am left with an admin (and guest) account that logs in but shows only a blank screen (with just the desktop background), and I can't do anything in the console because I'm locked out. Interestingly, the console now says that I am running 14.04, although all updates said they had failed.

    Aside from the obvious lessons I have learnt (don't fiddle about in the console when you have no idea what you're doing - "Dog wearing safety glasses 'I have no idea what I am doing'" GIF would be inserted here), is there any way I can redeem this almighty muck up? A million thanks for any help!
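    Not part of the original question, but since sudo is locked out, one commonly suggested path is a root shell from recovery mode. A hedged sketch follows: the username is a placeholder, and the pam_tally2 step only applies if that PAM module is actually counting the failed attempts on this system.

        # From the GRUB menu choose "Advanced options" -> recovery mode -> root shell,
        # then remount the root filesystem read-write
        mount -o remount,rw /
        # Reset the account password and make sure the user is in the sudo group
        passwd parents_username
        adduser parents_username sudo
        # If failed-login counting is in use, clear it
        pam_tally2 --user=parents_username --reset
        reboot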

    Read the article

  • Increase Availability for Data Center Virtual Environments

    - by Antoinette O'Sullivan
    With Oracle VM, you can increase availability and add flexibility for data center virtual environments. To get started, take training on Oracle VM Server for x86 and Oracle VM Server for SPARC as appropriate for your systems. You can take these live instructor-led courses from your own desk as a live-virtual event or travel to an education center for an in-class event.

    The Oracle VM Administration: Oracle VM Server for x86 course, in 3 days, teaches you about creating NFS and iSCSI repositories, migration, cloning and exercising high availability. In-class events already on the schedule include (Location | Date | Delivery Language):

    - Zagreb, Croatia | 11 November 2013 | Croatian
    - Prague, Czech Republic | 21 October 2013 | Czech
    - Ballerup, Denmark | 26 August 2013 | English
    - Bordeaux, France | 18 September 2013 | French
    - Paris, France | 9 October 2013 | French
    - Strasbourg, France | 11 September 2013 | French
    - Hamburg, Germany | 30 September 2013 | German
    - Munich, Germany | 28 October 2013 | German
    - Budapest, Hungary | 9 September 2013 | Hungarian
    - Riga, Latvia | 30 September 2013 | Latvian
    - Oslo, Norway | 16 September 2013 | English
    - Warsaw, Poland | 28 October 2013 | Polish
    - Bucharest, Romania | 14 October 2013 | English
    - Istanbul, Turkey | 23 December 2013 | Turkish
    - Jakarta, Indonesia | 19 August 2013 | English
    - Canberra, Australia | 4 November 2013 | English
    - Melbourne, Australia | 6 November 2013 | English
    - Sydney, Australia | 25 November 2013 | English
    - San Francisco, CA, United States | 16 September 2013 | English
    - Roseville, MN, United States | 21 October 2013 | English
    - St Louis, MO, United States | 11 November 2013 | English
    - Reston, VA, United States | 31 July 2013 | English
    - Buenos Aires, Argentina | 21 August 2013 | Spanish

    The Oracle VM Server for SPARC: Installation and Configuration course, in 2 days, teaches you about configuring control and service domains, creating guest domains, using virtual disks and networks, and migration. In-class events already on the schedule include (Location | Date | Delivery Language):

    - Budapest, Hungary | 12 September 2013 | Hungarian
    - Prague, Czech Republic | 9 September 2013 | Czech
    - Colombes, France | 7 October 2013 | French
    - Stuttgart, Germany | 28 October 2013 | German
    - Madrid, Spain | 5 September 2013 | Spanish
    - Istanbul, Turkey | 30 September 2013 | Turkish
    - Petaling Jaya, Malaysia | 15 August 2013 | English
    - Singapore | 5 August 2013 | English
    - Canberra, Australia | 12 August 2013 | English
    - Melbourne, Australia | 30 October 2013 | English
    - Sydney, Australia | 26 August 2013 | English

    To register for a course or to learn more about Oracle's virtualization curriculum, go to http://education.oracle.com/virtualization.

    Read the article

  • Upgrading to 9.2 - Info You Can Use (part 2 - Get Hands On Experience)

    - by John Webb
    Our guest blogger, Rebekah Jackson, continues with a series of helpful hints on planning your upgrade to PeopleSoft 9.2.

    Get Hands-On Experience With an Easy-to-Install Demo Image

    Once you have identified the features you believe can add value (see part 1 of this series here), we recommend that you use our Demo Images to quickly create a PeopleSoft environment where you can try out the new features. The Demo Images are virtual machines that run in VirtualBox. They contain a fully functioning PeopleSoft environment, including the database, PeopleTools, and the application. These images can be loaded onto nearly any sized machine that meets the specs outlined in the documentation, including a powerful desktop machine. The image allows you to very quickly deploy a fully functioning PeopleSoft instance that you can use to explore new features and functions or run a conference room pilot.

    To take advantage of these Demo Images:
    - Go to the Demo Images Home Page (Doc id 1552548.1) on My Oracle Support.
    - Use the "Demo Image Quick Start Guide" and additional documentation links on that page to download and install your desired Demo Image. New Demo Images are posted for each product area approximately every 10 weeks, available shortly after the corresponding patching image.
    - When installed, use the Demo Image to explore the desired features and capabilities.

    Important Notes:
    - It is not required to use the Demo Images to evaluate features and functions – any 9.2 instance will support this. We recommend use of the Demo Images because the setup and configuration is dramatically faster than doing a traditional PeopleSoft install.
    - For those looking to explore new features and capabilities on PeopleSoft 9.1 releases, we have provided virtual machine images using the Oracle Virtual Machine technology. Details and links are available in Oracle’s PeopleSoft Virtualization Products page (Doc id 1538142.1).
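    As an illustration of how little setup the VirtualBox route involves, here is a rough sketch of importing a downloaded appliance from the command line. This assumes the demo image unpacks to an OVA file; the filename and VM name are placeholders, and the Demo Image Quick Start Guide remains the authoritative procedure.

        # Import the unpacked appliance into VirtualBox and start it
        VBoxManage import PSOFT_DEMO_92.ova --vsys 0 --vmname "PeopleSoft92Demo"
        VBoxManage startvm "PeopleSoft92Demo"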

    Read the article

  • Proftpd on Ubuntu - Create directory permission denied (550 ) after upgrade to 9.10

    - by Ian
    Hi all, I am having problems with ProFTPD since I upgraded to Ubuntu 9.10 from 9.04. When I login as my ftp user (userftp) in the terminal I can create dirs fine in their home dir. But when I use ftp as this user permission is denied (550 asl: permission denied) when I try and do the same operation (creating a dir). Uploading files is fine though. I am using the same config for proftpd as I was before, I can't understand what's wrong. Any help appreciated! Config follows:

        Include /etc/proftpd/modules.conf
        UseIPv6 on
        IdentLookups off
        ServerName "whatever"
        ServerType inetd
        DeferWelcome off
        MultilineRFC2228 on
        DefaultServer on
        ShowSymlinks on
        TimeoutNoTransfer 600
        TimeoutStalled 600
        TimeoutIdle 1200
        DisplayLogin welcome.msg
        DisplayChdir .message true
        ListOptions "-l"
        DenyFilter \*.*/
        DefaultRoot ~
        Port 21
        <IfModule mod_dynmasq.c>
        </IfModule>
        MaxInstances 8
        User proftpd
        Group nogroup
        Umask 022 022
        AllowOverwrite on
        TransferLog /var/log/proftpd/xferlog
        SystemLog /var/log/proftpd/proftpd.log
        <IfModule mod_quotatab.c>
          QuotaEngine off
        </IfModule>
        <IfModule mod_ratio.c>
          Ratios off
        </IfModule>
        <IfModule mod_delay.c>
          DelayEngine on
        </IfModule>
        <IfModule mod_ctrls.c>
          ControlsEngine off
          ControlsMaxClients 2
          ControlsLog /var/log/proftpd/controls.log
          ControlsInterval 5
          ControlsSocket /var/run/proftpd/proftpd.sock
        </IfModule>
        <IfModule mod_ctrls_admin.c>
          AdminControlsEngine off
        </IfModule>
        #
        # My additions
        #
        MaxLoginAttempts 5
        #
        # My user config
        #
        # VALID LOGINS
        <Limit LOGIN>
          AllowUser userftp
          DenyALL
        </Limit>
        <Directory /home/userftp>
          Umask 022 022
          AllowOverwrite off
          <Limit MKD STOR DELE XMKD RNRF RNTO RMD XRMD>
            DenyAll
          </Limit>
        </Directory>
        <Directory /home/userftp/upload/>
          Umask 022 022
          AllowOverwrite on
          <Limit READ>
            DenyAll
          </Limit>
          <Limit STOR CWD MKD RMD DELE>
            AllowAll
          </Limit>
        </Directory>

    Read the article

  • Suggestions for sharing and using data between Ubuntu and Windows 7 dual boot

    - by wizjany
    Note: TL;DR, scroll to bottom for summary. I recently set up my computer for a dual boot between Ubuntu 9.10 and Windows 7. My current drive setup is as follows. | A1 | A2 | | B1 | B2 | B3 | A1: 100 mb, windows 7 "System Reserved" boot partition A2: 230 gb, data section, this one needs to be shared between the operating systems B1: 125 gb, windows 7 OS B2: 123 gb, ubuntu OS B3: 2gb, linux swap space Pretty much i want to have my documents, music, pictures, videos, etc accessible from both operating systems. My first attempt involved making the data (a2) partition NTFS, and moving my home folder from ubuntu to the data partition. However, as I read NTFS does not work nice with permission, and it messed up my home folder. My next idea is one of the following: 1) format the data partition to ext2/3/4 and move my home folder from linux there, and get a driver to read ext partitions in windows 7. The problem with that is that most of the ext drivers/software are not compatible with windows 7 or do not integrate with windows explorer (I really don't want to open a separate software window just to access my data, plus it's probably not compatible with other software.) http://www.fs-driver.org/ looks promising, but I'm not sure how it works with ext4 and windows 7 (not officially supported, when trying in vista compatibility mode, it tells me I need to format the ext drive to use it). My next idea, 2) keep the home folder in ubuntu where it is, but create symlinks for the Documents, Music, etc folders to an NTFS formatted Data (A2) partition, and add those locations to the windows 7 libraries. I'm not totally sure how the permissions would work out, but it should be fine since it's only the documents, music, etc and not the important config files in the rest of /home/user/. Correct me if I'm wrong. Currently, symlinks is my best idea, although i'm not sure how it will work. Any suggestions, additions to my ideas, links, pointers, whatever would be greatly appreciated. Even if it means i should reformat both my drives and repartition (2 250gb drives if you want to suggest a setup for that), I won't be too opposed if that's the best suggestion (I've gone through the format/install/format/reinstall process 5 times over the past 3 days, once more won't hurt me). TL;DR, summary: I have two hard drives. One is partitioned for Ubuntu and Windows 7, the second one I want accessible to both operating systems to store documents, music, pictures, videos, etc. Suggestions on how to set up the data drive please =) P.S. bonus if I can get an apache server document root folder working between the two OS's as well (permissions could become very complicated, so don't worry too much about that) P.P.S. Related question, but data viewing is one way: http://superuser.com/questions/84586/partition-scheme-and-size-for-dual-boot-windows-7-and-ubuntu-9-10-with-separate-p

    Read the article

  • Install XP Mode with VirtualBox Using the VMLite Plugin

    - by Mysticgeek
    Would you like to run XP Mode, but prefer Sun’s VirtualBox for virtualization? Thanks to the free VMLite plugin, you can quickly and easily run XP Mode in or alongside VirtualBox. Yesterday we showed you one method to install XP Mode in VirtualBox; unfortunately in that situation you lose XP’s activation, and it isn’t possible to reactivate it. Today we show you a tried and true method for running XP Mode in VirtualBox and integrating it seamlessly with Windows 7. Note: You need to have Windows 7 Professional or above to use XP Mode in this manner.

    Install XP Mode
    Make sure you’re logged in with Administrator rights for the entire process. The first thing you’ll want to do is install XP Mode on your system (link below). You don’t need to install Windows Virtual PC. Go through and install XP Mode using the defaults.

    Install VirtualBox
    Next you’ll need to install VirtualBox 3.1.2 or higher if it isn’t installed already. If you have an older version of VirtualBox installed, make sure to update it. During setup you’re notified that your network connection will be reset. Check the box next to Always trust software from “Sun Microsystems, Inc.” then click Install. Setup only takes a couple of minutes, and does not require a reboot…which is always nice.

    Install VMLite XP Mode Plugin
    The next thing we’ll need to install is the VMLite XP Mode Plugin. Again, installation is simple following the install wizard. During the install, like with VirtualBox, you’ll be asked to install the device software. After it’s installed, go to the Start menu and run VMLite Wizard as Administrator. Select the location of the XP Mode Package, which by default should be in C:\Program Files\Windows XP Mode. Accept the EULA…and notice that it’s meant for Windows 7 Professional, Enterprise, and Ultimate editions. Next, name the machine, choose the install folder, and type in a password. Select if you want Automatic Updates turned on or not. Wait while the process completes, then click Finish. The VMLite XP Mode will set up to run the first time. That is all there is to this section. You can run XP Mode from within the VMLite Workstation right away. XP Mode is fully activated already, and the Guest Additions are already installed, so there’s nothing else you need to do! XP Mode is the whole way ready to use.

    Integration with VirtualBox
    Since we installed the VMLite Plugin, when you open VirtualBox you’ll see it listed as one of your machines and you can start it up from here. Here we see VMLite XP Mode running in Sun VirtualBox.

    Integrate with Windows 7
    To integrate it with Windows 7, click on Machine \ Seamless Mode… Here you can see the XP menu and Taskbar will be placed on top of Windows 7. From here you can access what you need from XP Mode. Here we see XP running on VirtualBox in Seamless Mode. We have the old XP WordPad sitting next to the new Windows 7 version of WordPad. This works so seamlessly you forget whether you’re working in XP or Windows 7. In this example we have Windows Home Server Console running in Windows 7, while installing MSE from IE 6 in XP Mode. At the top of the screen you will still have access to the VM’s controls. You can click the button to exit Seamless Mode, or simply hit the right “CTRL+L”.

    Conclusion
    This is a very slick way to run XP Mode in VirtualBox on any machine that doesn’t have Hardware Virtualization. This method also doesn’t lose the XP Mode activation and is actually extremely easy to set up.
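    As a quick sanity check after installing the plugin (an added suggestion, not a step from the original guide), the VMLite-created machine should show up alongside any other VirtualBox VMs from the command line; the VM name below is a placeholder and will vary.

        # List the registered VirtualBox machines; the XP Mode VM created by VMLite should appear here
        VBoxManage list vms
        # Output takes the form "VM name" {uuid}, e.g. "VMLite XP Mode" {...}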
    If you prefer VMware (like we do), check out how to run XP Mode on machines without Hardware Virtualization capability, and also how to create an XP Mode for Vista and Windows 7 Home Premium.

    Links: Download XP Mode | Download VirtualBox | Download VMLite XP Mode Plugin for VirtualBox (Site Registration Required)

    Read the article

  • Kernel panic while loading Mac OS X on VMWare

    - by Vladimir Gritsenko
    I have Mac OS X 10.6.3 and VMware 7.1. Trying to run OS X doesn’t work: shortly after booting, VMware announces the machine’s death with this pop-up message: “The CPU has been disabled by the guest operating system. You will need to power off or reset the virtual machine at this point.” The console has a more interesting announcement: “The real-time clock was not properly initialized on your system!”, along with a dump of the detected CPU speed (~2.9 GHz), FSB (~94 MHz) and bus ratio (31). Somehow, the code that panics is documented in an accidental .diff file here; apparently, it commits seppuku if the bus ratio is greater than 30. I underclocked my E6500 to a slower speed, but this didn’t make any difference. I can think of two possibilities right now: either the CPU info being read is constant and defines the maximum ability of the CPU, in which case it appears I’m screwed, or VMware presents its own CPU info to the machine, which I can perhaps somehow change. If so, how? If these sound completely off, that’s because I’m a real newbie in these matters. Here’s hoping you guys can show me the light :-)
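    As a quick sanity check on those numbers (assuming the dump reports CPU frequency and FSB frequency), the detected values do land just past the threshold described in that .diff:

        # Rough check of the values reported in the panic dump (units assumed).
        cpu_hz = 2.9e9   # detected CPU speed, ~2.9 GHz
        fsb_hz = 94e6    # detected FSB, ~94 MHz

        bus_ratio = round(cpu_hz / fsb_hz)
        print(bus_ratio)  # ~31, i.e. just over the "greater than 30" check that triggers the panic

    If VMware presents the same CPU info to the guest regardless of the host clock settings, underclocking the E6500 would not change this ratio, which would be consistent with the underclocking making no difference.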

    Read the article

  • Windows 7 XP Mode disable time sync

    - by Oskar Duveborn
    So I've tried the trick from Virtual PC 2007, adding the following section to the vmc configuration file: <components> <host_time_sync> <enabled type="boolean">false</enabled> </host_time_sync> </components> Later someone suggested VPC doesn't want the components level so added this instead: <host_time_sync> <enabled type="boolean">false</enabled> <frequency type="integer">15</frequency> <threshold type="integer">10</threshold> </host_time_sync> When I start up XP Mode (Microsoft Virtual PC) it completely ignores any of these two configuration changes and if I change the clock it's instantly reset to the host time again. I've also obviously disabled the Windows Time service but as it's not joined to a domain or set up with a source it shouldn't be involved anyway. I need to test an application over a few midnight passes and thought the XP Mode machine would be perfect, so I didn't have to mess with my workstation clock... is there any way to get the VPC guest to not sync time with the host? This is easy in Hyper-V ;p

    Read the article

  • VMWare Workstation 6.5.5 (64-bit) on Ubuntu 10.04 (64-bit) Odd Problem

    - by Android Eve
    I managed to successfully install VMware Workstation 6.5.5 (64-bit) on Ubuntu 10.04 (64-bit). It works well and somehow feels faster and snappier than the exact same version on Ubuntu 8.04. However, there is one slight issue that hurts productivity: when the guest VM is Microsoft Windows (2K, XP), the mouse cursor turns from an arrow into a hand when it hovers over the Task Bar. When the mouse moves, the hand cursor blinks and the system doesn’t respond to mouse clicks. When I move the cursor back to the desktop area, it functions normally; that is, the problem exists only in the Task Bar area. Obviously, this makes it very difficult (read: impossible) for me to use the Start Menu, SysTray and the rest of the Task Bar. My workaround for now is to launch programs via their Desktop shortcuts or via the keyboard. Note: the exact same VMware Workstation 6.5.5 (64-bit) on Ubuntu 8.04 (64-bit) doesn’t exhibit this problem. Has anyone seen this problem before? Do you know of a solution (or a better workaround)?

    Read the article

  • psexec: "Access is Denied"?

    - by Electrons_Ahoy
    Inspired by my previous question here, I’ve been experimenting with PsExec. The goal is to kick off some fairly simple scripts/programs on one Windows XP machine from another, and as PowerShell 2 doesn’t yet do remoting on XP, PsExec seems like it’ll solve my problems nicely. However, I can’t get anything but the “Access is Denied” error. Here’s what I’ve tried so far: I’ve got a pair of Windows XP MCE machines, networked together in a workgroup without a server or domain controller. I’ve turned off “simple file sharing” on both machines. Under the security policy, “Network Access: Sharing and security model for local accounts” is set to Classic, not Guest, on both machines. There is an administrative user for each computer that I know the passwords to. :) With all that, a command like “> psexec \\otherComputer -u adminUser cmd” prompts for the password (like it should) and then exits with: Couldn’t access otherComputer: Access is denied. So, at this point I turn to the community. What step am I missing here?
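    One prerequisite worth checking: PsExec works by copying its service executable to the remote machine’s ADMIN$ share, so if that share can’t be reached with the admin credentials, you will see exactly this error. A rough sketch of that pre-check follows; the computer and account names are just the placeholders from the question above.

        # Sketch: verify the ADMIN$ share is reachable with the credentials PsExec will use.
        # PsExec copies its service binary to \\target\ADMIN$, so this has to succeed first.
        # "otherComputer" and "adminUser" are placeholder names from the question.
        import getpass
        import subprocess

        target = "otherComputer"
        user = "adminUser"
        password = getpass.getpass(f"Password for {user} on {target}: ")

        # Try to map the administrative share with the same credentials.
        check = subprocess.run(
            ["net", "use", rf"\\{target}\ADMIN$", password, f"/user:{user}"],
            capture_output=True, text=True)

        if check.returncode == 0:
            print("ADMIN$ reachable; PsExec should be able to copy its service.")
            subprocess.run(["net", "use", rf"\\{target}\ADMIN$", "/delete"])  # clean up the mapping
        else:
            print("Could not reach ADMIN$; fix sharing/credentials before retrying PsExec:")
            print(check.stderr or check.stdout)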

    Read the article

< Previous Page | 77 78 79 80 81 82 83 84 85 86 87 88  | Next Page >