Search Results

Search found 7777 results on 312 pages for 'my resources'.


  • Open World Day 1 Continued

    - by Antony Reynolds
    A Day in the Life of an Oracle OpenWorld Attendee Part II. A couple of things I forgot to mention about yesterday's OpenWorld. First I attended a presentation on SOA Suite and Virtualization which explained how Oracle Virtual Assembly Builder (OVAB) can be used to accelerate the deployment of an Enterprise Deployment Guide (EDG) compliant SOA Suite infrastructure.  OVAB provides the ability to introspect a deployed software component such as WebLogic Server, SOA Suite or other components and extract the configuration and package it up for rapid deployment into an Oracle Virtual Machine.  OVAB allows multiple machines to be configured and connections made between the machines and outside resources such as databases.  That by itself is pretty cool and has been available for a while in OVAB.  What is new is that Oracle has done this for an EDG-compliant installation and made it available as an OVAB assembly for customers to use, significantly accelerating the deployment of an EDG environment.  A real help for customers standing up EDG environments, particularly in test, dev and QA.
    The other thing I forgot to mention was the most memorable demo I saw at OpenWorld.  This was done by my co-author Matt Wright, who was showcasing the products of his company Rubicon Red.  They showed a really cool application called OneSpot which puts all the information about a single user's business processes in one spot!  Apparently a customer suggested the name.  It allows business flows to be defined that map onto events.  As events occur the status of the business flow is updated to reflect the change.  The interface is strongly reminiscent of social media sites and provides a graphical view of business flows.  So how does this differ from BPEL and BPM process flows?  The OneSpot process flow is more like a BAM process flow: it is based on events arriving from multiple sources, and is focused on the client's view of the process, not the actual business process.  This is important because it allows an end user to get a view of where his current business flow is and what actions, if any, are required of him.  This by itself is great, but better still is that OneSpot has a real-time updating view of events that have occurred (BAM style, no need to refresh the browser).  This means that as new events occur the end user can see them and jump to the business flow or take other appropriate actions.  Under the covers OneSpot makes use of Oracle Human Workflow to provide a forms interface, but this is not the HWF GUI you know!  The HWF GUI screens are much prettier and have more of a social media feel about them due to their use of images and pulling in relevant related information.  If you are at OOW I strongly recommend you visit Matt or John at the Rubicon Red stand and ask, no, demand a demo of OneSpot!

    Read the article

  • CISDI Cloud - Industrial Cloud Computing Platform based on Oracle Products

    - by Wenyu Duan
    In today's era, Cloud Computing is becoming integral to the vision and corporate strategy of leading organizations and is often seen as a key business driver to achieve growth and innovation. Headquartered in Chongqing, China, CISDI Engineering Co., Ltd. is a large state-owned engineering company, offering consulting, engineering design, EPC contracting, and equipment integration services to steel producers all over the world. With over 50 years of experience, CISDI offers quality services for every aspect of production for projects in the metal industry, and the company has evolved into a leading international engineering service group with 18 subsidiaries providing complete lifecycle services for E&C projects. The CISDI group delegation, led by Mr. Zhaohui Yu, CEO of CISDI Group, Mr. Zhiyou Li, CEO of CISDI Info, Mr. Qing Peng, CTO of CISDI Info, and Mr. Xin Xiao, Head of CISDI Info's R&D, joined Oracle OpenWorld 2012 and presented a very impressive cloud initiative case in their session titled “E&C Industry Solution in CISDI Cloud - An Industrial Cloud Computing Platform Based on Oracle Products”.
    CISDI group plans to expand through three phases in the construction of its cloud computing platform: first, it will relocate its existing technologies to Oracle systems, along with establishing a private cloud for CISDI; secondly, it will gradually provide mixed cloud services for its subsidiaries and partners; and finally it plans to launch an industrial cloud with a highly mature, secure and scalable environment providing cloud services for customers in the engineering construction and steel industries, among others. “CISDI Cloud” will become the growth engine for the organization to expand its global reach through online services, achieving the strategic objective of being the preferred choice of E&C companies worldwide. The new cloud computing platform is designed to provide access to a shared pool of computing resources in a self-service, dynamic, elastic and measurable way. Its flexible and scalable grid structure can support elastic expansion and sustainable growth, and can bring significant benefits in speed, agility and efficiency. Further, the platform can greatly cut down deployment and maintenance costs. The CISDI delegation highlighted these points as the key reasons why the group decided on a strategic collaboration with Oracle for building this world-class industrial cloud:
      - Oracle’s strategy: Open, Complete and Integrated
      - Oracle as the only company that can provide engineered systems, with a complete product chain of hardware and software
      - Exadata, Exalogic, and EM 12c to provide a solid foundation for "CISDI Cloud"
    The cloud blueprint and advanced architecture for the industrial cloud computing platform presented in the session show how Oracle products and technologies, together with industrial applications from CISDI, can provide an end-to-end portfolio of E&C industry services in the cloud. CISDI group was recognized for business leadership and innovative solutions and was presented with the Engineering and Construction Industry Excellence Award during Oracle OpenWorld.

    Read the article

  • The Jack LaLanne School of Sysadmins

    - by rickramsey
    Two of my childhood heroes were Tarzan and Jack LaLanne. Tarzan was an obvious choice: what boy wouldn't want to spend his days bungee jumping through the jungle with his own pack of gorillas? Jack LaLanne had a disturbing habit of wearing stretch pants, but he was so damn fit for an old guy that you couldn't help but be impressed. Especially back then, when nobody knew what a dumbbell was, much less CrossFit. Here's what he did to celebrate his 70th birthday. Sooner or later we all face a choice in our careers: surrender to the life of a has-been like Bruce Springsteen's baseball player, or become an unstoppable sysadmin like Jack LaLanne. If you'd rather keep on fighting like Jack, give these resources a look. Brian Bream's blog provides specific suggestions for keeping your skills up to date. The video interviews describe the types of technologies that are challenging what you used to know.
    Blog: The Old School Sysadmin - A Dying Breed? by Brian Bream. "The sysadmin role has been far too dependent on performing repetitive tasks and working in a reactionary mode ... the sysadmin must grow a much larger skill set to be successful. Don’t grow vertically in one technology, grow horizontally amongst many technologies." Just one of the suggestions Brian Bream provides in this excellent blog post.
    Video: Freeing the Sysadmin From Repetitive Tasks - Interview with Marshall Choy. Marshall Choy, Director of Optimized Solutions at Oracle, was once a sysadmin. And a Solaris engineer. He explains what optimized solutions are, how they are developed and tested, how they handle patching, and how these vertically integrated systems impact the job and duties of a sysadmin.
    Video: The Oracle Database Appliance - Interview with Bob Thome. Bob Thome, Senior Director of Product Management, explains what makes the Database Appliance simple, reliable, and affordable, and how it could change the economies and processes of the data center.
    Video: Why Pinellas County Chose Oracle Exalytics - Interview with Gautham. Gautham (pronounced like Batman's Gotham) recently led an effort to refresh the Pinellas County hardware systems. He'll explain what they were looking for, why they chose Oracle Exalytics, how they became convinced it was the right decision, and how it changed the way they managed their data center.
    Video: DTrace for System Administrators - Interview with Brendan Gregg. This video interview will give you an idea of some of the value-add tasks you can perform when you are freed from the reactive mode that Brian Bream describes in his blog. Brendan Gregg describes the best ways for sysadmins to tune deployed applications to get more performance out of them in their particular computing environment.
    Photograph of Ford Mustang GT 500 taken at Gateway Museum, copyright by Rick Ramsey.
    -Rick
    Follow me on: Blog | Facebook | Twitter | Personal Twitter | YouTube | The Great Peruvian Novel

    Read the article

  • Oracle OpenWorld Preview: Get Your Hands Dirty with Oracle WebCenter

    - by Christie Flanagan
    Feel like getting your hands dirty with Oracle WebCenter during Oracle OpenWorld next week?  Roll up your sleeves and sharpen your skill set by mastering Oracle WebCenter technology in one of our Hands-On Labs.  These labs are self-paced, practical learning sessions where you’re guaranteed to discover new ways to derive maximum benefits from Oracle WebCenter.  Experts will be available in person to answer questions and guide you through each lab.
    HOL10208 - Add Social Capabilities to Your Enterprise Applications
    Monday, Oct 1, 12:15 PM - 1:15 PM - Marriott Marquis - Salon 1/2
    Oracle Social Network enables you to add real-time collaboration capabilities into your enterprise applications, so that conversations can happen directly within your business systems. In this hands-on lab, you will try out the Oracle Social Network product to collaborate with other attendees, using real-time conversations with document sharing capabilities. Next you will embed social capabilities into a sample Web-based enterprise application, using embedded UI components. Experts will also write simple REST-based integrations, using the Oracle Social Network API to programmatically create social interactions.
    HOL10194 - Enterprise Content Management Simplified: Oracle WebCenter Content’s Next-Generation UI
    Tuesday, Oct 2, 11:45 AM - 12:45 PM - Marriott Marquis - Salon 1/2
    Regardless of the nature of your business, unstructured content underpins many of its daily functions. Whether you are working with traditional presentations, spreadsheets, or text documents—or even with digital assets such as images and multimedia files—your content needs to be accessible and manageable in convenient and intuitive ways to make working with the content easier. Additionally, you need the ability to easily share documents with coworkers to facilitate a collaborative working environment. Come to this session to see how Oracle WebCenter Content’s next-generation user interface helps modern knowledge workers easily manage personal and enterprise documents in a collaborative environment.
    HOL10207 - Build an Intranet Portal with Oracle WebCenter
    Tuesday, Oct 2, 1:15 PM - 2:15 PM - Marriott Marquis - Salon 1/2
    Wednesday, Oct 3, 3:30 PM - 4:30 PM - Marriott Marquis - Salon 1/2
    In this hands-on lab, you’ll work with Oracle WebCenter Portal and Oracle WebCenter Content to build out an enterprise portal that maximizes the productivity of teams and individual contributors. Using browser-based tools, you’ll manage site resources such as page styles, templates, and navigation. You’ll edit content stored in Oracle WebCenter Content directly from your portal. You’ll also experience the latest features that promote collaboration, social networking, and personal productivity.
    HOL10206 - Oracle WebCenter Sites 11g: Transforming the Content Contributor Experience
    Wednesday, Oct 3, 5:00 PM - 6:00 PM - Marriott Marquis - Salon 1/2
    Oracle WebCenter Sites 11g makes it easy for marketers and business users to contribute to and manage Websites with the new visual, contextual, and intuitive Web authoring interface. In this hands-on lab, you will create and manage content for a sports-themed Website, using many of the new and enhanced features of the 11g release.
    See Your Favorite WebCenter Products in Action
    Visit us in the exhibition hall to see demonstrations of WebCenter products.
    Demo pod locations are in Moscone South, Right:
      Oracle Social Network: S-244
      Oracle WebCenter Content: S-246, S-245
      Oracle WebCenter Sites: S-247
      Oracle WebCenter Portal: S-249
    More Info: Oracle OpenWorld | Oracle WebCenter Focus On Guide | Oracle Customer Experience Summit @ OpenWorld

    Read the article

  • How To Uninstall, Disable, and Remove Windows Defender. Also, How to Turn It Off

    - by The Geek
    If you’re already running a full anti-malware suite, you might not even realize that Windows Defender is already installed with Windows, and is probably wasting precious resources. Here’s how to get rid of it. Now, just to be clear, we’re not saying that we hate Windows Defender. Some spyware protection is better than none, and it’s built in and free! But… if you are already running something that provides great anti-malware protection, there’s no need to have more than one application running at a time.
    Disable Windows Defender
    Unfortunately, Windows Defender is completely built into Windows, and you’re not going to actually uninstall it. What we can do, however, is disable it. Open up Windows Defender, go to Tools on the top menu, and then click on Options. Now click on Administrator in the left-hand pane, uncheck the box for “Use this program”, and click the Save button. You will then be told that the program is turned off. Awesome! If you really, really want to make sure that it never comes back, you can also open up the Services panel through Control Panel, or by typing services.msc into the Start Menu search or run boxes. Find Windows Defender in the list and double-click on it… and then you can change Startup type to Disabled. Now again, we’re not necessarily advocating that you get rid of Windows Defender. Make sure you keep yourself protected from malware!
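    If you prefer the command line, the same service change can likely be made with the built-in sc tool (a hedged sketch: “WinDefend” is the Windows Defender service name on most Vista and 7 systems, but verify it in the Services panel first, and run this from an elevated prompt; the space after start= is required by sc’s syntax):
      sc config WinDefend start= disabled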

    Read the article

  • SBUG Session: The Enterprise Cache

    - by EltonStoneman
    [Source: http://geekswithblogs.net/EltonStoneman] I did a session on "The Enterprise Cache" at the UK SOA/BPM User Group yesterday which generated some useful discussion. The proposal was for a dedicated caching layer which all app servers and service providers can hook into, sharing resources and common data. The architecture might end up like this: I'll update this post with a link to the slide deck once it's available. The next session will have Udi Dahan walking through nServiceBus; register on EventBrite if you want to come along.
    Synopsis
    Looked at the benefits and drawbacks of app-centric isolated caches, compared to an enterprise-wide shared cache running on dedicated nodes; suggested issues and risks around caching including staleness of data, resource usage, performance and testing; walked through a generic service cache implemented as a WCF behaviour – suitable for IIS- or BizTalk-hosted services - which I'll be releasing on CodePlex shortly; listed common options for cache providers and their offerings.
    Discussion
    Cache usage. Different value propositions for utilising the cache: improved performance, isolation from underlying systems (e.g. service output caching can have a TTL large enough to cover downtime), reduced resource impact – CPU, memory, SQL and cost (e.g. caching results of paid-for services).
    Dedicated cache nodes. Preferred over in-host caching provided latency is acceptable. Depending on cache provider, can offer easy scalability and global replication so cache clients always use local nodes. Restriction of AppFabric Caching to Windows Server 2008 not viewed as a concern.
    Security. Limited security model in most cache providers. Options for securing cache content suggested as custom implementations. Obfuscating keys and serialized values may mean additional security is not needed. Depending on security requirements and architecture, can ensure cache servers are only accessible to cache clients via IPsec.
    Staleness. Generally thought to be an overrated problem. Thinking in line with eventual consistency, serving up stale data may not be a significant issue. Good technical arguments support this, although I suspect business users will be harder to persuade.
    Providers. Positive feedback for AppFabric Caching – speed, configurability and richness of the distributed model making it a good enterprise choice. The .NET port of memcached is well thought of for performance, but lack of replication makes it less suitable for these shared scenarios. The replicated fork – repcached – is untried and less active than memcached. NCache is also well thought of, but the Express version is too limited for enterprise scenarios, and the commercial versions look costly compared to AppFabric.
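    The WCF behaviour itself isn't shown in the post (it was promised for CodePlex), but the core pattern under discussion - a read-through cache whose TTL bounds staleness - is easy to sketch. Below is a minimal, illustrative Java version; all names are hypothetical and this is not the session's implementation:
      import java.util.concurrent.ConcurrentHashMap;
      import java.util.concurrent.ConcurrentMap;

      // Read-through cache sketch: entries expire after a TTL, so staleness is
      // bounded rather than eliminated (the eventual-consistency trade-off above).
      public class TtlCache<K, V> {
          public interface Loader<T> { T load(); }

          private static final class Entry<T> {
              final T value;
              final long expiresAt;
              Entry(T value, long expiresAt) { this.value = value; this.expiresAt = expiresAt; }
          }

          private final ConcurrentMap<K, Entry<V>> store = new ConcurrentHashMap<K, Entry<V>>();
          private final long ttlMillis;

          public TtlCache(long ttlMillis) { this.ttlMillis = ttlMillis; }

          public V getOrLoad(K key, Loader<V> loader) {
              long now = System.currentTimeMillis();
              Entry<V> e = store.get(key);
              if (e != null && e.expiresAt > now) {
                  return e.value;                      // fresh enough: no backend call
              }
              V value = loader.load();                 // miss or expired: hit the service
              store.put(key, new Entry<V>(value, now + ttlMillis));
              return value;
          }
      }
    A TTL longer than a planned outage gives the isolation benefit noted under "Cache usage": clients keep getting valid (if stale) responses while the underlying system is down.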

    Read the article

  • Step Away From That Computer! You’re Not Qualified to Use It!

    - by Michael Sorens
    Most things tend to come with warnings and careful instructions these days, but sadly not one of the most ubiquitous appliances of all, your computer. If a chainsaw is missing its instructions, you’re well advised not to use it, even though you probably know roughly how it’s supposed to work. I confess, there are days when I feel the same way about computers. Long ago, during the renaissance of the computer age, it was possible to know everything about computers. But today, it is challenging to be fully knowledgeable even in one small area, and most people aren’t as savvy as they like to think. And, if I may borrow from Edwin Abbott Abbott’s classic Flatland, that includes me. And you. Need an example of what I mean? Take a look at almost any recent month’s batch of Windows updates. Just two quick questions for you: Do you need all of those updates? Is it safe to install all of those updates? I do software design and development for a living on Windows and the .NET platform, but I will be quite candid: I often have little clue what the heck some of those updates are going to do or why they are needed. So, if you do not know why they are needed or what they do, how do you know if they are safe? Of course, one can sidestep both questions by accepting Microsoft’s recommended Windows Update setting of “install updates automatically”. That leads you to infer that you need all of them (which is not always the case) and, more significantly, that they are safe. Quite safe. Ah, lest reality intrude upon such a pretty picture! Sadly, there is no such thing as risk-free software installation, and payloads from Windows Update are no exception. Earlier this year, a Windows Secrets Patch Watch article touted this headline: Keep this troublesome kernel update on hold. It discusses KB 2862330, a security update originally published more than 4 months earlier, and yet the article still recommends not installing it! Most people simply do not have the time, resources, or interest, to go about figuring out which updates to install or postpone or skip for safety reasons. Windows Secrets Patch Watch is the best service I have encountered for getting advice, but it is still no panacea and using the service effectively requires a degree of computer literacy that I still think is beyond a good number of people. Which brings us full circle: Step Away From That Computer! You’re Not Qualified to Use It!

    Read the article

  • Keep website and webservices warm with zero coding

    - by oazabir
    If you want to keep your websites or webservices warm and save users from seeing the long warm-up time after an application pool recycle, an IIS restart, a new code deployment, or even a Windows restart, you can use the tinyget command line tool, which comes with the IIS Resource Kit, to hit the site and services and keep them warm. Here’s how: First get tinyget from here. Download and install the IIS 6.0 Resource Kit on some PC. Then copy tinyget.exe from “c:\program files…\IIS 6.0 ResourceKit\Tools\tinyget” to the server where your IIS 6.0 or IIS 7 is running. Then create a batch file that will hit the pages and webservices. Something like this:
      SET TINYGET=C:\Program Files (x86)\IIS Resources\TinyGet\tinyget.exe
      "%TINYGET%" -srv:dropthings.omaralzabir.com -uri:http://dropthings.omaralzabir.com/ -status:200
      "%TINYGET%" -srv:dropthings.omaralzabir.com -uri:http://dropthings.omaralzabir.com/WidgetService.asmx?WSDL -status:200
    First I am hitting the homepage to keep the webpage warm. Then I am hitting the webservice URL with the ?WSDL parameter, which allows ASP.NET to compile the service if not already compiled, walk through all the operations and reflect on them, and thus load all related DLLs into memory, reducing the warm-up time when hit. Tinyget takes the server's name or IP in the –srv parameter and then the actual URI in –uri. I have specified the HTTP response code to expect in the –status parameter. It ensures the site is alive and is returning an HTTP 200 code. Besides just warming up a site, you can do some load testing on the site. Tinyget can run in multiple threads and run loops to hit some URL. You can literally blow up a site with commands like this:
      "%TINYGET%" -threads:30 -loop:100 -srv:google.com -uri:http://www.google.com/ -status:200
    Tinyget is also pretty useful for running automated tests. You can record http posts in a text file and then use it to make http posts to some page. Then you can add a matching clause to check for a certain string in the output, to ensure the correct response is given. Thus with some simple command line commands, you can warm up, do some transactions, validate that the site is giving the correct response, and run a load test to ensure the server is performing well. A very cheap way to get a lot done.

    Read the article

  • ath9k driver does not wake up

    - by shantanu
    I know this is a common question but I found no suitable answer, so I am asking again. I installed Ubuntu 11.10. I found the bug for ath9k, so I set first network boot from the BIOS menu. That worked. I upgraded to 12.04 yesterday. Now ath9k is creating problems again. First network boot is still enabled. ath9k works at start, but fails (connects again and again) after a couple of minutes. dmesg shows an error that it can not wake up in 500us. So I tried compat-wireless-3.5.1-1, but the result is the same. I have also added the nohwcrypt=1 option in /etc/modprobe.d/ath9k.conf. Still no luck. I tried rmmod and then modprobe:
      sudo modprobe ath9k nohwcrypt=1
    dmesg shows me the error:
      [ 400.690086] ath9k: Driver unloaded
      [ 406.214329] ath9k 0000:06:00.0: enabling device (0000 -> 0002)
      [ 406.214348] ath9k 0000:06:00.0: PCI INT A -> GSI 17 (level, low) -> IRQ 17
      [ 406.214368] ath9k 0000:06:00.0: setting latency timer to 64
      [ 406.428517] ath9k 0000:06:00.0: Failed to initialize device
      [ 406.428852] ath9k 0000:06:00.0: PCI INT A disabled
      [ 406.428887] ath9k: probe of 0000:06:00.0 failed with error -5
    dmesg errors when the driver fails:
      [ 355.023521] ath: Chip reset failed
      [ 355.023524] ath: Unable to reset channel, reset status -22
      [ 355.023556] ath: Unable to set channel
      [ 355.088569] ath: Failed to stop TX DMA, queues=0x10f!
      [ 355.122708] ath: DMA failed to stop in 10 ms AR_CR=0xffffffff AR_DIAG_SW=0xffffffff DMADBG_7=0xffffffff
      [ 355.122714] ath: Could not stop RX, we could be confusing the DMA engine when we start RX up
      [ 355.263962] ath: Chip reset failed
      [ 355.263966] ath: Unable to reset channel (2437 MHz), reset status -22
      [ 358.996063] ath: Failed to wakeup in 500us
      [ 364.004182] ath: Failed to wakeup in 500us
    I can not install fresh Ubuntu because I have lots of applications installed. System: Acer Aspire 4250, AMD dual core 1.6GHz, Atheros Communications Inc. AR9485 Wireless Network Adapter (rev 01)
    EDITED
    Now I am in serious trouble. No wifi device is showing in the ifconfig or lshw commands. Only the ethernet interface shows. I tried (FN + WIFI) several times to enable the device but nothing helps. Now I have installed fresh Ubuntu 12.04. Please help.
    lshw -c network:
      *-network
        description: Ethernet interface
        product: 82566DC Gigabit Network Connection
        vendor: Intel Corporation
        physical id: 19
        bus info: pci@0000:00:19.0
        logical name: eth0
        version: 02
        serial: 00:19:d1:7a:8e:f9
        size: 100Mbit/s
        capacity: 1Gbit/s
        width: 32 bits
        clock: 33MHz
        capabilities: pm msi bus_master cap_list ethernet physical tp 10bt 10bt-fd 100bt 100bt-fd 1000bt-fd autonegotiation
        configuration: autonegotiation=on broadcast=yes driver=e1000e driverversion=2.0.0-k duplex=full firmware=1.1-0 ip=192.168.1.114 latency=0 link=yes multicast=yes port=twisted pair speed=100Mbit/s
        resources: irq:45 memory:90300000-9031ffff memory:90324000-90324fff ioport:20c0(size=32)
    The rfkill command does not show anything, but no error either.
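    For reference, the conventional form of that persistent module option is a single options line in the modprobe.d file (an assumption about what the poster's file contains; the file name is arbitrary as long as it ends in .conf):
      # /etc/modprobe.d/ath9k.conf - disable hardware crypto offload in ath9k
      options ath9k nohwcrypt=1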

    Read the article

  • Interpolation using a sprite's previous frame and current frame

    - by user22241
    Overview I'm currently using a method which, it has been pointed out to me, is extrapolation rather than interpolation. As a result, I'm also now looking into the possibility of using another method which is based on a sprite's position at its last (rendered) frame and its current one. Assuming an interpolation value of 0.5 this is, (visually), how I understand it should affect my sprite's position.... This is how I'm obtaining an interpolation value:
      public void onDrawFrame(GL10 gl) {
          // Set/re-set loop back to 0 to start counting again
          loops = 0;
          while (System.currentTimeMillis() > nextGameTick && loops < maxFrameskip) {
              SceneManager.getInstance().getCurrentScene().updateLogic();
              nextGameTick += skipTicks;
              timeCorrection += (1000d / ticksPerSecond) % 1;
              nextGameTick += timeCorrection;
              timeCorrection %= 1;
              loops++;
              tics++;
          }
          interpolation = (float) (System.currentTimeMillis() + skipTicks - nextGameTick) / (float) skipTicks;
          render(interpolation);
      }
    I am then applying it like so (in my rendering call):
      render(float interpolation) {
          spriteScreenX = (spriteScreenX - spritePreviousX) * interpolation + spritePreviousX;
          spritePreviousX = spriteScreenX; // update and store this for next time
      }
    Results This unfortunately does nothing to smooth the movement of my sprite. It's pretty much the same as without the interpolation code. I can't get my head around how this is supposed to work and I honestly can't find any decent resources which explain this in any detail. My understanding of extrapolation is that when we arrive at the rendering call, we calculate the time between the last update call and the render call, and then adjust the sprite's position to reflect this time (moving the sprite forward) - and yet, this (interpolation) is moving the sprite back, so how can this produce smooth results? Any advice on this would be very much appreciated.
    Edit I've implemented the code from OriginalDaemon's answer like so:
      @Override
      public void onDrawFrame(GL10 gl) {
          newTime = System.currentTimeMillis() * 0.001;
          frameTime = newTime - currentTime;
          if (frameTime > (dt * 25))
              frameTime = (dt * 25);
          currentTime = newTime;
          accumulator += frameTime;
          while (accumulator >= dt) {
              SceneManager.getInstance().getCurrentScene().updateLogic();
              previousState = currentState;
              t += dt;
              accumulator -= dt;
          }
          interpolation = (float) (accumulator / dt);
          render();
      }
    Interpolation values are now being produced between 0 and 1 as expected (similar to how they were in my original loop) - however, the results are the same as my original loop (my original loop allowed frames to skip if they took too long to draw, which I think this loop is also doing). I appear to have made a mistake in my previous logging; it is logging as I would expect it to (the interpolated position does appear to be in between the previous and current positions) - however, the sprites are most definitely choppy when the render() skipping happens.
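    For context, here is a minimal, self-contained sketch of the interpolation step as it is usually described (for example in the "Fix Your Timestep" pattern the edited loop follows). The key detail is that the previous state is captured before each logic step, and the blended value is used only for drawing, never written back into the stored states - which appears to be where the posted render() differs. All names are illustrative, not the poster's actual fields:
      public class InterpolationSketch {
          static final double dt = 1.0 / 60.0;      // fixed logic timestep (seconds)
          static double accumulator = 0;
          static float previousX = 0, currentX = 0; // the two most recent simulated states

          static void updateLogic() {
              previousX = currentX;                 // capture the old state BEFORE stepping
              currentX += 120 * (float) dt;         // e.g. move at 120 px/s
          }

          static float renderX(float alpha) {
              // Blend the two simulated states for drawing only; never store the
              // result back into previousX/currentX, or the blend collapses.
              return previousX * (1 - alpha) + currentX * alpha;
          }

          public static void main(String[] args) throws InterruptedException {
              double currentTime = System.currentTimeMillis() * 0.001;
              for (int frame = 0; frame < 120; frame++) {
                  double newTime = System.currentTimeMillis() * 0.001;
                  accumulator += Math.min(newTime - currentTime, dt * 25); // clamp long frames
                  currentTime = newTime;
                  while (accumulator >= dt) {       // run logic in fixed steps
                      updateLogic();
                      accumulator -= dt;
                  }
                  float alpha = (float) (accumulator / dt); // leftover fraction, in [0,1)
                  System.out.printf("draw at x=%.2f%n", renderX(alpha));
                  Thread.sleep(16);                 // stand-in for vsync
              }
          }
      }
    With alpha in [0,1) the drawn position always lies between the two most recent simulated positions, trading roughly one logic tick of latency for smoothness.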

    Read the article

  • Playing NSF music in FMOD.net

    - by Tesserex
    So, as the title says, I want to be able to play NSF files using FMOD, because my project already uses FMOD and I'd rather not replace it. This will involve figuring out how existing players and emulators work and porting it. I haven't yet found an existing player that uses FMOD. My starting point is the MyNes source from http://sourceforge.net/projects/mynes/. There are two big steps between here and what I'm looking for. MyNes plays from a ROM, not NSF. So, I have to rip out the APU and get it to play NSF files. The MyNes APU uses SlimDX, so I have to convert that to FMOD.NET. I am really stuck on how to go about either of these, because I'm not that familiar with audio formats and it's hard finding resources online. So here are a few questions:
    1. From what I can tell from the NSF spec at http://kevtris.org/nes/nsfspec.txt, it just contains the relevant memory section of the ROM, plus the header. If anyone can verify or correct this that would be great.
    2. The emulator APU uses data from the rest of the emulator to play, including things like cycle counts. I'm not sure what replaces this in a standalone player. Can't I just load all the music data at once into a stream and play it?
    3. Joining #1 and #2, does the header data from the NSF substitute for some of the ROM data in the emulator code?
    4. Using FMOD, will I be following the usercreatedsound example for loading a stream? And does this format count as PCM? Specifically, MyNes says PCM8. Any tips on loading / playing the stream in FMOD are appreciated.
    As an aside, I don't really understand the loading / playing sections of the spec I linked at all. It seems to apply to 6502 systems / emulators only and not to my situation. I know it's a long shot for anyone here to have enough experience in this area to help, but anything you can provide is definitely appreciated. A link to an existing .NET library that does this would be even better, but I don't believe one exists.

    Read the article

  • European e-government Action Plan all about interoperability

    - by trond-arne.undheim
    Yesterday, the European Commission released its European eGovernment Action Plan for 2011-2015. The plan includes measures on providing deeper user empowerment, enhancing the Internal Market, improving the efficiency and effectiveness of public administrations, and putting in place pre-conditions for developing e-government.
    The Good
    - Defines interoperability very clearly. Calls interoperability "a pre-condition for cross-border eGovernment services" (a very strong formulation) and says interoperability "is supported by open specifications".
    - Uses the terminology "open specifications" which, let's face it, is pretty close to "open standards", which is the term the rest of the world would use.
    - Confirms that Member States are fully committed to the political priorities of the Malmö Declaration (which was all about open standards), including the very strong action: by 2013, all Member States will have incorporated the political priorities of the Malmö Declaration in their national strategies. Such tight Action Plan integration between Commission and Member State priorities has seldom been attempted before, particularly not in a field where European legal competence is virtually non-existent. What we see now is the subtle force of soft power rather than the rough force of regulation. In this case, it is the Member States who want Europe to take the lead. Very refreshing!
    Some quotes that show the commitment to interoperability and open specifications:
    "The emergence of innovative technologies such as "service-oriented architectures" (SOA), or "clouds" of services, together with more open specifications which allow for greater sharing, re-use and interoperability reinforce the ability of ICT to play a key role in this quest for efficiency in the public sector." (p.4)
    "Interoperability is supported through open specifications" (p.13)
    2.4.1. Open Specifications and Interoperability (p.13 has a whole section dedicated to this important topic; open specifications and interoperability are nearly 100% interrelated):
    "Interoperability is the ability of systems and machines to exchange, process and correctly interpret information. It is more than just a technical challenge, as it also involves legal, organisational and semantic aspects of handling data" (p.13)
    "standards and open platforms offer opportunities for more cost-effective use of resources and delivery of services" (p.13)
    The Bad
    Shies away from defining open standards, or even open specifications, the EU's preferred term for the key enabler of interoperability.
    Verdict
    90/100, a very respectable score.

    Read the article

  • ArchBeat Top 10 for November 11-17, 2012

    - by Bob Rhubart
    The Top 10 most popular items shared on the OTN ArchBeat Facebook page for the week of November 11-17, 2012.
    Developing and Enforcing a BYOD Policy - Darin Pendergraft's post includes links to a recent Mobile Access Policy Survey by SANS as well as registration information for a Nov 15 webcast featuring security expert Tony DeLaGrange from Secure Ideas; SANS instructor, attorney and technology law expert Ben Wright; and Oracle IDM product manager Lee Howarth.
    This Week on the OTN Architect Community Homepage - Make time to check out this week's features on the OTN Solution Architect Homepage, including: SOA Practitioner Guide: Identifying and Discovering Services; a technical article by Yuli Vasiliev on Setting Up, Configuring, and Using an Oracle WebLogic Server Cluster; and the conclusion of the 3-part OTN ArchBeat Podcast on future-proofing your career.
    WLST Starting and Stopping a WebLogic Environment | Rene van Wijk - Oracle ACE Rene van Wijk explores how to start a server with as little input as possible.
    Cloud Integration White Paper | Bruce Tierney - Bruce Tierney shares an overview of Cloud Integration - A Comprehensive Solution, a new white paper he co-authored with David Baum, Rajesh Raheja, and Vijay Pawar.
    X.509 Certificate Revocation Checking Using OCSP Protocol with Oracle WebLogic Server 12c | Abhijit Patil - Abhijit Patil's article focuses on how to use X.509 certificate revocation checking functionality with the OCSP protocol to validate in-bound certificates. Although the article focuses on inbound validation using OCSP, Oracle WebLogic Server 12c also supports outbound OCSP validation.
    Update on My OBIEE / Exalytics Books | Mark Rittman - Oracle ACE Director Mark Rittman shares several resources related to his books Oracle Business Intelligence 11g Developers Guide and Oracle Exalytics Revealed, including a podcast interview with Oracle's Paul Rodwick.
    E-Business Suite 12.1.3 Data Masking Certified with Enterprise Manager 12c | Elke Phelps - "You can use the Oracle Data Masking Pack with Oracle Enterprise Manager Grid Control 12c to scramble sensitive data in cloned E-Business Suite environments," reports Elke Phelps. There's a lot more information about this announcement in Elke's post.
    WebLogic Application Server: free for developers! | Bruno Borges - Java blogger Bruno Borges shares news about important changes in the license agreement for Oracle WebLogic Server.
    Agile Architecture | David Sprott - "There is ample evidence that Agile Architecture is a primary contributor to business agility, yet we do not have a well understood architecture management system that integrates with Agile methods," observes David Sprott in this extensive post.
    My iPad & This Cloud Thing | Floyd Teter - Oracle ACE Director Floyd Teter explains why the Cloud is making it possible for him to use his iPad for tasks previously relegated to his laptop, and why this same scenario is likely to play out for a great many people.
    Thought for the Day
    "In programming, the hard part isn't solving problems, but deciding what problems to solve." — Paul Graham
    Source: SoftwareQuotes.com

    Read the article

  • Cooking with Wessty: HTML 5 and Visual Studio

    - by David Wesst
    The hardest part about using a new technology, such as HTML 5, is getting to know what features are available and the syntax. One way to learn new technologies is to adapt your current tools to help you use the technology in the comfort of your own development environment. For .NET web developers, that environment is usually Visual Studio 2010. This technique intends to show you how to get HTML 5 Intellisense working in your current version of Visual Studio 2008 or 2010, making it easier for you to start using HTML 5 features in your current .NET web development projects.
    Quick Note
    According to the Visual Web Developer team at Microsoft, the Visual Studio 2010 SP1 beta has support for both HTML 5 and CSS 3. If you are willing to try out the bleeding edge update from Microsoft, then you won't need this technique.
    Ingredients
    - Visual Studio 2008 or 2010
    - Your favourite HTML 5 compliant browser (e.g. Internet Explorer 9)
    - Administrator privileges, or the ability to install Visual Studio Extensions in your development environment
    Directions
    1. Download the HTML 5 Intellisense for Visual Studio 2008 and 2010 extension from the Visual Studio Extension Gallery.
    2. Install it.
    3. Open Visual Studio.
    4. Open up a web file, such as an HTML or ASPX file. The HTML Source Editing toolbar should have appeared.
    5. (Optional) If it did not appear, you can activate it through the main menu by selecting View, then Toolbars, and then selecting HTML Source Editing if it does not have a checkbox beside it. (NOTE: If there is a checkbox, then the toolbar is enabled.)
    6. In the HTML Source Editing toolbar, open up the validation schema drop box, and select HTML 5.
    7. Et voila! You now have HTML 5 intellisense enabled to help you get started in adding HTML 5 awesomeness to your web sites and web applications.
    Optional – Setting HTML 5 Validation Options
    At this point, you may want to select how Visual Studio shows validation errors. You can do that in the Options menu. To get to the Options menu:
    1. In the main menu select Tools, then Options.
    2. In the Options window, select and expand Text Editor, then HTML, followed by selecting Validation.
    Resources
    - HTML 5 Intellisense for Visual Studio 2008 and 2010 extension
    - Visual Studio Extension Gallery
    - Visual Studio 2010 SP1 Beta
    This post also appears at http://david.wes.st

    Read the article

  • SQLAuthority News – A Quick Note on @Pluralsight Video – Call Me Maybe Developer Way

    - by pinaldave
    I write a lot about how important learning and training are. Any of my readers will know that I think the key to success is staying current with your education and taking every opportunity to increase your “tool kit” of skills. I hope that I have not given the impression that it is all in the employees' hands to make sure they are happy and satisfied at their jobs. I also firmly believe that a good boss makes good employees. A boss who is good at communicating and leading, who knows how to nip problems in the bud and allocate resources wisely, will have a well-oiled machine. This means happy employees and a great work environment.
    It is important to have a healthy work environment because you will not succeed without one. Successful businesses will always have the type of environment that fosters creativity and has efficient employees. A healthy environment doesn’t force employees to produce results, but allows them to progress and create the results themselves. The result of a healthy work environment is that employees will enjoy their work and then work harder. This can bring the company more revenue, and hopefully the employees will see the result of their hard work in bonuses and raises. However, money, while important, is certainly secondary – the important part is the dedication of the employees to their work and to their company. This is the true key to success.
    Any employee who recognizes this description as their working environment should consider themselves fortunate. They are allowed to grow and do better, and employees being treated fairly can be a rarity in this world. One company that I believe adheres to this principle is Pluralsight – as evidenced by this fun video. I have blogged about it earlier. (Check out my cameo at 0:37.) It was great fun to work with the employees at Pluralsight while making this video. They are a great bunch and clearly have a great work environment – we wouldn’t have had this much fun otherwise!
    I have to tell you a little bit about making this video. My wife shot it with her mobile phone, which was certainly a different but exciting experience! It was hard to get the look of the video right, since I was trying to portray a body builder – this was a little outside of my own personal experience. I have what I like to call a “healthy” body type, so trying to look extremely fit like some of the other “actors” in this video was a challenge – but I do hope that you all think I succeeded. All in all, it was great fun to participate in this video and I hope to see my friends at Pluralsight again soon.
    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • Taking the fear out of a Cloud initiative through the use of security tools

    - by user736511
    Typical employees, constituents, and business owners interact with online services at a level where their knowledge of back-end systems is low, and most of the time there is no interest in knowing the systems' architecture. Most application administrators, while partially responsible for these systems' upkeep, have very little interaction with them, at least at an operational, platform level. Of greatest interest to these groups is the consistent, reliable, and manageable operation of the interfaces with which they communicate. Introducing the "Cloud" topic into any evolving architecture automatically raises concerns for data and identity security, simply because of the perception that when not owning the silicon, enterprises are not able to manage its content. But is this really true?
    In the majority of traditional architectures, data and the applications that access it are physically distant from the organization that owns them. They may reside in a shared data center, or in a geographically convenient location that spans large organizations' connectivity capabilities. In the end, very often, the model of a "traditional" architecture is fairly close to the "new" Cloud architecture. The most notable difference is that by nature, a Cloud setup uses security as a core function, and not as a necessary add-on. Therefore, following best practices, one can say that data can be safer in the Cloud than in traditional, stove-piped environments where data access is segmented and difficult to audit. The caveat is, of course, what "best practices" consist of, and here is where Oracle's security tools are perfectly suited for the task. Since Oracle's model is to support very large organizations, it is fundamentally concerned about distributed applications, databases etc. and their security, and the related Identity Management products and DB Security options reflect that concept. In the end, consumers of applications and their data are served more safely in a controlled Cloud environment, while realizing the many cost savings associated with it. Having very fast resources to serve them (such as the Exa* platform) makes the concept even more attractive.
    Finally, if a Cloud strategy does not seem feasible, consider the pros and cons of a traditional vs. a Cloud architecture. Using the exact same criteria and business goals/traditions, and with Oracle's technology, you might be hard pressed to justify maintaining the technical status quo on security alone. For additional information please visit Oracle's Cloud Security page at: http://www.oracle.com/us/technologies/cloud/cloud-security-428855.html

    Read the article

  • ArchBeat Link-o-Rama for August 1, 2013

    - by OTN ArchBeat
    Performance Tuning – Systems Running BPEL Processes | Ravi Saraswathi and Jaswant Singh - Ravi Saraswathi and Jaswant Singh, the authors of "Oracle SOA BPEL Process Manager 11gR1 - A Hands-on Tutorial", explain performance tuning of SOA composite applications for optimal performance and scalability.
    Steps to configure SAML 2.0 with Weblogic Server | Puneeth - The blogger known only as Puneeth shares an illustrated technical post that will be of interest to those working with Oracle WebLogic and the Security Assertion Markup Language (SAML).
    Video: Planning and Getting Started - Developer PCs | Chris Muir - Tune in to the latest episode of ADF Architecture TV to see Chris Muir explain why you don't have to buy the most expensive PCs in order to run JDeveloper.
    Key User Experience Design Principles for working with Big Data | John Fuller - User Experience Designer John Fuller shares 6 core design principles for working with big data that focus on "helping people bring together a variety of data types in a fast and flexible way."
    Event: OTN Developer Day: ADF Mobile - Burlington, MA - Aug 28 - Through six sessions, including a hands-on workshop, you'll learn a simpler way to leverage your existing skills to develop enterprise mobile applications using Oracle ADF Mobile. Registration is free, but seating is limited.
    Optimizing WebCenter Portal Mobile Delivery | Jeevan Joseph - FMW solution architect Jeevan Joseph "walks you through identifying and analyzing some common WebCenter Portal performance bottlenecks related to page weight and describes a generic approach that can streamline your portal while improving the performance and response times."
    Customizing specific instances of a WebCenter task flow | Jeevan Joseph - Fusion Middleware A-Team solution architect Jeevan Joseph strikes again with this article that explains "how to set up parameters on MDS customization so that it is applied only under certain conditions...making it possible to customize individual instances of task flows."
    Exalogic Virtual Tea Break Snippets – Modifying Memory, CPU and Storage on a vServer | Andrew Hopkinson - FMW solution architect Andrew Hopkinson walks you through "the simple process of resizing the resources associated with an already existing Exalogic vServer."
    Oracle ADF Mobile Virtual Developer Day - Next Week | Shay Shmeltzer - JDeveloper product team lead Shay Shmeltzer shares agenda information for the OTN Virtual Developer Day event covering Mobile Application Development for iOS and Android, coming up one week from today, on August 7, 2013, 9am PT/Noon ET/1pm BRT.
    What's New In Oracle Enterprise Pack for Eclipse 12.1.2.1.0? - New features and updates in the newly-released Oracle Enterprise Pack for Eclipse 12.1.2.1.0, now available for download from OTN.
    IOUG Cloud Builders Unite | Jeff Erickson - Check out this great Oracle Magazine article by Jeff Erickson about IOUG members organizing around their common interest in building private clouds.
    Thought for the Day
    "Stuff that's hidden and murky and ambiguous is scary because you don't know what it does." — Jerry Garcia (August 1, 1942 – August 9, 1995)
    Source: brainyquote.com

    Read the article

  • PASS 13 Dispatches: moving to the cloud

    - by Tony Davis
    PASS Summit 13, Day 1 keynote by Quentin Clarke and we're hearing about “redefiniing mission critical in the cloud”. With a move to the Windows Azure cloud comes the promise of capacity on demand, automatic HA, backups, patching and so on, as well as passing responsibility to MS for managing hardware, upgrades and so on. However, for many databases and applications the best route to the cloud is not necessarily obvious. For most, the path of least resistance is IaaS – SQL Server in a Azure VM. It removes the hardware burden but you still have to manage your databases and implementing HA for SQL Server is your responsibility. Also, scaling up comes at quite a cost – the biggest VM (8 CPU cores, 56 GB RAM, 16 1TB drives with 500 IOPS each) weighs in at over over $4500 per month. With PaaS, in the form of Windows SQL Database, you get a “3-copies replica set” so HA comes out-of the box, and removes the majority of the administration burden, but you are moving your database into a very different environment. For a start, it's a shared environment, with other customers using the same compute nodes in the cluster, and potentially even sharing the same database (multi-tenancy). Unless you pay for SQL DB Premium edition, the resources available for your workload will depends on how nicely others “play” in the shared environment. You'll potentially need to do a lot of tuning, and application rewriting to avoid throttling issues, optimising application-database communication to deal with increased latency between the two, and so on. You'll need aggressive application caching. You'll also need retry logic and to deal with (expected) node failure and the need to reconnect. In Tuesday's PASS Summit pre-con from the SQLCAT team, they spent a lot of time covering some of the telemetric techniques (collect into Azure storage the necessary monitoring data) to perform capacity planning, work out the hotspots and bottlenecks in your cloud applications. Tools like WAD (Windows Azure Diagnostics), performance counters SQL Database DMVs, and others, will be essential. Of course, to truly exploit the vast horizontal scaling that is available from the existence of thousands of compute nodes, you'll also need to need to consider how to “shard” your data so Azure can move it between nodes at will. Finding the right path to the Cloud isn't easy, but it's coming. I spoke to people one year ago who saw no real benefit in trying to move their infrastructure and databases to the cloud, but now at their company, it's the conversation that won't go away. Tony.  

    Read the article

  • ATG Live Webcast: Planning Your Oracle E-Business Suite Upgrade from 11i to 12.1 and Beyond

    - by BillSawyer
    I am pleased to announce the next ATG Live Webcast event on December 1, 2011: Planning Your Oracle E-Business Suite Upgrade from 11i to 12.1 and Beyond.
    Are you still on 11i and wondering about your next steps in the E-Business Suite lifecycle? Are you wondering what the upgrade considerations are going to be for 12.2? Do you want to know the best practices for upgrading E-Business Suite regardless of your version? If so, this is the webcast for you. Join Anne Carlson, Senior Director, Oracle E-Business Suite Product Strategy, for this one-hour webcast with Q&A. This session will give you a framework to make informed upgrade decisions for your E-Business Suite environment. This event is targeted to functional managers, EBS project planners, and implementers.
    The agenda for Planning Your Oracle E-Business Suite Upgrade from 11i to 12.1 and Beyond includes the following topics:
    - Business Value of the Upgrade
    - Starting Your Upgrade Project
    - Planning Your Upgrade Approach
    - Preparing for Your Upgrade Execution
    - Extended Support for 11i
    - Additional Resources
    Date: Thursday, December 1, 2011
    Time: 8:00 AM - 9:00 AM Pacific Standard Time
    Presenter: Anne Carlson, Senior Director, Oracle E-Business Suite Product Strategy
    Webcast Registration Link (Preregistration is optional but encouraged)
    To hear the audio feed:
      Domestic Participant Dial-In Number: 877-697-8128
      International Participant Dial-In Number: 706-634-9568
      Additional International Dial-In Numbers Link
      Dial-In Passcode: 98515
    To see the presentation:
      The Direct Access Web Conference details are:
      Website URL: https://ouweb.webex.com
      Meeting Number: 271378459
    If you miss the webcast, or you have missed any webcast, don't worry -- we'll post links to the recording as soon as it's available from Oracle University. You can monitor this blog for pointers to the replay. And, you can find our archive of our past webcasts and training at http://blogs.oracle.com/stevenChan/entry/e_business_suite_technology_learning
    If you have any questions or comments, feel free to email Bill Sawyer (Senior Manager, Applications Technology Curriculum) at BilldotSawyer-AT-Oracle-DOT-com.

    Read the article

  • Oracle Unveils Oracle Fusion Tap for the iPad

    - by Richard Lefebvre
    Oracle Fusion Tap: Productivity Amplified Anywhere, Anytime
    Oracle today announced the availability of Oracle Fusion Tap, a native iPad application that redefines the level of productivity users can achieve while on the go. Oracle Fusion Tap runs off cloud-based enterprise applications and across Oracle Application Cloud Services, requiring only one simple Apple App Store installation. Automatically personalized to each user, Oracle Fusion Tap gives users exactly what they need at their fingertips and provides the long-sought key functionalities to remain productive and keep business moving, even when away from the desk. Designed specifically for the iPad and the mobile workforce, Oracle Fusion Tap provides access with or without an Internet connection. By grouping functional capabilities into the three core areas of "connect," "analyze," and "work," users can easily and directly connect with what they need in the app, complete activities, and move on. As organizations strive for a lean and agile workforce, Oracle Fusion Tap helps users find and make connections with the right people at the right time, obtaining answers to questions quickly and removing roadblocks faster. Oracle Fusion Tap also provides users with secure access to actionable performance indicators and day-to-day management of their workforce and sales force automation.
    Supporting Quotes
    "Both the enterprise and technology providers must recognize the need to innovate and adapt for the increasing mobility of the workforce—not just for sales teams, but across the organization," said Carter Lusher, Research Fellow and Chief Analyst of Enterprise Applications Ecosystem, Ovum. "A mobile application that quickly and powerfully allows employees to make connections, analyze data, and complete activities at any time and wherever they may be located drives new levels of business value and enhances efficiency. Frankly, mobile access is no longer a 'nice to have' but a 'must have.'"
    "The mobile workforce is a business reality, and Oracle Fusion Tap is an example of how Oracle delivers mobile and cloud innovations that fundamentally improve productivity and how we work," said Chris Leone, Senior Vice President of Application Development, Oracle. "With Oracle Fusion Tap users will have an all-in-one, easily extensible app that puts mission-critical data and colleague connection at their fingertips."
    Supporting Resources
    - Oracle Fusion Tap
    - Oracle Fusion Tap on App Store
    - Oracle Fusion Tap YouTube Video
    - Oracle CRM on Social Media: @OracleCRM | OracleCRM on Facebook | OracleCRM on YouTube

    Read the article

  • SQLAuthority News – Presented Soft Skill Session on Presentation Skills at SQL Bangalore on May 3, 2014

    - by Pinal Dave
    I have presented on various database technologies for almost 10 years now. SQL, database, and NoSQL topics have been part of my life. Earlier this month, I had the opportunity to present on the topic of Performing an Effective Presentation. I must say it was a blast to prepare as well as present this session. This event was part of the SQL Bangalore community. If you are in Bangalore, you must be part of this group. SQL Bangalore is a wonderful community and we always have a great response when we present on technology. It is a SQL user group and we discuss everything SQL there. This month we had a SQL Server 2014 theme and we had a community launch of SQL Server. We had the best of the best speakers presenting on SQL Server 2014 technology. The event had amazing speakers and each of them did justice to the subject. You can read about this over here.
    In this session I told a story from my life. I talked about who inspired me and how I learned to speak in public. I told stories about two legends who have inspired me. There is no video recording of this session. If you want to get resources from this session, please sign up for my newsletter at http://bit.ly/sqllearn.
    Well, I had a great time at this event. We had over 250 people show up and we had a grand time together. I personally enjoyed the sessions by Amit Banerjee, Balmukund Lakhani and Vinod Kumar. Ken and Surabh also entertained the audience. Overall, this was a grand event, and if you were in Bangalore and did not make it, you missed out on a few things. Here are a few photos of this event.
    SQL Bangalore UG
    Nupur, Chandra, Shaivi, Balmukund, Amit, Vinod
    SQL Bangalore UG Audience
    Pinal Dave presenting at SQL UG in Bangalore
    Here are a few of the slides from this presentation:
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL

    Read the article

  • Enhance Your Browsing with Hyperwords in Firefox

    - by Asian Angel
While browsing it is easy to find information that you would like to know more about, convert, or translate. The Hyperwords extension provides access to these types of resources and more in Firefox.

Using Hyperwords

Once the extension has finished installing you will be presented with a demo video that will let you learn more about how Hyperwords works. For our first example we chose to look for more information concerning "WASP-12b" using Wikipedia. Notice the small bluish circle on the lower right of the highlighted term…it is the default access point for the Hyperwords menu (access it by hovering your mouse over it). If you hover over the Wikipedia (or other) link you can view the information in a scrollable popup window, or if you prefer, click on the link to view the information in a new tab. Choose the style that best suits your needs. Hyperwords is extremely useful for quick unit conversions. Suppose you want to share a news story that you have found while browsing: highlight the title, access Hyperwords, and choose your preferred sharing source. You may need to authorize Hyperwords to post to your account; once you have done so, you can start sharing links very easily. This is just a small sampling of Hyperwords' many useful features.

Preferences

Hyperwords has a nice set of preferences available to help you customize it: alter the menu popup style, add or remove menu entries, and modify other functions of Hyperwords.

Conclusion

Hyperwords makes a nice addition to Firefox for anyone needing quick access to search, reference, translation, and other services while browsing.

Links

- Download the Hyperwords extension (Mozilla Add-ons)
- Download the Hyperwords extension (Extension Homepage)

    Read the article

  • links for 2011-03-08

    - by Bob Rhubart
- The Empowered Business – "Someone needs to be the enterprise parent that asks the question, 'do you really need that?' It may be a shiny new thing, but does it make a difference in the ability to accomplish the strategy and goals?" - Enterprise Architect Todd Biske (tags: enterprisearchitecture)
- Knowledge Workers in the British Raj – "While we’ve used technology to change business, business has also evolved to the point that it’s changing how we think about and use technology." - Peter Evans Greenwood (tags: enterprisearchitecture enterprise2.0)
- Arun Gupta, Miles to go ...: OTN Developer Day Boston 2011 - Slides & Trip Report – Arun Gupta shares slides from his Developer Day presentations. (tags: oracle otn java)
- Use WLST to Delete All JMS Messages From a Destination (James Bayer's Blog) – James Bayer responds to a question. (tags: oracle otn weblogic jms)
- Triangle Circle Square: Apex in the Amazon Cloud – Scott Wesley shares several links to resources covering Oracle Apex on an Amazon EC2 instance. (tags: oracle apex ec2 amazon cloud)
- William Vambenepe: Reading IBM's proposed standard for Cloud Architecture – The always entertaining William Vambenepe gives IBM's proposed Cloud standards the full Ebert. (tags: oracle cloud ibm standards)
- Government Information Group Cloud Computing Research Study – "The twin pressures of reduced budgets and the need for greater efficiency have led the federal government to strongly promote cloud computing as a solution whenever possible." (tags: cloudcomputing cloud)
- The Ron Batra Blog: Technology Whispers: Top 10 Reasons to go ExaData – "Continuing my exploration of ExaData, I thought I'd take a minute to consolidate my thoughts into key reasons for which Oracle ExaData could be a good fit for your needs." - Oracle ACE Director Ron Batra (tags: oracle oracleace exadata)
- Oracle WebCenter: Composite Applications & Mash-Ups (Oracle Enterprise 2.0 Blog) – "The new Business Mash-up editor allows business users to take any Oracle Application or 3rd party application and wire the backend data sources or APIs to a rich set of visualizations and reuse them in mashups." (tags: oracle webcenter enterprise2.0)
- Antonio Romero: Great Discussion of ETL and ELT Tooling in TDWI Linkedin Group – Antonio says: "There’s a great discussion of ETL and ELT tooling going on in the official TDWI Linkedin group, under the heading 'How Sustainable is SQL for ETL?' It delves into a wide range of topics." (tags: oracle linkedin etl elt)
- YouTube - Bunny Inc. - Episode 1. Mr. CIO meets Mr. Executive Manager – Yes, it's a commercial. But it's well done and it's funny. (tags: e20 enterprise2.0 webcenter)
- Markus Eisele: Both Weblogic and Glassfish are strategic products for Oracle – Oracle ACE Director Markus Eisele shares selected quotes pulled from the recent TechCast Live interview with Oracle's Anil Gaur and Adam Leftik. (tags: oracle java weblogic glassfish)
- How to become an Oracle SOA expert? (SOA Partner Community Blog) – Jürgen Kress shares info and links for those interested in capitalizing on SOA. (tags: oracle soa)

    Read the article

  • Partner Webcast – More out of ODA with DB Options - 19 July 2012

    - by Thanos
The Simple, Reliable, Affordable Path to High-Availability Databases

Critical business data needs to be available 24/7 for users and customers, but it can be a struggle to find the time and resources to build a highly available database system that's reliable and affordable. That's why Oracle created the new Oracle Database Appliance—a complete package of software, server, storage, and networking. The Oracle Database Appliance integrates the world's most popular database - Oracle Database 11g - with system software, servers, storage and networking in a single box. Business gets the benefit of a reliable, secure and highly available database to support applications and maintain continuity – as well as groundbreaking ease of use. But that is not all: with support for all Oracle Database Options, the Oracle Database Appliance can be the ideal solution for many use cases. The benefits?

- Unmatched performance, reliability and security for your data that's there when you need it – which is all the time.
- Fast installation, simple deployment, easy management. Out of the box.
- Significant cost savings and reduced risk and complexity compared to integrating all the elements yourself.
- Ongoing lower total cost of ownership with multiple automated support, detection and correction functions that also save you time.

Discover the Oracle Database Appliance value proposition and learn how to position it and combine it with database options to capture new business and easily roll out solutions safely and with maximum cost efficiency.

Agenda:
- Oracle Database & Engineered Systems Innovation
- What's the Oracle Database Appliance?
- Oracle Database Appliance Value Proposition
- Oracle Database Appliance with Database Options
- Oracle Database Appliance Partners Business

Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. Duration: 1 hour. Register Now! For any questions please contact us at partner.imc-AT-beehiveonline.oracle-DOT-com. Visit our ISV Migration Center blog regularly, or follow us @oracleimc to learn more about Oracle technologies as well as upcoming partner webcasts and events.

    Read the article

  • Towards Database Continuous Delivery – What Next after Continuous Integration? A Checklist

    - by Ben Rees
Database delivery patterns & practices – Stage 4: Automated Deployment

If you've been fortunate enough to get to the stage where you've implemented some sort of continuous integration process for your database updates, then hopefully you're seeing the benefits of that investment – constant feedback on changes your devs are making, advanced warning of data loss (prior to the production release on Saturday night!), a nice suite of automated tests to check business logic so you know it's going to work when it goes live, and so on. But what next? What can you do to improve your delivery process further, moving towards a full continuous delivery process for your database? In this article I describe some of the issues you might need to tackle on the next stage of this journey, and how to plan to overcome those obstacles before they appear.

Our Database Delivery Learning Program consists of four stages – really three areas: source controlling a database, running continuous integration processes, and setting up automated deployment (the middle area is split into basic and advanced continuous integration, making four stages in total). If you've managed to work through the first three of these stages – source control, then basic and advanced CI – you should have a solid change management process set up where, every time one of your team checks in a change to your database (whether schema or static reference data), that change gets fully tested automatically by your CI server. But this is only part of the story. Great – we know that our updates work, that the upgrade process works, and that the upgrade isn't going to wipe our 4TB of production data with a single DROP TABLE. But how do you get this (fully tested) release live?

Continuous delivery means being always ready to release your software at any point in time. There's a significant gap between your latest version being tested and it being easily releasable. Just a quick note on terminology: there's a nice piece here from Atlassian on the difference between continuous integration, continuous delivery and continuous deployment. This piece also gives a nice description of the benefits of continuous delivery. These benefits have been summed up by Jez Humble at Thoughtworks as: "Continuous delivery is a set of principles and practices to reduce the cost, time, and risk of delivering incremental changes to users."

There's another really useful piece here on Simple-Talk, written by Phil Factor, about the need for continuous delivery and how it applies to the database – specifically the extra needs and complexities of implementing a full CD solution for the database (compared to just implementing CD for, say, a web app). So, hopefully you're convinced of moving on to the next stage! The next step after CI is to get some sort of automated deployment (or "release management") process set up. But what should I do next? What do I need to plan and think about for getting my automated database deployment process set up? Can't I just install one of the many release management tools available and, hey presto, I'm ready? If only it were that simple. Below I list some of the areas where it's worth spending a little time, where a little planning and prep could go a long way.
It's also worth pointing out that this should really be an evolving process. Depending on your starting point, it can be a long journey from your current setup to a full continuous delivery pipeline. If you've got a CI mechanism in place, you're certainly a long way down that path. Nevertheless, we'd recommend evolving your process incrementally. Pages 157 and 129-141 of the book Continuous Delivery (by Jez Humble and Dave Farley) have some great guidance on building up a pipeline incrementally: http://www.amazon.com/Continuous-Delivery-Deployment-Automation-Addison-Wesley/dp/0321601912

For now, in this post, we'll look at the following areas for your checklist: You and Your Team; Environments; The Deployment Process; Rollback and Recovery; Development Practices.

You and Your Team

It's a cliché in the DevOps community that "it's not all about processes and tools, really it's all about culture". As stated in this DevOps report from Puppet Labs: "DevOps processes and tooling contribute to high performance, but these practices alone aren't enough to achieve organizational success. The most common barriers to DevOps adoption are cultural: lack of manager or team buy-in, or the value of DevOps isn't understood outside of a specific group." Like most clichés, there's truth in there – if you want to set up a database continuous delivery process, you need to get your boss, your department, and your company (if relevant) onside. Why? Because it's an investment with the benefits coming way down the line. But the benefits are huge – for HP, in the book A Practical Approach to Large-Scale Agile Development: How HP Transformed LaserJet FutureSmart Firmware, these are summarized as:

- 2008 to present: overall development costs reduced by 40%
- Number of programs under development increased by 140%
- Development costs per program down 78%
- Firmware resources now driving innovation increased by a factor of 8 (from 5% working on new features to 40%)

But what does this mean? It means that, when moving to the next stage and making that extra investment in automating your deployment process, it helps a lot if everyone is convinced that this is a good thing – that they understand the benefits of automated deployment and are willing to make the effort to transform to a new way of working. Incidentally, if you're ever struggling to convince someone of the value, I'd strongly recommend just buying them a copy of this book – a great read, and a very practical guide to how it can really work at a large org.

I've spoken to many customers who have implemented database CI and who describe their deployment process as "the point where automation breaks down – up to that point, the CI process runs, untouched by human hand, but as soon as that's finished we revert to manual". This deployment process can involve, for example, a DBA manually comparing an environment (say, QA) to production, creating the upgrade scripts, reading through them, checking them against an Excel document emailed to him/her the night before, turning to page 29 in his/her notebook to double-check how replication is switched off and on for deployments, and so on and so on. Painful, error-prone and lengthy. But the point is, if this is something like your deployment process, telling your DBA "We're changing everything you do and your toolset next week, to automate most of your role – that's okay, isn't it?" isn't likely to go down well.
There's some work here to bring him/her onside – to explain what you're doing, why there will still be control of the deployment process, and so on. Or, of course, if you're the DBA looking after this process, you have to do a similar job in reverse. You may have researched and worked out how you'd like to change your methodology to start automating your painful release process, but do the dev team know this? What if they have to start producing different artifacts for you? Will they be happy with this? It's worth talking to them to find out. As well as talking to your DBA/dev team, the other group to get involved before implementation is your manager – and possibly your manager's manager too. As mentioned, unless there's buy-in "from the top", you're going to hit problems when the implementation starts to get rocky (and what tool/process implementations don't get rocky?!). You need to have support from someone senior in your organisation – someone you can turn to when you need help with a delayed implementation, lack of resources or lack of progress.

Actions:
- Get your DBA involved (or whoever looks after live deployments) and discuss what you're planning to do – or, if you're the DBA yourself, get the dev team up to speed with your plans.
- Get your boss involved too and make sure he/she is bought in to the investment.

Environments

Where are you going to deploy to? Really this question is: what environments do you want set up for your deployment pipeline? Assume everyone has "Production", but do you have a QA environment? Dedicated development environments for each dev? Proper pre-production? I've seen every setup under the sun, and there is often a big difference between "what we want, to do continuous delivery properly" and "what we're currently stuck with". Some of these differences are:

- What we want: Each developer with their own dedicated database environment. What we've got: A single shared "development" environment, used by everyone at once.
- What we want: An integration box used to test the integration of all check-ins via the CI process, along with a full suite of unit tests running on that machine. What we've got: In fact, if you have a CI process running, you're likely to have some sort of integration server running (even if you don't call it that!). Whether you have a full suite of unit tests running is a different question…
- What we want: A separate QA environment used explicitly for manual testing prior to release. What we've got: "We just test on the dev environments, or maybe pre-production."
- What we want: A proper pre-production (or "staging") box that matches production as closely as possible. What we've got: Hopefully a pre-production box of some sort. But does it match production closely!?
- What we want: A production environment reproducible from source control. What we've got: A production box which has drifted significantly from anything in source control.

The big question is: how much time and effort are you going to invest in fixing these issues? In reality this just involves figuring out which new databases you're going to create and where they'll be hosted – VMs? Cloud-based? What about size/data issues – what data are you going to include on dev environments? Does it need to be masked to protect access to production data? Often the amount of work here really depends on whether you're working on a new, greenfield project, or trying to update an existing, brownfield application.
There's a world of difference between starting from scratch with 4 or 5 clean environments (reproducible from source control, of course!) and trying to re-purpose and tweak a set of existing databases, with all of their surrounding processes and quirks. But for a proper release management process, ideally you have:

- Dedicated development databases.
- An integration server used for testing continuous integration and running unit tests. [NB: This is the point at which deployments are automatic, without human intervention. Each deployment after this point is a one-click (but human) action.]
- QA – QA engineers use a one-click deployment process to automatically* deploy chosen releases to QA for testing.
- Pre-production – the environment you use to test the production release process.
- Production.

* A note on the use of the word "automatic": when carrying out automated deployments, this does not mean that the deployment is happening without human intervention (i.e. that something is just deploying over and over again). It means that the process of carrying out the deployment is automatic, in that it's not a person manually running through a checklist or set of actions. The deployment still requires a single click from a user.

Actions:
- Get your environments set up and ready.
- Set access permissions appropriately.
- Make sure everyone understands what the environments will be used for (it's not a "free-for-all", with all environments to be accessed, played with and changed by development).

The Deployment Process

As described earlier, most existing database deployment processes are pretty manual. The following is a description of a process we hear very often when we ask customers "How do your database changes get live? How does your manual process work?":

1. Check that pre-production matches production (use a schema compare tool, like SQL Compare). This is sometimes done by taking a backup from production and restoring it into pre-production.
2. Use a schema compare tool again to find the differences between the latest version of the database ready to go live (i.e. what the team have been developing) and pre-production. This generates an upgrade script.
3. A user (generally, the DBA) reviews the script. This often involves manually checking updates against a spreadsheet or similar.
4. Run the script on pre-production, and check there are no errors (i.e. it upgrades pre-production to what you hoped).
5. If all is working, run the script on production.*

* This assumes there's no problem with production drifting away from pre-production in the interim (i.e. someone hacking something into the production box without going through the proper change management process). Such a difference could undermine the validity of your pre-production deployment test. Red Gate is currently working on a free tool to detect this problem – sign up at www.sqllighthouse.com if you're interested in testing early versions.

There are several variations on this process – some better, some much worse! How do you automate this? In particular, step 3 – surely you can't automate a DBA checking through a script to make sure everything is in order?! The key point here is to plan what you want in your new deployment process. There are so many options. At one extreme is pure continuous deployment – whenever a dev checks something in to source control, the CI process runs (including extensive and thorough testing!) before the deployment process kicks in and automatically deploys that change to the live box. Not for the faint-hearted – and really not something we recommend. At the other extreme, you might be more comfortable with a semi-automated process: the pre-production/production matching check is automated (with an error thrown if the environments don't match), followed by a manual intervention allowing for script approval by the DBA. Once he/she clicks "Okay, I'm happy for that to go live", the latter stages automatically take the script through to live. And there is everything in between, of course, along with other variations.
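To make that semi-automated middle ground concrete, here is a minimal sketch of the shape such a process can take, with SQLite standing in for the real servers. The file names are invented for the example, the schema fingerprint is a crude stand-in for a proper schema compare tool such as SQL Compare, and the upgrade script is assumed to have been generated and tested earlier in the pipeline:

```python
import hashlib
import sqlite3
import sys

def schema_fingerprint(db_path: str) -> str:
    # Hash the schema definition; a real pipeline would diff the schemas
    # with a schema compare tool rather than comparing fingerprints.
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT type, name, sql FROM sqlite_master "
            "WHERE sql IS NOT NULL ORDER BY type, name").fetchall()
    finally:
        conn.close()
    return hashlib.sha256(repr(rows).encode("utf-8")).hexdigest()

def deploy(script_path: str, preprod_db: str, prod_db: str) -> None:
    # 1. Automated: fail fast if production has drifted from pre-production,
    #    since drift would invalidate the pre-production deployment test.
    if schema_fingerprint(preprod_db) != schema_fingerprint(prod_db):
        raise RuntimeError("Production has drifted from pre-production")

    # 2. Manual gate: show the pre-generated, pre-tested upgrade script
    #    and wait for the DBA's explicit approval.
    with open(script_path, encoding="utf-8") as f:
        upgrade_sql = f.read()
    print("=== Upgrade script for", prod_db, "===")
    print(upgrade_sql)
    if input("Okay, I'm happy for that to go live? [y/N] ").strip().lower() != "y":
        print("Deployment aborted by operator.")
        sys.exit(1)

    # 3. Automated again: apply the approved script to production.
    conn = sqlite3.connect(prod_db)
    try:
        conn.executescript(upgrade_sql)
    finally:
        conn.close()
    print("Deployment complete.")

if __name__ == "__main__":
    deploy("upgrade_v42.sql", "preprod.db", "prod.db")
```

Everything before and after the single approval prompt is automated; the DBA's review is the only manual step left in the pipeline.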
Whichever point on that spectrum suits you, we'd strongly recommend sitting down with a whiteboard and your team, and spending a couple of hours mapping out "What do we do now?", "What do we actually want?", and "What will satisfy our needs for continuous delivery, while still maintaining some sort of continuous control over the process?"

NB: Most of what we're discussing here is about production deployments. It's important to note that you will also need to map out a deployment process for earlier environments (for example QA). However, these are likely to be less onerous, and many customers opt for a much more automated process for those boxes.

Actions:
- Sit down with your team and a whiteboard, and draw out the answers to the questions above for your production deployments: "What do we do now?", "What do we actually want?", "What will satisfy our needs for continuous delivery, while still maintaining some sort of continuous control over the process?"
- Repeat for earlier environments (QA and so on).

Rollback and Recovery

If only every deployment went according to plan! Unfortunately they don't, and when things go wrong you need a rollback or recovery plan for what you're going to do in that situation. Once you move to a more automated database deployment process, you're far more likely to be deploying frequently – no longer once every six months, but perhaps once per week, or even daily. Hence a quick rollback or recovery process becomes paramount, and should be planned for. NB: These are mainly scenarios for handling rollbacks after the transaction has been committed; if a failure is detected during the transaction, the whole transaction can simply be rolled back, no problem. There are various options, which we'll explore in subsequent articles, such as:

- Immediately restore from backup.
- Have a pre-tested rollback script (remembering that really this is a "roll-forward" script – there's not really such a thing as a rollback script for a database!).
- Have fallback environments – for example, using a blue-green deployment pattern.

Different options have pros and cons – some are easier to set up, some require more investment in infrastructure, and of course some work better than others (the key issue with using backups is the loss of the interim transaction data added between the failed deployment and the restore). The best mechanism will depend primarily on how your application works and how much you need a cast-iron failsafe mechanism.

Actions:
- Work out an appropriate rollback strategy based on how your application and business work, your appetite for investment, and your requirements for a completely failsafe process.
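As a rough illustration of the second option above (the pre-tested "roll-forward" script), here is a sketch of a deployment that verifies itself after the upgrade has been applied and, on failure, rolls forward to a known-good state rather than restoring a backup (which would lose interim transactions). The verify callable and the script contents are placeholders, and SQLite again stands in for a real server:

```python
import sqlite3

def apply_script(conn: sqlite3.Connection, sql: str) -> None:
    conn.executescript(sql)  # apply a whole migration script

def deploy(db_path: str, upgrade_sql: str, rollforward_sql: str, verify) -> None:
    # 'verify' is any callable taking the connection and returning True if
    # post-deployment smoke checks pass (a placeholder for real tests).
    conn = sqlite3.connect(db_path)
    try:
        apply_script(conn, upgrade_sql)
        if not verify(conn):
            # The upgrade has already been committed, so we don't pretend we
            # can roll back; we roll *forward* to a known-good state using a
            # script written and tested alongside the upgrade itself.
            apply_script(conn, rollforward_sql)
            raise RuntimeError("Verification failed; roll-forward script applied")
    finally:
        conn.close()

if __name__ == "__main__":
    # Stand-in schema and scripts so the sketch runs end-to-end.
    with sqlite3.connect("prod.db") as setup:
        setup.execute(
            "CREATE TABLE IF NOT EXISTS customer (id INTEGER PRIMARY KEY, name TEXT)")
    deploy(
        "prod.db",
        upgrade_sql="ALTER TABLE customer ADD COLUMN loyalty_tier TEXT;",
        rollforward_sql="/* pre-tested compensating change goes here */",
        verify=lambda conn: True,
    )
```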
Development Practices

This is perhaps the more difficult area for people to tackle. The process by which you can deploy database updates is intrinsically linked with the patterns and practices used to develop that database and the linked application. So you need to decide whether you want to change the way your developers actually develop the database (particularly schema changes) to make the deployment process easier. A good example is the pattern "branch by abstraction". Explained nicely here by Martin Fowler, this is a process that can be used to make significant database changes (e.g. splitting a table) in a step-wise manner so that you can always roll back, without data loss, by making incremental updates to the database backward compatible. Slides 103-108 of the following slide deck, from Niek Bartholomeus, explain the process: https://speakerdeck.com/niekbartho/orchestration-in-meatspace

As these slides show, making a significant schema change in multiple steps – where each step can be rolled back without any loss of new data – affords the release team the opportunity to have zero-downtime deployments with considerably less stress (because if an increment goes wrong, they can roll back easily). There are plenty more great patterns that can be implemented – the book Refactoring Databases, by Scott Ambler and Pramod Sadalage, is a great read if this is a direction you want to go in: http://www.amazon.com/Refactoring-Databases-Evolutionary-paperback-Addison-Wesley/dp/0321774515

But the question is: how much of this investment are you willing to make? How often are you making significant schema changes that would require these best practices? Again, there's a difference here between migrating old projects and starting afresh – with the latter it's much easier to instigate best practice from the start.

Actions:
- For your business, work out how far down the path you want to go, amending your database development patterns towards "best practice". It's a trade-off between implementing quality processes and the necessity to do so (depending on how often you make complex changes).
- Socialise these changes with your development group. No one likes having "best practice" changes imposed on them, so it's good to introduce these ideas and the rationale behind them early.

Summary

The next stages of implementing a continuous delivery pipeline for your database changes (once you have CI up and running) require a little pre-planning if you want to get the most out of the work and for the implementation to go smoothly. We've covered some of the checklist of areas to consider – mainly "getting the team ready for the changes that are coming" and "planning out your pipeline, environments, patterns and practices for development" – though there will be more detail, depending on where you're coming from and where you want to get to. This article is part of our database delivery patterns & practices series on Simple Talk. Find more articles for version control, automated testing, continuous integration & deployment.
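As a postscript to the development practices discussion above, here is a rough sketch of what an incremental, branch-by-abstraction style table split can look like, splitting address data out of a hypothetical customer table. All names are invented for the example and SQLite stands in for a real server; the point is that each step ships in its own release and stays backward compatible, so any increment can be rolled back without data loss:

```python
import sqlite3

# Splitting address data out of "customer" in small, backward-compatible
# steps. Each step is deployed in its own release; old readers and writers
# keep working until the final cleanup, so any increment can be rolled back.
STEPS = [
    # 1. Expand: create the new structure alongside the old one.
    """CREATE TABLE IF NOT EXISTS customer_address (
           customer_id INTEGER PRIMARY KEY,
           street TEXT,
           city TEXT)""",
    # 2. Backfill: copy existing data into the new table.
    """INSERT OR IGNORE INTO customer_address (customer_id, street, city)
       SELECT id, street, city FROM customer""",
    # 3. Dual-write: keep the new table in sync while old code still
    #    updates the old columns.
    """CREATE TRIGGER IF NOT EXISTS customer_address_sync
       AFTER UPDATE OF street, city ON customer
       BEGIN
           INSERT OR REPLACE INTO customer_address (customer_id, street, city)
           VALUES (NEW.id, NEW.street, NEW.city);
       END""",
    # 4. Contract (a much later release, once every reader uses the new
    #    table): drop the old columns. Omitted here because it is the only
    #    step that is not backward compatible.
]

def migrate(db_path: str) -> None:
    conn = sqlite3.connect(db_path)
    try:
        # Stand-in schema so the sketch runs end-to-end on an empty database.
        conn.execute("""CREATE TABLE IF NOT EXISTS customer (
                            id INTEGER PRIMARY KEY,
                            name TEXT,
                            street TEXT,
                            city TEXT)""")
        for step in STEPS:
            with conn:  # one transaction per increment
                conn.execute(step)
    finally:
        conn.close()

if __name__ == "__main__":
    migrate("app.db")
```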

    Read the article

< Previous Page | 226 227 228 229 230 231 232 233 234 235 236 237  | Next Page >