Search Results

Search found 10941 results on 438 pages for 'location'.

Page 173/438 | < Previous Page | 169 170 171 172 173 174 175 176 177 178 179 180  | Next Page >

  • Learn how Oracle storage efficiencies can help your budget

    - by jenny.gelhausen
    Mark Your Calendar! Live Webcast: Next Generation Storage Management Solutions, Wednesday, March 24th, 2010 at 9:00am PT or your local time. Please plan to join us for this webcast, where Forrester senior analyst Andrew Reichman will discuss the pillars of storage efficiency, how to measure and improve it, and how this can help your business immediately alleviate budget pressures. Joining Mr. Reichman are Phil Stephenson, Senior Principal Product Manager at Oracle, and Matthew Baier, Oracle Product Director, who will explain the next generation storage capabilities available in Oracle Database 11g and Oracle Exadata. Register for this March 24th live webcast today!

    Read the article

  • Time-based movement vs. frame-rate-based movement?

    - by sil3nt
    Hello there, I'm new to game programming and SDL, and I have been following Lazy Foo's SDL tutorials. My question is about time-based motion versus frame-rate-based motion: which is better or more appropriate depending on the situation? Could you give me an example where each of these methods is used? Another question: in Lazy Foo's two motion tutorials (FPS-based and time-based), the time-based method showed a much smoother animation, while the frame-rate-based one was a little choppy, meaning you could clearly see the gap between the previous location of the dot and its current position when you compare the two programs. As a beginner, which method should I stick to? (All I want is smooth animations.)
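
    The question is about SDL in C++ (Lazy Foo's tutorials), but the core of time-based movement is the same in any library: scale velocity by the time elapsed since the previous frame. Below is a minimal sketch of that idea using pygame - an illustrative assumption on my part, not the tutorial's code - where the dot covers 200 pixels per second regardless of the frame rate.

    ```python
    # Minimal sketch of time-based movement (assumes pygame is installed).
    import pygame

    pygame.init()
    screen = pygame.display.set_mode((640, 480))
    clock = pygame.time.Clock()

    x = 0.0
    speed = 200.0  # pixels per second, independent of the frame rate

    running = True
    while running:
        dt = clock.tick(60) / 1000.0  # seconds elapsed since the previous frame
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False

        x += speed * dt  # same on-screen speed whether you render 30 or 300 FPS
        screen.fill((0, 0, 0))
        pygame.draw.circle(screen, (255, 255, 255), (int(x) % 640, 240), 10)
        pygame.display.flip()

    pygame.quit()
    ```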

    Read the article

  • Unable to run Tor even in terminal, Vidalia exit code 127

    - by Ubuntu Newb
    I'm using Ubuntu 12.10 (quantal) and I recently installed Tor (32-bit) following the instructions on the Tor project's page. Then I extracted it and started the script from the terminal, and got this: master@ubuntu:~/tor-browser_en-US$ ./start-tor-browser Launching Tor Browser Bundle for Linux in /home/master/tor-browser_en-US ./start-tor-browser: 225: ./start-tor-browser: ./App/vidalia: not found Vidalia exited abnormally. Exit code: 127 Then I ran Vidalia from the console and: master@ubuntu:~/tor-browser_en-US$ vidalia (<unknown>:11354): IBUS-WARNING **: Unable to load /var/lib/dbus/machine-id: Failed to open file '/var/lib/dbus/machine-id': Permission denied master@ubuntu:~/tor-browser_en-US$ vidalia (<unknown>:11358): IBUS-WARNING **: Unable to load /var/lib/dbus/machine-id: Failed to open file '/var/lib/dbus/machine-id': Permission denied And after Vidalia's GUI opens I get the error prompt about starting Tor: "Vidalia was unable to start Tor. Check your settings to ensure the correct name and location of your Tor executable is specified." How can I start Tor?
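
    For what it's worth, exit code 127 from a shell script usually means the command could not be found or executed - here, ./App/vidalia. The sketch below (assuming the bundle was extracted to ~/tor-browser_en-US, as in the transcript above) just checks the usual suspects: a wrong path, a missing execute bit, or a 32-bit binary on a 64-bit system that lacks 32-bit libraries.

    ```python
    # Quick diagnostic sketch for the "Exit code: 127" symptom.
    import os
    import platform

    bundle = os.path.expanduser("~/tor-browser_en-US")  # assumed extraction path
    vidalia = os.path.join(bundle, "App", "vidalia")

    print("exists:", os.path.exists(vidalia))          # False -> wrong path
    print("executable:", os.access(vidalia, os.X_OK))  # False -> missing execute bit
    # x86_64 here plus a 32-bit bundle means 32-bit loader/libraries must be present.
    print("machine:", platform.machine())
    ```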

    Read the article

  • Greetings!!!!

    - by [email protected]
    Greetings everyone! If you're reading this, hopefully it's because you have been following our series of webcasts on Oracle 11gR2 that we've been hosting on WordPress. If you found us some other way, well that's even better - the more the merrier, as they say. In either case, welcome to our new blog!!! Over the next few days, I'll move the old posts from WordPress to here so it's all in the one location. Right! Who are we? The authors of this blog are the ANZ Inside Consulting Team. Currently, this is made up of: Tom Jurcic, Yasin Mohammed, Andrew Clarke, Rene Poels and me - Alex Blyth. Basically, our role in Oracle is to help users of our technologies get the most out of their existing investments as well as what's new, old, blue, what have you... Ideally, this is all going to be technical in nature and not of a marketing nature (we'll leave the marketing up to others). For now, there's obviously not much here. But that won't last too long. In the meantime, those who are interested can find replays and slides of our previous webcasts on the "Oracle 11g Webcasts" page. Till next time, Alex

    Read the article

  • Mark your calendar: Oracle Week, Nov 18-22, Herzliya

    - by Frederic Pariente
    The local ISV Engineering will be participating at the Israel Oracle Week on Nov 18-22, come meet us there! MARK YOUR CALENDAR. Oracle Week Israel. Date: November 18-22, 2012. Time: 09:00-16:30. Location: Daniel Hotel, Herzliya, Israel. Tracks: Database, Middleware, Development Infrastructure, Business Applications, Big Data Management, SOA & BPM, BI, Java, IT, Cloud. Here is a sample list of the Solaris 11 sessions to date, make sure to register for these: 12224 - Optimizing Enterprise Applications with Oracle Solaris 11 (19/11/2012, Infrastructure); 12327 - Oracle Solaris 11: Engineered Cloud Security with Wire-Speed Encryption and Delegated Admin (20/11/2012, Infrastructure, Cloud); 12425 - Simplified Lifecycle Management in Oracle Solaris 11 with AI, IPS and Ops Center (21/11/2012, Infrastructure); 12528 - Oracle Solaris 11 Administration: Zone, Resource Management and System Security (22/11/2012, Infrastructure); 12127 - Built for Cloud: Virtualization Use Cases and Technologies in Oracle Solaris 11 (18/11/2012, Infrastructure, Cloud). See you there!

    Read the article

  • Trying to find video of a talk on the impact of memory access latency

    - by user12889
    Some months ago I stumbled across a video on the internet of somebody giving a very good talk on the impact of memory access latency on the execution of programs. I'm trying to find the video again; maybe you know what video I mean and where I can find it. This is what I remember about the talk/video: I don't remember the title, and it may have been broader, but the talk was largely about the impact of memory access latency in modern processors on program execution. The talk was in English and most likely the location was in America. The speaker was very knowledgeable about the topic, but the talk was in an informal setting (not a conference presentation or university lecture). I think the speaker was known to the audience and may even have been famous (I don't remember). The audience may have been a computer club / group of a local community or company (but I don't remember for sure).

    Read the article

  • UKOUG Application Server & Middleware SIG Meeting

    - by JuergenKress
    Date: Wednesday 10th Oct 2012. Time: 09:00 - 16:00. Location: Reading. Venue: Oracle, Thames Valley Park, Reading. Agenda: 09:00 Registration and Coffee; 10:00 Welcome - Application Server & Middleware Committee; 10:10 Oracle Support Updates - Nick Pounder, Oracle Customer Services; 10:30 OpenWorld 2012 - News Round-up for Middleware Admins - Simon Haslam, Veriton Limited; 11:00 Coffee break; 11:20 Oracle Single Sign-On to Oracle Access Manager Migration - Rob Otto, Oracle Consulting Services UK; 12:05 Supporting Fusion Middleware through First Failure Capture (theory) - Greg Cook, Oracle; 12:50 Lunch and Network; 13:35 Deputy Chair Elections - UKOUG; 13:45 Supporting Fusion Middleware through First Failure Capture (demos) - Greg Cook, Oracle; 14:15 Networking session including tea/coffee; 14:45 Real Life WebLogic Performance Tuning: Tales and Techniques from the Field - Steve Millidge, C2B2 Consulting Limited; 15:30 WLST: WebLogic's Swiss Army Knife - Simon Haslam, Veriton Limited; 15:45 AOB and Close. For details please visit the registration page. WebLogic Partner Community: for regular information, become a member of the WebLogic Partner Community - please visit http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Read the article

  • aspnet_regiis -lk is not listing the site I need

    - by Luke Duddridge
    I am trying to release a site to run under framework 4 on a server that also hosts framework 2 sites. By default the app has defaulted to framework 2, but when I try to change its framework to 4 I get a message saying that the following action will cause IIS to reset. The problem I have is there are several active sites that I do not want to interrupt with a restart. The message goes on to say you can avoid restarting by running the following: aspnet_regiis -norestart -s [IIS Virtual Path] I have been attempting to find the site's virtual path, but when I run aspnet_regiis -lk the site I am after does not appear to be listed. My first thought was that it has something to do with the app pool, but I'm sure I saw sites that are inactive listed, and after creating a basic site to get it to run under framework 2, the site still did not appear in the -lk list. Can anyone tell me if there is an alternative to -lk where I can find the specific information relating to the IIS Virtual Path?

    Read the article

  • Best SEO practices for mobile URLs: 301, rel=canonical, or something else?

    - by Chris
    I am developing a site with a mobile version and am trying to figure out the appropriate way to manage the URLs for search engines. So far I've considered: (1) having a mobile site with rel="canonical" links to the regular site; (2) putting both the mobile site and the full site on one URL and doing user-agent sniffing; and (3) another opinion, from Spencer: "If you have a mobile site at a separate location or URL, you should 301 redirect each and every mobile page to its corresponding page on your main website. Employ user agent detection so that the mobile optimized version is served up if someone's coming in from a hand-held." (http://developer.practicalecommerce.com/articles/1722-Mobile-site-Development-Best-Practices-for-SEO-Usability) Both 2 and 3 make it hard for a user who wants to switch to the full site or mobile site manually, but I'm not sure 1 is the best alternative. What's the best way to write URLs for a mobile site?
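
    For illustration only, here is one way the user-agent detection mentioned in options 2 and 3 could be sketched, using Flask with a placeholder m.example.com domain and a deliberately crude detection rule (none of the sources above mandates this implementation): hand-held visitors get a 301 to the mobile URL, everyone else gets the full page.

    ```python
    # Sketch of user-agent detection with a 301 to a mobile URL (assumes Flask).
    from flask import Flask, redirect, request

    app = Flask(__name__)

    MOBILE_HINTS = ("iphone", "ipod", "android", "blackberry", "windows phone")

    def is_mobile(user_agent: str) -> bool:
        ua = (user_agent or "").lower()
        return any(hint in ua for hint in MOBILE_HINTS)

    @app.route("/<path:page>")
    def full_site(page):
        # Redirect hand-helds to the corresponding page on the mobile host;
        # the mobile pages would point back (e.g. rel="canonical") to these URLs.
        if is_mobile(request.headers.get("User-Agent", "")):
            return redirect(f"https://m.example.com/{page}", code=301)
        return f"Full site content for {page}"
    ```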

    Read the article

  • iPod won't mount on Banshee, causes it to crash

    - by newtonwp
    Since updating to Ubuntu 12.10 I can't put music on my iPod anymore; Banshee does not recognize it and crashes after about 10 seconds. I was going to post the output of 'banshee' in the Terminal, but my whole laptop is messing up now - it is reluctant to open any application right now. Anyway, my iPod is running iOS 4.2 and it has been like that for quite some time. It has never been a problem before. I could really use some advice here. Edit: And when I unplug the iPod and put it back in again, I get three error messages: 1) Unable to open a folder for documents on iPod - Cache invalid, retry (internally handled); 2) Unable to open a folder for iPod - Timeout was reached; 3) Unable to mount iPod - Location is already mounted. Nothing working atm.

    Read the article

  • Prevent Changing the Screen Saver and Wallpaper in Windows 7

    - by Mysticgeek
    Sometimes you might not want users to have the ability to change Screen Savers and Wallpaper on Windows 7 workstations. Today we look at how to prevent them from changing either one or both. You might administer computers in your home or small office and find it annoying when users continuously change the wallpaper and Screen Savers to something obnoxious. A lot of times they might be inexperienced users who download these so-called “wonderful and free” Screen Saver/Wallpaper packages from shady sites that include loads of Spyware. Preventing users from changing them is another helpful tool to avoid wasteful time spent switching things back.

    Prevent Changing Screensavers & Wallpaper Using Group Policy Editor
    Note: This method uses Group Policy, which is not available in Home versions of Windows 7. Open the Start Menu, enter gpedit.msc into the Search box and hit Enter. When Local Group Policy Editor opens, navigate to User Configuration \ Administrative Templates \ Control Panel \ Personalization. Then in the right column double-click on Prevent changing desktop background. Now check the radio button next to Enabled, then click OK. Back on the Group Policy screen, double-click on Prevent changing screen saver. In the next screen select the radio button next to Enabled, click OK, then close out of Group Policy Editor. Now when a user goes into the Personalization section, the Desktop Background hyperlink is grayed out and inactive. Notice the message One or more of the settings on this page has been disabled by the system administrator at the bottom of the section. If they click to change the Screen Saver, an error message will pop up letting them know the function is disabled.

    Prevent Changing Screensavers & Wallpaper Using a Registry Hack
    You can also make a couple of Registry changes to prevent users from changing the Wallpaper & Screen Saver, which will work on Home versions of Windows 7. Before making any Registry changes make sure you back it up first. Open the Registry by typing regedit into the Search box in the Start menu and hitting Enter. First we’ll start with the Wallpaper. Navigate to HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\System and create a new String Value named Wallpaper. Then modify the Value data to point to the location of the Wallpaper you want it to always be - in this example it’s our main wallpaper on our local drive - then click OK. Now let’s make sure they can’t change the Screen Saver. In the same Registry location, we need to make a new DWORD (32-bit) Value. Give it the Value name NoDispScrSavPage and the value data of “1” and click OK. Close out of the Registry and restart the machine, or simply log off then back on again, for the changes to take effect.

    Results
    For the Wallpaper, a user can still go in and see the selections; however, if they try to change it to something else, it will just go back to the Personalization screen and no changes will be made, as we set the value to only be the background we specified. If the user tries to make a change to the Screen Saver, the hyperlink will be grayed out and inactive, and the message One or more of the settings on this page has been disabled by the system administrator will be displayed at the bottom of the section.

    Conclusion
    If you’re tired of users changing the Wallpaper and Screen Saver, and want another way to help avoid Malware, locking down these settings can help a lot. Again, before making any changes to the Registry, make sure to back it up.
    These settings should work in Vista and XP as well.
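
    The same two registry values can also be written with a short script instead of hand-editing them in regedit. This is only a sketch using Python's standard winreg module, run as the user whose settings you want to lock, with a placeholder wallpaper path - and, as the article says, back up the registry before trying it.

    ```python
    # Sketch: pin the wallpaper and hide the Screen Saver settings page via the registry.
    import winreg

    KEY_PATH = r"Software\Microsoft\Windows\CurrentVersion\Policies\System"

    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        # String value pinning the wallpaper to one image (placeholder path).
        winreg.SetValueEx(key, "Wallpaper", 0, winreg.REG_SZ,
                          r"C:\Users\Public\Pictures\background.jpg")
        # DWORD value that disables the Screen Saver settings page.
        winreg.SetValueEx(key, "NoDispScrSavPage", 0, winreg.REG_DWORD, 1)

    print("Values written; log off and back on for the change to take effect.")
    ```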

    Read the article

  • Cannot locate Ubuntu Software Center via Firefox to open APT links

    - by Bobby Phoenix
    I'm trying to locate where the Ubuntu Software Center is, to choose it as the default for handling APT links in Firefox. I can click on the links and I get the pop-up, but Ubuntu Software Center is not there. I tried to choose an application through Firefox's settings and through the pop-up, but I don't know the path. As you can see from my screenshot of the pop-up, I chose the wrong one, as that doesn't work. What is the correct path I need in order to select the correct file? EDIT - I added a fourth screenshot. I don't have it in that location. This is what I have. My view is in ABC order, and it's showing hidden files.

    Read the article

  • Integrating Code Metrics in TFS 2010 Build

    - by Jakob Ehn
    The build process template and custom activity described in this post are available here: http://cid-ee034c9f620cd58d.office.live.com/self.aspx/BlogSamples/CodeMetricsSample.zip

    Running code metrics has been available since VS 2008, but only from inside the IDE. Yesterday Microsoft finally released the Visual Studio Code Metrics Power Tool 10.0, a command line tool that lets you run code metrics on your applications. This means that it is now possible to perform code metrics analysis on the build server as part of your nightly/QA builds (for example). In this post I will show how you can run the metrics command line tool, together with a custom activity that reads the output and appends the results to the build log, and also fails the build if the metric values exceed certain (configurable) threshold values. The code metrics tool analyzes all the methods in the assemblies, measuring cyclomatic complexity, class coupling, depth of inheritance and lines of code. Then it calculates a Maintainability Index from these values, a measure of how maintainable the method is, between 0 (worst) and 100 (best). For information on how this value is calculated, see http://blogs.msdn.com/b/codeanalysis/archive/2007/11/20/maintainability-index-range-and-meaning.aspx. After this it aggregates the information and presents it at the class, namespace and module level as well.

    Running Metrics.exe in a build definition
    Running the actual tool is easy: just use an InvokeProcess activity last in the Compile the Project sequence, reference the metrics.exe file and pass the correct arguments, and you will end up with a result XML file in the drop directory. Here is how it is done in the attached build process template: In the above sequence I first assign the path to the code metrics result file ([BinariesDirectory]\result.xml) to a variable called MetricsResultFile, which is then sent to the InvokeProcess activity in the Arguments property. Here are the arguments for the InvokeProcess activity: Note that we tell metrics.exe to analyze all assemblies located in the Binaries folder. You might want to do some more intelligent filtering here; you probably don't want to analyze all 3rd party assemblies, for example. Note also the path to metrics.exe; this is the default location when you install the Code Metrics power tool, and you must of course install the power tool on all build servers. Using the standard output logging (in the Handle Standard Output/Handle Error Output sections), we get the following output when running the build:

    Integrating Code Metrics into the build
    Having the results available next to the build result is nice, but we want to have the results integrated in the build result itself, and also to affect the outcome of the build. The point of having QA builds that measure, for example, code metrics is to make it very clear how the code being built measures up to the standards of the project/company. Just having an XML file available in the drop location will not cause the developers to improve their code, but a (partially) failing build will! To do this, we need to write a custom activity that parses the metrics result file, logs it to the build log, and fails the build if the values from the metrics are below/above some predefined threshold values. The custom activity performs the following steps:

    1. Parses the XML. I'm using Linq 2 XSD for this; since the XML schema for the result file is available, it is very easy to generate code that lets you query the structure using standard Linq operators.
    2. Runs through the metric result hierarchy and logs the metrics for each level, and also verifies the maintainability index and the cyclomatic complexity against the threshold values. The threshold values are defined in the build process template and are sent in as arguments to the custom activity.
    3. If the threshold values are exceeded, the activity either fails or partially fails the current build.

    For more information about the structure of the code metrics result file, read Cameron Skinner's post about it. It is very simple and easy to understand. I won't go through the code of the custom activity here, since there is nothing special about it and it is available for download so you can look at it and play with it yourself. The threshold values for Maintainability Index and Cyclomatic Complexity are defined in the build process template, and can be modified per build definition: I have taken the default values for these settings from my colleague Terje Sandström's post on Code Metrics - suggestions for appropriate limits. You'll notice that this is quite an improvement compared to using code metrics inside the IDE, where the Red/Yellow/Green limits are fixed (and the default values are somewhat strange; see Terje's post for a discussion of this). This is the first version of the code metrics integration with TFS 2010 Build; I will probably enhance the functionality and the logging (the "tree view" structure in the log becomes quite hard to read) soon. I will also consider adding it to the Community TFS Build Extensions site when it becomes a bit more mature. Another obvious improvement is to extend the data warehouse of TFS and push the metric results back to the warehouse to make them visible in the reports.
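
    As a rough illustration of the threshold logic only - the post's actual implementation is a TFS 2010 workflow custom activity using Linq 2 XSD - here is a sketch that walks a metrics.exe result file and reports modules that break the limits. The element and attribute names (Module, Metrics, Metric, Name, Value) and the limit values are assumptions; check them against your own result.xml and your project's standards.

    ```python
    # Sketch: scan a metrics.exe result file and flag modules outside the limits.
    import sys
    import xml.etree.ElementTree as ET

    MIN_MAINTAINABILITY_INDEX = 60   # placeholder limit - pick your own
    MAX_CYCLOMATIC_COMPLEXITY = 25   # placeholder limit - pick your own

    def check(result_file: str) -> int:
        failures = 0
        for module in ET.parse(result_file).iter("Module"):
            metrics_element = module.find("Metrics")
            if metrics_element is None:
                continue
            values = {m.get("Name"): int(m.get("Value", "0").replace(",", ""))
                      for m in metrics_element}
            name = module.get("Name", "<unknown module>")
            mi = values.get("MaintainabilityIndex", 100)
            cc = values.get("CyclomaticComplexity", 0)
            if mi < MIN_MAINTAINABILITY_INDEX:
                print(f"{name}: MaintainabilityIndex {mi} < {MIN_MAINTAINABILITY_INDEX}")
                failures += 1
            if cc > MAX_CYCLOMATIC_COMPLEXITY:
                print(f"{name}: CyclomaticComplexity {cc} > {MAX_CYCLOMATIC_COMPLEXITY}")
                failures += 1
        return failures

    if __name__ == "__main__":
        sys.exit(1 if check(sys.argv[1]) else 0)
    ```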

    Read the article

  • Plan your SharePoint 2010 Content Type Hub carefully

    - by Wayne
    Currently setting up a new environment on SharePoint 2010 (which was made available for download yesterday, if anyone missed that :-). One of the new features of SharePoint 2010 is the Content Type Hub (which is a part of the Managed Metadata Service Application), a hub for all Content Types that other Site Collections can subscribe to. That is, you only need to manage your content types in one location. Setting up the Content Type Hub is not that difficult, but you must do it carefully to avoid a lot of work and troubleshooting. Here is a short tutorial with a few tips and tricks to make it easy for you to get started.

    Determine location of Content Type Hub
    First of all you need to decide in which Site Collection to place your Content Type Hub: in the root site collection or a specific one. I think using a specific Site Collection that only acts as a Content Type Hub is the best way; there is no best practice as of now. So I create a new Site Collection, at for instance http://server/sites/CTH/. The top-level site of this site collection should be, for instance, a Team Site. You cannot use Blank Site by default, which would have been the best option IMHO, since that site does not have the Taxonomy feature stapled upon it (check the TaxonomyFeatureStapler feature for which site templates can be used).

    Configure Managed Metadata Service Application
    Next you need to create your Managed Metadata Service Application or configure the existing one: Central Administration > Application Management > Manage Service Applications. Select the Managed Metadata service application and click Properties if you have already created it. In the bottom of the dialog window, when you are creating the service application or when you are editing the properties, is a section to fill in the Content Type Hub. In this text box fill in the URL of the Content Type Hub. It is essential that you have decided where your Content Type Hub will reside, since once this is set you cannot change it. The only way to change it is to rebuild the whole Managed Metadata service application! Also make sure that you enter the URL correctly. I did copy and paste the URL once and got the /default.aspx in the URL, which funked the whole service up. Make sure that you only use the URL to the Site Collection of the hub. Now you have to set it up so that other Site Collections can consume the content types from the hub. This is done by selecting the connection for the Managed Metadata service application and clicking Properties. A new dialog window opens, and there you need to click Consumes content types from the Content Type Gallery at nnnn. Now you are free to syndicate your Content Types from the Hub.

    Publish Content Types
    To publish a Content Type from the hub you need to go to Site Settings > Content Types and select the content type that you would like to publish. Then select Manage publishing for this content type. This takes you to a page from which you can Publish, Unpublish or Republish the content type. Once the content type is published it can take up to an hour for the subscribing Site Collections to get it. This is controlled by the Content Type Subscriber job that is scheduled to run once an hour. To speed up your publishing, just go to Central Administration > Monitoring > Review Job Definitions > Content Type Subscriber and click Run Now, and your content type is very soon available for use.

    Published Content Type status
    You can check the status of the content type publishing in your destination site collections by selecting Site Settings > Content Type Publishing. From here you can force a refresh of all subscribed content types, see which ones are subscribed, and finally check the publishing error log. This error log is very useful for detecting errors during publishing. For instance, if you use any features such as ratings, metadata or document IDs in your content type hub and your destination site collection does not have those features available, this will be reported here.

    Read the article

  • Synchronising Cut-and-Paste Activities in Ubuntu One

    - by Jackson Tan
    This was posted in the Ubuntu Forums but received no response, so I'm re-posting it here (with minor updates) in hopes that it will at least get some comments. Recently, I moved a large amount of content (a few GBs) within the Ubuntu One folder (through cut-and-paste). Then I discovered that the way Ubuntu One does this is to remove the files on the server side and upload them all again in the new location. Obviously, this is undesirable because of the hefty uploading involved. Worse, since I have two computers synced to the same account, it is double the amount of traffic. Each computer took about one day to finish synchronising. Firstly, can anyone confirm that this is actually what's happening when we move folders? I'm using Ubuntu 10.04, by the way. Secondly, is there a way to cut-and-paste stuff within the Ubuntu One folder without uploading again?

    Read the article

  • Windows Embedded Compact 7

    - by Valter Minute
    I’m back from Seattle where I attended the MVP Summit and presented the Windows Embedded Compact 7 training materials during the Train The Trainer in Bellevue (many thanks to all the people attending and providing great suggestions to improve the materials!). The MVP summit was a great chance to discover new things about all the different technologies, to see old friends and meet new ones. The TTT location (Microsoft training facilities at Lincoln square in Bellevue) was great and here’s the landscape that the attendees could enjoy (partially ruined by my presence in the foreground!). In the meantime Windows Embedded Compact 7 has been released: http://www.microsoft.com/windowsembedded/en-us/evaluate/windows-embedded-compact-7.aspx You can download an evaluation version and start to discover its new features (SMP support, support for 3GBs of RAM, Silverlight for Windows Embedded tools, ARM v5,v6 and v7 compilers and many more…) and, maybe, decide to attend a training about it.

    Read the article

  • UppercuT - Custom Extensions Now With PowerShell and Ruby

    Arguably, one of the most powerful features of UppercuT (UC) is the ability to extend any step of the build process with a pre, post, or replace hook. This customization is done in a separate location from the build so you can upgrade without wondering if you broke the build. There is a hook before each step of the build has run. There is a hook after. And back to power again, there is a replacement hook. If you don't like what the step is doing and/or you want to replace its entire functionality,...

    Read the article

  • Why does Ubuntu Server ask to insert a CD-ROM when installed from PXE?

    - by MainMa
    I set up a PXE server which hosts both Ubuntu Desktop and Ubuntu Server. Ubuntu Desktop is installed successfully from PXE. Ubuntu Server seems to successfully load vmlinuz and initrd.gz, asks for the language, then the location, then the keyboard layout, and finally complains that it can't mount the CD-ROM: The content of /var/lib/tftpboot/pxelinux.cfg/default is the following: default ubuntu-installer/amd64/boot-screens/vesamenu.c32 menu title Ubuntu setup label ubuntu-14.04-desktop-amd64 menu label ubuntu-14.04-desktop-amd64 kernel ubuntu-14.04-desktop-amd64/vmlinuz.efi append initrd=ubuntu-14.04-desktop-amd64/initrd.lz root=/dev/nfs boot=casper netboot=nfs nfsroot=192.168.1.41:/exports/ubuntu-14.04-desktop-amd64 splash -- label ubuntu-14.04-server-amd64 menu label ubuntu-14.04-server-amd64 kernel ubuntu-14.04-server-amd64/vmlinuz append initrd=ubuntu-14.04-server-amd64/initrd.gz root=/dev/nfs boot=install netboot=nfs nfsroot=192.168.1.41:/exports/ubuntu-14.04-server-amd64 splash -- What explains the fact that it requests the CD-ROM and how to avoid it?

    Read the article

  • Loading dynamic content and rewriting the URL on the hashchange event with jQuery Mobile

    - by user3611500
    I'm building a mobile version of my website using the jQuery Mobile API. The framework provides automatic AJAX navigation processing, but as far as I know it requires "real" pages for loading purposes. What I want to do is override the automatic navigation process and handle the hashchange on my own. But I can't rewrite the URL using window.hashChange, which runs fine on my non-mobile website version: $(function () { $(window).off().hashchange(function () { if (location.hash.length > 1) { PageSelect(); } }); $(window).hashchange(); }); I only want to take advantage of the jQuery Mobile interface; I don't want anything to do with its automatic AJAX navigation stuff! I tried to disable it using ajaxEnabled() but got no luck.

    Read the article

  • Critical Patch Update for April 2010 Now Available

    - by Steven Chan
    The Critical Patch Update (CPU) for April 2010 was released on April 13, 2010. Oracle strongly recommends applying the patches as soon as possible. The Critical Patch Update Advisory is the starting point for relevant information. It includes a list of products affected, pointers to obtain the patches, a summary of the security vulnerabilities, and links to other important documents. Supported Products that are not listed in the "Supported Products and Components Affected" section of the advisory do not require new patches to be applied. Also, it is essential to review the Critical Patch Update supporting documentation referenced in the Advisory before applying patches, as this is where you can find important pertinent information. The Critical Patch Update Advisory is available at the following location: Oracle Technology Network. The next four Critical Patch Update release dates are: July 13, 2010; October 12, 2010; January 18, 2011; and April 19, 2011.

    Read the article

  • SQL SERVER - Find Most Active Database in SQL Server - DMV - dm_io_virtual_file_stats

    A few days ago, I wrote about SQL SERVER - Find Current Location of Data and Log File of All the Database. There was a very interesting conversation in the comments by blog readers. Blog reader and SQL expert Sreedhar presented a very interesting DMV query which lists the most active database in SQL Server. For quick reference he [...]

    Read the article

  • New Year's Resolution: Highest Availability at the Lowest Cost

    - by margaret.hamburger(at)oracle.com
    Don't miss this Webcast: Achieve 24/7 Cloud Availability Without Expensive Redundancy. Event Date: 01/11/2011, 10:00 AM Pacific Standard Time. You'll learn how Oracle's Maximum Availability Architecture and Oracle Database 11g help you: achieve the highest availability at the lowest cost; protect your systems from unplanned downtime; and eliminate idle redundancy. Register Now!

    Read the article

  • Objective-C Cocos2d moving a sprite

    - by marcg11
    I hope someone knows how to do the following with cocos2d: I want a sprite to move, but not in a single line, by using [cocosGuy runAction: [CCMoveTo actionWithDuration:1 position:location]]; What I want is the sprite to make some kind of movement that I pre-establish. For example, at some point I want the sprite to move, for instance, up and then down, but in a curve. Do I have to do this with Flash like this document says? http://www.cocos2d-iphone.org/wiki/doku.php/prog_guide:animation Does animation on this page mean moving sprites, or what? Thanks

    Read the article

  • Oracle... and InfiniBand.

    - by jenny.gelhausen
    Beginning Sunday, 14th March 2010 the OpenFabrics Alliance has been hosting its annual conference in Sonoma, California. On Monday morning, Tim Shetler - VP of Product Management at Oracle - addressed a conference room full to the brim with the industry's InfiniBand luminaries. That same afternoon, Sumanta Chatterjee, Senior Director of Development at Oracle, was publicly lauded by moderator Bill Boas for being a long-standing, pivotal driver and crucial member of the community. A testament to InfiniBand's building momentum, it is no surprise to find it at the core of Oracle's flagship product, the Sun Oracle Database Machine.

    Read the article

  • Oracle saves with Oracle Database 11g and Advanced Compression

    - by jenny.gelhausen
    Oracle Corporation runs a centralized eBusiness Suite system on Oracle Database 11g for all its employees around the globe. This clustered Global Single Instance (GSI) has scaled seamlessly with many acquisitions over the years, doubling the number of employees since 2001 and supporting around 100,000 employees today, 24 hours a day, 7 days a week around the world. In this podcast, you'll hear from Raji Mani, IT Director for Oracle's PDIT Group, on how Oracle Database 11g and Advanced Compression is helping to save big on storage costs.

    Read the article
