Search Results

Search found 19539 results on 782 pages for 'pretty print'.

Page 377 of 782

  • OFM 11g: Implementing OAM SSO with Forms

    - by olaf.heimburger
     There is some confusion about the integration of OFM 11g Forms with Oracle Access Manager 11g (OAM). Some say this does not work, some say it works, but.... Actually, having implemented it many times I belong to the latter group. Here is how. Caveat Before you start installing anything, take a step back and consider your current implementation and what you really need and want to achieve. The current integration of Forms 11g with OAM 11g does not support self-service account creation and password resets from the Forms application. If you really need this, you must use the existing Oracle AS 10.1.4.3 infrastructure. On the other hand, if your user population is pretty stable, you can enjoy the latest Forms 11g with OAM 11g. Assumptions The whole process should be done in one day. I assume that all domains and instances are started during setup; if you need to restart them on demand or on purpose, be sure to have proper start/stop scripts, as I don't mention them. Preparation It goes without saying that you should always do a proper backup before you change anything on your production environment. With proper backup, I also mean a tested and verified restore process. If you have never dared to test it before, do it now. It pays off. Requirements For OAM 11g to work properly you need an LDAP repository. For the integration of Forms 11g you need an Oracle Internet Directory (OID) configured with the Oracle AS SSO LDAP extensions. For better support I usually give the latest version a try; in this case OID 11g is a good choice. During the Installation and Integration steps we use an upgrade wizard that needs the old OID configuration on the same host but in a different ORACLE_HOME. Installation vs Configuration With OFM 11g Oracle introduced a clear separation between Installation of the binaries (the software) and the Configuration of the instances (the runtime). This is really great as you can install all the software and create new instances when needed. In the following we adhere to this scheme and install the software first and then configure the instances later. Installation Steps The Oracle documentation contains all the necessary steps for the installation of all pieces of software. But some hints help to avoid traps and pitfalls. Step 1 The Database Start the installation with the database. It is quite obvious, but we need an Oracle database for all the other steps. If you have one at hand, fine. If not, just install at least an Oracle 10.2.0.4 version. This database can be on a different host. Step 2 The Repository Creation Utility The next step should be to run the Repository Creation Utility (RCU). This is a client application that just needs to connect to your database. It can be run on any host that can reach the database and is a Windows or Linux 32-bit machine. When you run it, be sure to install the OID schema and the OAM schema. If you miss one of these, you can run the RCU again to install the missing schema. Step 3 The Foundation With OFM 11g Oracle started to use WebLogic Server 11g (WLS) as its foundation for all OFM 11g installations. We therefore install it first. Depending on your operating system, it might be that no native installer is available. My approach to this dilemma is to use the WLS Generic Installer for all my installations. It does not include a JDK either, but if you have both for your platform you are ready to go. Step 3a The JDK To make things interesting, Oracle currently has two JDKs in its portfolio: the Sun JDK and the JRockit JDK.
Both are available for a number of platforms. If you are lucky and both are available for your platform, install each in a separate directory (and not in one of your ORACLE_HOMEs). You can use whichever you like later. Step 3b Install WLS for OID and OAM With the JDK installed, we start the generic installer with java -jar wls_generic.jar. STOP! Before you do this, check the version first. It should be 1.6.0_18 or later and not the GCC one (some Linux distros have it installed by default). To verify the version, issue a java -version command and make sure that the output does not contain the text gcj and the version matches. If this does not work, use an absolute path like /opt/java/jdk1.6.0_23/bin/java to start the installer. The installer allows you to specify a path to install the software into, say /opt/oracle/iam/11.1.1.3 for the OID and OAM installation. We will call this IAM_HOME. Step 4 Install OID Now we are ready to install OID. Start the OID installer (in the Disk1 directory) and just select the install-only step. This will install the software only and does not configure the instance. Use the IAM_HOME as the target directory. Step 5 Install SOA Suite The IAM 11g Suite uses the BPEL component of the SOA Suite 11g for its workflows. This is a pretty closed environment and not to be used for SCA Composites. We install the SOA Suite in $IAM_HOME/soa. The installer only installs the binaries. Configuration will be done later. Step 6 Install OAM Once the installation of OID and SOA is done, we are ready to install the OAM software in the same IAM_HOME. Make sure to install the OAM binaries in a directory different from the one you used during the OID and SOA installation. As before, we only install the software; the instance will be created later. Step 7 Backup the Installation At this point, I normally do a backup (or snapshot in a virtual image) of the installation. Good when you need to go back to this point. Step 8 Configure OID The software is installed and now we need instances to run it. This process is called configuration. For OID use the config.sh found in $IAM_HOME/oid/bin to start the configuration wizard. Normally this runs smoothly. If you encounter some issues, check the Oracle Support site for help. This configuration will also start the OID instance. Step 9 Install the Oracle AS SSO Schema Before we install the Forms software we need to install the Oracle AS SSO Schema into the database and OID. This is a rather dangerous procedure, but fully documented in the IAM Installation Guide, Chapter 10. You should finish this in one go; do not reboot your host during the whole procedure. As a precaution, you should make a backup of the OID instance before you start the procedure. Once the backup is ready, read the chapter, including every note, carefully. You can avoid a number of issues by following all the steps and will succeed with a working solution. Step 10 Configure OAM Reached this step? Great. You are ready to create an OAM instance. Use the $IAM_HOME/iam/common/bin/config.sh for this. This will open the WLS Domain Creation Wizard and ask for the libraries to be installed. You should at least select the OAM with Database repository item. The configuration will also start the OAM instance. Step 11 Install WLS for Forms 11g It is quite tempting to install everything in one ORACLE_HOME. Unfortunately this does not work for all OFM packages. Therefore we do another WLS installation in another ORACLE_HOME. The same considerations as in step 3b apply.
We call this one FORMS_HOME. Step 12 Install Forms In the FORMS_HOME we now install the binaries for the Forms 11g software. Again, this is an install-only step. Configuration starts with the next step. Step 13 Configure Forms To configure Forms 11g we start the Configuration Wizard (config.sh) in FORMS_HOME/bin. This wizard should create a new WebLogic Domain and an OHS instance! Do not extend existing domains or instances! Forms should run in its own instances! When all information is supplied, the wizard will create the domain and instance and start them automatically. Step 14 Set up your Forms SSO Environment Once you have implemented and tested your Forms 11g instance, you can configure it for SSO. Yes, this requires the old Oracle AS SSO solution: OIDDAS for creating and assigning users, and SSO to set up your partner applications. In this step you should consider creating every user necessary for use within the environment. When done, do not forget to test it. Step 15 Migrate the SSO Repository Since the final goal is to get rid of the old SSO implementation, we need to migrate the old SSO repository into the new OID structure. Additionally, this step will also migrate all partner application configurations into OAM 11g. Quite convenient. To do this step, you have to start the upgrade agent (ua or ua.bat or ua.cmd) on the operating system level in $IAM_HOME/bin. Once finished, this wizard will create new osso.conf files for each partner application in $IAM_HOME/upgrade/temp/oam/. Note: At the time of this writing, this step only works if everything is on the same host (i.e. OID, OAM, etc.). This restriction might be lifted in later releases. Step 16 Change your OHS osso.conf and shut down OC4J_SECURITY In Step 14 we verified that SSO for our Forms environment works fine. Now we are shutting the old system down and reconfiguring the OHS that acts as the Forms entry point. First we go to the OHS configuration directory and rename the old osso.conf to osso.conf.10g. Then we change the moduleconf/mod_osso.conf to point to the new osso.conf file. Copy the new osso.conf file from $IAM_HOME/upgrade/temp/oam/ to the OHS configuration directory. Restart OHS and test Forms by using the same Forms links. OAM should now kick in and show the login dialog to ask for your user credentials. Done. Now your Forms environment is successfully integrated with OAM 11g. Enjoy. What's Next? This rather lengthy setup is just the foundation for your growing environment of OAM 11g protections. In the next entry we will show that Forms 11g and ADF Faces 11g can use the same OAM installation and provide real single sign-on. References Nearly everything is documented. Use the documentation! Oracle® Fusion Middleware Installation Guide for Oracle Identity Management 11gR1 | Oracle® Fusion Middleware Installation Guide for Oracle Identity Management 11gR1, Chapter 11-14 | Oracle® Fusion Middleware Administrator's Guide for Oracle Access Manager 11gR1, Appendix B | Oracle® Fusion Middleware Upgrade Guide for Oracle Identity Management 11gR1, Chapter 10

    Read the article

  • What's The Difference Between Imperative, Procedural and Structured Programming?

    - by daniels
     By researching around (books, Wikipedia, similar questions on SE, etc.) I came to understand that Imperative programming is one of the major programming paradigms, where you describe a series of commands (or statements) for the computer to execute (so you pretty much order it to take specific actions, hence the name "imperative"). So far so good. Procedural programming, on the other hand, is a specific type (or subset) of Imperative programming, where you use procedures (i.e., functions) to describe the commands the computer should perform. First question: Is there an Imperative programming language which is not procedural? In other words, can you have Imperative programming without procedures? Update: This first question seems to be answered. A language CAN be imperative without being procedural or structured. An example is pure Assembly language. Then you also have Structured programming, which seems to be another type (or subset) of Imperative programming, which emerged to remove the reliance on the GOTO statement. Second question: What is the difference between procedural and structured programming? Can you have one without the other, and vice versa? Can we say procedural programming is a subset of structured programming, as in the image?

    Read the article

  • Multiple stores for the same niche

    - by pandronic
     I started developing a new niche of products in my country about 3 years ago. That's when I opened my first store. Everything went fine, until a year ago, when someone I thought was a friend secretly stole my idea and made his own competing store. I was pretty upset when I caught him and decided to make it as difficult as possible for him, so I made another 4 stores, trying to get him as low as possible in the search results. The new sites have similar products (although not 100% identical), slightly different titles, images and prices. They look different and are built on different e-commerce platforms. They are all hosted on the same server, have roughly the same backlinks, use the same Google account for Analytics, have the same support phone numbers, etc., etc. I wasn't thinking that I was doing something fishy, so I didn't try to hide anything. Trouble is that those sites, after doing fine for a few months, dropped like bricks in search results, almost to the point that they can't be found at all. At the moment, the only site that ranks relatively well is the original one and a couple of secondary pages with no importance from one of the other sites. How did this happen? Does Google have something against this practice? Did they take action by themselves when they realized that I was trying to monopolize this niche, or did my competitor report me for some kind of webspam? And more importantly, what do I do now? Do I shut down all but my original site and 301 redirect users to it from the others? Can I report my competitor for engaging in the same practice? (He fought back and now he has 3-4 sites, some of which still rank kind of OKish; also, he has no idea about web development, SEO or marketing, he just crudely copies what I do and is slowly but surely starting to do better than me.)

    Read the article

  • Big Data for Retail

    - by David Dorf
    Right up there with mobile, social, and cloud is the term "big data," which seems to be popping up lots in the press these days.  Companies like Google, Yahoo, and Facebook have popularized a new class of data technologies meant to solve the problem of processing large amounts of data quickly.  I first mentioned this in a posting back in March 2009.  Put simply, big data implies datasets so large they can't normally be processed using a standard transactional database.  The term "noSQL" is often used in this context as well. Actually, using parallel processing within the Oracle database combined with Exadata can achieve impressive results.  Look for more from Oracle at OpenWorld as hinted by Jean-Pierre Dijcks. McKinsey recently released a report on big data in which retail was specifically mentioned as an industry that can benefit from the new technologies.  I won't rehash that report because my friend Rama already did such a good job in his posting, Impact of "Big Data" on Retail. The presentation below does a pretty good job of framing the problem, although it doesn't really get into the available technologies (e.g. Exadata, Hadoop, Cassandra, etc.) and isn't retail specific. Determine the Right Analytic Database: A Survey of New Data Technologies So when a retailer asks me about big data, here's what I say:  Big data refers to a set of technologies for processing large volumes of structured and unstructured data.  Imagine collecting everything uttered by your customers on Facebook and Twitter and combining it with all the data you can find about the products you sell (e.g. reviews, images, demonstration videos), including competitive data.  Assuming you could process all that data, you could then personalize offers to specific customers based on their tastes, ensure prices are competitive, and implement better local assortments.  It's really not that far off.

    Read the article

  • CIC 2010 - Ghost Stories and Model Based Design

    - by warren.baird
     I was lucky enough to attend the Collaboration and Interoperability Congress recently. The location was very beautiful and interesting; it was held in the mountains about two hours outside Denver, at the Stanley Hotel, famous both for inspiring Stephen King's novel "The Shining" and for attracting a lot of attention from the "Ghost Hunters" TV show. My visit was prosaic - I didn't get to experience the ghosts the locals promised - but interesting, with some very informative sessions. I noticed one main theme - a lot of people were talking about Model Based Design (MBD), which is moving design and manufacturing away from 2D drawings and towards 3D models. 2D has some pretty deep roots in industrial manufacturing and there have been a lot of challenges encountered in making the leap to 3D. One of the challenges discussed in several sessions was how to get model information out to the non-engineers in the company, which is a topic near and dear to my heart. In the 2D space, people without access to CAD software (for example, people assembling a product on the shop floor) can be given printouts of the design - it's not particularly efficient, and it definitely isn't very green, but it tends to work. There's no direct equivalent in the 3D space. One of the ways that AutoVue is used in industrial manufacturing is to provide non-CAD users with an easy-to-use, interactive 3D view of their products - in some cases it's directly used by people on the shop floor, but in cases where paper is really ingrained in the process, AutoVue can be used by a technical publications person to create illustrative 2D views that can be printed that show all of the details necessary to complete the work. Are you making the move to model based design? Is AutoVue helping you with your challenges? Let us know in the comments below.

    Read the article

  • Java crashes on lubuntu but not Ubuntu

    - by Echogene
     I have lubuntu and Ubuntu partitions on my drive. I've been having an interesting time with the new lubuntu partition. I've encountered strange things with the game Minecraft, Java and graphics drivers on the lubuntu partition. Firstly, I'll say that Minecraft runs fine at about 60fps on the Ubuntu partition with the latest drivers. (This is lower than it should be as it's a pretty decent graphics card [Radeon HD 5700].) When I first started lubuntu, I tried to see if I could get Minecraft running on Java. Java crashed when loading the main game graphics on both Sun and OpenJDK without proprietary drivers. Java also crashed on both Javas with proprietary drivers after the necessary restart. However, after disabling the proprietary drivers (with the 'remove' button) in jockey-gtk in the session after the restart to install the drivers, Minecraft ran very well at ~120fps. This didn't continue after another restart, when it ran at 9fps. After failing thereafter on lubuntu to get it working at 15fps, I tried reinstalling lubuntu and installed the exact same driver (the latest one, not the one appearing in jockey) and Java versions as on Ubuntu. That is, now Ubuntu and lubuntu have the same graphics driver and Java version. Minecraft still crashes in the same way on lubuntu but works fine on Ubuntu. I would appreciate any explanation for any of these events. What differences between lubuntu and Ubuntu could cause this? Edit: After installing the 32-bit driver version on lubuntu (seeing as lubuntu is 32-bit), I have Java "working" for Minecraft. However, it is at <15fps again and it can't log in to servers as it takes too long.

    Read the article

  • Ubuntu backlight problem with Nvidia graphics

    - by Vladimir
     I have a laptop mySN QMG6 / Chiligreen Mobilitas NW, which is a Quanta TW9 barebone with an Intel i3 and an Nvidia GT 335M onboard. On Ubuntu distros 10.04, 10.10, 11.04 and 11.10 I had problems changing the screen backlight with both the nouveau and nvidia drivers. The FN+F4/F5 buttons did not change my brightness. I tried to edit xorg.conf, adding Option "RegistryDwords" "EnableBrightnessControl=1". Also tried to add some lines to grub: acpi_osi="Linux" acpi_backlight=vendor. Neither worked for me. Today I installed Ubuntu 12.04 beta2 and... with the nouveau driver my FN keys work and change the brightness (is it the new 3.0.22 Linux kernel or a patched nouveau driver? I don't know). This is a big step forward. But when installing the proprietary nvidia driver (295.33) the FN buttons stop working and I can't change brightness. I also tried the xorg and grub workarounds with no result. Tried to install acpi from apt - no result. Is there anything left to try? I really need that nvidia driver working with the FN keys, as I would like to have working 3D acceleration. P.S. Does the nouveau driver have 3D acceleration like the nvidia drivers??? If there is a need to provide some log data, please write what I should print, as I'm a bit new to Ubuntu. P.P.S. I had the same problems with other Linux distros (Mint, Fedora and others). P.P.P.S. Other FN buttons work with both drivers (Mute, VOL UP/DOWN, WiFi on/off, Bluetooth, Sleep, Start/Pause, Stop, Next/Prev song).

    Read the article

  • Java Deployment Team at JavaOne 2012

    - by _chrisb
     This year the Java Deployment team has some pretty exciting sessions at JavaOne. We will be talking about a lot of new features including Java on the Mac, JavaFX deployment, and bundled applications. All presentations and the booth are located at the Hilton San Francisco Union Square, 333 O'Farrell Street. Booth The Java Deployment booth is located in the Hilton San Francisco Grand Ballroom. We will be available to discuss Java Deployment and answer your questions on the following days and times: Monday, October 1st 10:30 AM - 5:00 PM Tuesday, October 2nd 10:00 AM - 5:00 PM Wednesday, October 3rd 9:30 AM - 5:00 PM Sessions Java Deployment on Mac OS X - CON7488 This is a great opportunity to learn about what's new in Java for Mac. Oracle now distributes Java for Mac so there are some exciting new changes. Scott Kovatch and Chris Bensen Located in the Hilton San Francisco Imperial Ballroom B Monday, October 1, 1:00 PM - 2:00 PM Deploy Your Application with OpenJDK 7 on Mac OS X - CON8224 Learn about packaging and distributing Java applications to the Mac AppStore with step by step examples and tips. Scott Kovatch Located in the Hilton San Francisco Imperial Ballroom B Monday, October 1, 3:00 PM - 4:00 PM The Java User Experience Team Presents the Latest UI Updates - BOF3615 Discover the eye candy that the user interface experts have been working on. Jeff Hoffman and Terri Yamamoto Located in the Hilton San Francisco Imperial Ballroom B Monday, October 1, 5:30 PM - 6:15 PM Mastering Java Deployment Skills - CON7797 Find out what Java Deployment has been cooking. This is the best place to learn about self-contained application packaging. Igor Nekrestyanov and Mark Howe Located in the Hilton San Francisco Imperial Ballroom B Thursday, October 4th, 12:30 PM to 1:30 PM For those who will not be able to attend, we will share all slides after JavaOne.

    Read the article

  • SQL SERVER – 2000 – DBCC SQLPERF(waitstats) – Wait Type – Day 24 of 28

    - by pinaldave
     I have received many comments, emails, suggestions and motivation for my current series of wait types and wait statistics. One of the questions which I keep on receiving almost every other day is whether all of the discussions I have presented so far are also applicable to SQL Server 2000. Additionally, I receive another question asking me if wait statistics matter in SQL Server 2000. If they do, then the asker wants to know how to measure wait types for SQL Server 2000. In SQL Server, you can run the following command to get a list of all the wait types: DBCC SQLPERF(waitstats) The query above will also work in SQL Server 2005/2008/R2 because of backward compatibility. As you might have noticed, I have been discussing everything keeping SQL Server 2005+ in mind, but I have given little consideration to SQL Server 2000. However, I am pretty sure that most of the suggestions I have provided are applicable to SQL Server 2000. The wait types I have been discussing mostly exist in SQL Server 2000 as well. The difference is that the 2000 version gets the recent improvements late, but it is still worth it. Wait types are essential for measuring performance bottlenecks. Because of this, it goes without saying that I am a big fan of them for identifying performance bottlenecks. Please read all the posts in the Wait Types and Queue series. Note: The information presented here is from my experience and there is no way that I claim it to be accurate. I suggest reading Books Online for further clarification. All the discussion of Wait Stats in this blog is generic and varies from system to system. It is recommended that you test this on a development server before implementing it on a production server. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQL Wait Stats, SQL Wait Types, T SQL, Technology
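     To make the measure-and-reset cycle concrete, here is a minimal T-SQL sketch; DBCC SQLPERF(waitstats) is exactly the command discussed above, while the CLEAR variants shown are from memory, so verify them against Books Online before relying on them:

        -- show the waits accumulated since the counters were last cleared (SQL Server 2000 and later)
        DBCC SQLPERF(waitstats);

        -- reset the counters before starting a new measurement window (SQL Server 2000 syntax)
        DBCC SQLPERF(waitstats, CLEAR);

        -- on SQL Server 2005 and later the reset is done against the DMV name instead
        DBCC SQLPERF('sys.dm_os_wait_stats', CLEAR);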

    Read the article

  • Building Web Applications with ACT and jQuery

    - by dwahlin
     My second talk at TechEd is focused on integrating ASP.NET AJAX and jQuery features into websites (if you're interested in Silverlight you can download code/slides for that talk here). The content starts out by discussing ScriptManager features available in ASP.NET 3.5 and ASP.NET 4 and provides details on why you should consider using a Content Delivery Network (CDN).  If you're running an external-facing site then checking out the CDN features offered by Microsoft or Google is definitely recommended. The talk also goes into the process of contributing to the Ajax Control Toolkit as well as the new Ajax Minifier tool that's available to crunch JavaScript and CSS files. The extra fun starts in the next part of the talk which details some of the work Microsoft is doing with the jQuery team to donate template, globalization and data linking code to the project. I go into jQuery templates, data linking and a new globalization option that are all being worked on. I want to thank Stephen Walther, Dave Reed and James Senior for their thoughts and contributions since some of the topics covered are pretty bleeding edge right now. The slides and sample code for the talk can be downloaded below.     Download Slides and Samples

    Read the article

  • Oye! Help Build OTN America Latina!

    - by rickramsey
    Yes, tango is passion, but it is passion born of romance. Not passion born of lust. As it is so often portrayed today. Understand that, and you will begin to understand why life in Latin America is so rich. image courtesy of Continental Magazine. You don't often get a chance to shape the direction of a technical comunidad. Somebody else gets there first and pretty soon everyone is in a rathole about the relevance of rutabagas. Or rutabagels as my public-school-educated hijas prefer to call them. Well, OTN American Latina is just starting up. If you're a techie who speaks Spanish or Portuguese, or if you just like hanging out with techies latinoamericanos (and who doesn't?), here's how to get in on the fun: Why Portuguese Speaking Techies Should Join Why Spanish Speaking Techies Should Join And here are the sites themselves: OTN America Latina in Brazilian Portuguese OTN America Latina in Spanish If you're not sure which site to visit, just remember that Brazilian Portuguese is Spanish spoken with a little body English. Ricardo System Admin and Developer Community of the Oracle Technology Network

    Read the article

  • SharePoint Apps and Windows Azure

    - by ScottGu
     Last Monday I had an opportunity to present as part of the keynote of this year’s SharePoint Conference.  My segment of the keynote covered the new SharePoint Cloud App Model we are introducing as part of the upcoming SharePoint 2013 and Office 365 releases.  This new app model for SharePoint is additive to the full trust solutions developers write today, and is built around three core tenets: Simplifying the development model and making it consistent between the on-premises version of SharePoint and SharePoint Online provided with Office 365. Making the execution model loosely coupled – and enabling developers to build apps and write code that can run outside of the core SharePoint service. This makes it easy to deploy SharePoint apps using Windows Azure, and avoid having to worry about breaking SharePoint and the apps within it when something is upgraded.  This new loosely coupled model also enables developers to write SharePoint applications that can leverage the full capabilities of the .NET Framework – including ASP.NET Web Forms 4.5, ASP.NET MVC 4, ASP.NET Web API, EF 5, Async, and more. Implementing this loosely coupled model using standard web protocols – like OAuth, JSON, and REST APIs – that enable developers to re-use skills and tools, and easily integrate SharePoint with Web and Mobile application architectures. A video of my talk + demos is now available to watch online: In the talk I walked through building an app from scratch – it showed off how easy it is to build solutions using the new SharePoint application model, and highlighted a web + workflow + mobile scenario that integrates SharePoint with code hosted on Windows Azure (all built using Visual Studio 2012 and ASP.NET 4.5 – including MVC and Web API). The new SharePoint Cloud App Model is something that I think is pretty exciting, and it is going to make it a lot easier to build SharePoint apps using the full power of both Windows Azure and the .NET Framework.  Using Windows Azure to easily extend SaaS-based solutions like Office 365 is also a really natural fit and one that is going to offer a bunch of great developer opportunities.  Hope this helps, Scott  P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Silverlight Cream for November 24, 2011 -- #1173

    - by Dave Campbell
     In this Thanksgiving Day Issue: Andrea Boschin, Samidip Basu, Ollie Riches, WindowsPhoneGeek, Sumit Dutta, Dhananjay Kumar, Daniel Egan, Doug Mair, Chris Woodruff, and Debal Saha. Happy Thanksgiving Everybody!
     Above the Fold:
       - Silverlight: "Silverlight CommandBinding with Simple MVVM Toolkit" (Debal Saha)
       - WP7: "How many pins can Bing Maps handle in a WP7 app - part 3" (Ollie Riches)
     Shoutouts: Michael Palermo's latest Desert Mountain Developers is up. Michael Washington's latest Visual Studio #LightSwitch Daily is up.
     From SilverlightCream.com:
       - Windows Phone 7.5 - Play with music: Andrea Boschin's latest WP7 post is up on SilverlightShow... he's talking about the improvements in the music hub and also the programmability of music.
       - OData caching in Windows Phone: Samidip Basu has an OData post up on SilverlightShow also, and he's talking about data caching strategies on WP7.
       - How many pins can Bing Maps handle in a WP7 app - part 3: Ollie Riches has part 3 of his series on Bing Maps and pins... specifically how to deal with a large number of them... after going through discussing pins, he is suggesting using a heat map, which looks pretty darn good and renders fast... except when on a device :(
       - Improvements in the LongListSelector Selection with Nov '11 release of WP Toolkit: WindowsPhoneGeek's latest is this tutorial on the LongListSelector in the WP Toolkit... check out the previous info in his free eBook to get ready, then dig into this tutorial for improvements in the control.
       - Part 25 - Windows Phone 7 - Device Status: Sumit Dutta's latest post is number 25 in his WP7 series, and this time out he's digging into device status in the Microsoft.Phone.Info namespace.
       - Video on How to work with Picture in Windows Phone 7: Dhananjay Kumar's latest video tutorial on WP7 is up, and he's talking about working with Photos.
       - Live Tiles - Windows Phone Workshop: Daniel Egan has the video up of a Windows Phone Workshop done earlier this week on Live Tiles.
       - 31 Days of Mango | Day #15: The Progress Bar: Doug Mair shares the show with Jeff Blankenburg in Jeff's Day 15 of his 31 Day quest of Mango, talking about the ProgressBar: Indeterminate and Determinate modes abound.
       - 31 Days of Mango | Day #14: Using OData: Chris Woodruff has a guest spot on Jeff Blankenburg's 31 Days series with this post on OData... a long, detailed tutorial with all the code.
       - Silverlight CommandBinding with Simple MVVM Toolkit: Debal Saha has a nice detailed tutorial up on CommandBinding... he's using the SimpleMVVM Toolkit and shows downloading and installing it.
     Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream. Join me @ SilverlightCream | Phoenix Silverlight User Group.
     Technorati Tags: Silverlight, Silverlight 3, Silverlight 4, Windows Phone, MIX10

    Read the article

  • Object oriented design importance

    - by user5507
     I started studying Object Oriented Design and Modelling using this book by James Rumbaugh. It uses a technique called Object Modeling Technique (OMT). I have certain newbie questions. I searched the net, but couldn't get answers. The book is pretty old. Don't know why the school told me to learn this. I know OMT is a predecessor of the Unified Modeling Language (UML). So is it a waste? Do the concepts change very much when we move from OMT to UML? I know OMT has Object, Dynamic and Functional Models. Wikipedia says UML is compatible with OMT and UML is a model too. As per Wikipedia the UML models are Static and Dynamic and they are represented by different diagrams like class, object, activity, sequence..... I couldn't find the equivalent of this in OMT. I read that there are many object oriented development methods like OMT, Booch,.... Which one is used by industry? Where could I get a comparison of different object oriented development methods?

    Read the article

  • Join Me at JavaOne!

    - by HecklerMark
     JavaOne 2012 is less than a week away! If you've already made plans to be there, you're probably getting pretty excited about it already... and if not, what are you waiting for?!? Before I get to the session information, I want to point out that qualified students get free admission to JavaOne, so if you are (or know) a CS or IT (or other tech-leaning) student who might like to attend, follow the link and start making plans. There is so much there to learn and experience. I'm happy to say I'll be a small part of the festivities. I'll be leading the following session: CON3519 - Building Hybrid Cloud Apps: Local Databases + The Cloud = Extreme Versatility In this session, learn how to design and develop applications that leverage both local storage and the cloud, maximizing the strengths of each. Using NetBeans, JavaServer Faces 2.0, GlassFish Server technology, JavaFX 2, Oracle Database, and Evernote, rapidly create prototypical applications that can be deployed in various environments and scaled up/out with enterprise cloud solutions.  As a contributor to the JFXtras project, I also hope to attend the following "Birds Of a Feather" (BOF) session led by Gerrit Grunwald and Stephen Chin: BOF5503 - JFXtras Super Happy Dev BOF JFXtras, the open source JavaFX control and extensions project, is back for JavaFX 2.0. In this session, you will learn about the latest changes in JFXtras 2.0, including new components, controls, and features that integrate with the JavaFX 2.0 libraries. Expect to meet the JFXtras core team members as well as other interesting client RIA implementers and developers. Now that JavaFX is coded in Java, a few server-side hackers may even be let in the door. If you're there, please stop by and introduce yourself! And to follow along with my J1 travels or keep in contact afterward, please follow me on Twitter or connect via G+ or Facebook (links in panel to right). Hope to see you there, but either way, keep the Java flowing! All the best, Mark

    Read the article

  • The internal storage of a DATETIME2 value

    - by Peter Larsson
     Today I went investigating the internal storage of the DATETIME2 datatype. What I found out was that for a datetime2 value with precision 0 (seconds only), SQL Server needs 6 bytes to represent the value, but stores 7 bytes. This is because SQL Server adds one byte that holds the precision for the datetime2 value. Start with this very simple repro:

        declare @now datetime2(7) = '2010-12-15 21:04:03.6934231'

        select  cast(cast(@now as datetime2(0)) as binary(7)),
                cast(cast(@now as datetime2(1)) as binary(7)),
                cast(cast(@now as datetime2(2)) as binary(7)),
                cast(cast(@now as datetime2(3)) as binary(8)),
                cast(cast(@now as datetime2(4)) as binary(8)),
                cast(cast(@now as datetime2(5)) as binary(9)),
                cast(cast(@now as datetime2(6)) as binary(9)),
                cast(cast(@now as datetime2(7)) as binary(9))

     Now we are going to copy and paste these binary values and investigate which value represents what time part.

        Prefix  Ticks (hex)  Ticks (decimal)  Days (hex)  Days (decimal)  Original value
        ------  -----------  ---------------  ----------  --------------  --------------------
        0x 00   442801                 75844  A8330B              734120  0x00442801A8330B
        0x 01   A5920B                758437  A8330B              734120  0x01A5920BA8330B
        0x 02   71BA73               7584369  A8330B              734120  0x0271BA73A8330B
        0x 03   6D488504            75843693  A8330B              734120  0x036D488504A8330B
        0x 04   46D4342D           758436934  A8330B              734120  0x0446D4342DA8330B
        0x 05   BE4A10C401        7584369342  A8330B              734120  0x05BE4A10C401A8330B
        0x 06   6FEBA2A811       75843693423  A8330B              734120  0x066FEBA2A811A8330B
        0x 07   57325D96B0      758436934231  A8330B              734120  0x0757325D96B0A8330B

     Let us use the following colour schema: Red - Prefix, Green - Time part, Blue - Day part. What you can see is that the date part is equal in all cases, which makes sense since the precision doesn't affect the date part. What would have been fun is datetime2(negative), just like ROUND accepts a negative value. -1 would mean rounding to 10 seconds, -2 rounding to a minute, -3 rounding to 10 minutes, -4 rounding to an hour and finally -5 rounding to 10 hours. -5 is pretty useless, but if you extend this thinking to -6, -7 and so on, you could actually get a datetime2 value which is accurate to the month only. Well, enough ranting about this. Let's get back to the table above. If you add 75844 seconds to midnight, you get 21:04:04, which is exactly what you got in the select statement above. And if you look at it, it makes perfect sense that each following value is 10 times greater when the precision is increased one step too. //Peter
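     As a quick check of the byte order, here is a small sketch (assuming SQL Server 2008 or later, where datetime2 exists) that takes the precision-0 bytes from the table above, reverses the little-endian time and day parts by hand, and rebuilds the original value; the variable and column names are purely illustrative:

        declare @bin binary(7) = 0x00442801A8330B;  -- the datetime2(0) bytes from the repro above

        declare @precision int = cast(substring(@bin, 1, 1) as int);             -- first byte: the precision (0)
        declare @seconds int   = cast(substring(@bin, 2, 1) as int)
                               + cast(substring(@bin, 3, 1) as int) * 256
                               + cast(substring(@bin, 4, 1) as int) * 65536;     -- little-endian time part
        declare @days int      = cast(substring(@bin, 5, 1) as int)
                               + cast(substring(@bin, 6, 1) as int) * 256
                               + cast(substring(@bin, 7, 1) as int) * 65536;     -- little-endian day part

        select  @precision as precision_byte,
                @seconds   as seconds_since_midnight,                            -- 75844 = 21:04:04
                @days      as days_since_0001_01_01,                             -- 734120 = 2010-12-15
                dateadd(second, @seconds,
                    dateadd(day, @days, cast('0001-01-01' as datetime2(0)))) as rebuilt_value;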

    Read the article

  • SQL SERVER – Winners – Contest Win Joes 2 Pros Combo (USD 198)

    - by pinaldave
     Earlier this week we ran a contest on the blog where we gave away USD 198 worth of Joes 2 Pros books. We had over 500 responses during the five days of the contest. After removing duplicate and incorrect responses we had a total of 416 valid responses across the 5 days combined. We got the most correct answers on day 2 and the fewest correct answers on day 5. Well, enough of the statistics. Let us go over the winners' names. The winners have been selected randomly by one of the book editors of Joes 2 Pros. SQL Server Joes 2 Pros Learning Kit (5 Books):
       Day 1 Winner - USA: Philip Dacosta; India: Sandeep Mittal
       Day 2 Winner - USA: Michael Evans; India: Satyanarayana Raju Pakalapati
       Day 3 Winner - USA: Ratna Pulapaka; India: Sandip Pani
       Day 4 Winner - USA: Ramlal Raghavan; India: Dattatrey Sindol
       Day 5 Winner - USA: David Hall; India: Mohit Garg
     I congratulate all the winners on their participation. All of you will receive emails from us. You will have to reply to the email with your physical address. Once you receive an email please reply within 3 days so we can ship the 5-book kits to you immediately. Bonus Winners Additionally, I had announced that every day I will select a winner from the readers who have left comments with their favorite blog post. Here are the winners with their favorite blog post:
       Day 1: Prasanna kumar.D [Favorite Post]
       Day 2: Ganesh narim [Favorite Post]
       Day 3: Sreelekha [Favorite Post]
       Day 4: P.Anish Shenoy [Favorite Post]
       Day 5: Rikhil [Favorite Post]
     All the bonus winners will receive my print book SQL Wait Stats if your shipping address is in India, or a Pluralsight subscription if you are outside India. If you are not a winner of the contest but still want to learn SQL Server, you can get the book from here. Amazon | 1 | 2 | 3 | 4 | 5 | Flipkart | Indiaplaza Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Tips on debugging collections

    - by Vincent Grondin
     The "Quick Watch" feature of Visual Studio is an awesome tool when debugging your stuff...  I use it all the time and quite often I end up exploring hashtables or lists of all sorts...  One thing I hate is when I have to explore Collections...  Good god did I lose time trying to find the inner member that contains my stuff when exploring collections...  Most collections have the inside member that you can search for, find and explore to see the list of things you wanted to look at.  Something along the lines of this.    I've known a little trick for a while now and I give it to everyone I end up debugging something with, so I figured that probably not many people know about this...  Here's the tip...  Send the collection into an ArrayList in the QuickWatch window!  Yes, you heard me right: just type new ArrayList(yourcollectionhere) (in my case, new ArrayList(this.Controls)) in the expression textbox and here's the result when you hit Reevaluate! Pretty neat trick to make your debugging experience less of a pain when dealing with collections...    Happy debugging, all!

    Read the article

  • make-like build tools for data?

    - by miku
     Make is a standard tool for building software. But make decides whether a target needs to be regenerated by comparing file modification times. Are there any proven, preferably small tools that handle builds not for software but for data? Something that regenerates targets not only on mod times but on certain other properties (e.g. completeness). (Or alternatively some paper that describes such a tool.) As illustration: I'd like to automate the following process:
       1. get data (e.g. a tarball) from some regularly updated source
       2. copy it somewhere if it's not there (based e.g. on some filename scheme)
       3. convert the files to a different format (but only if there aren't successfully converted ones there, e.g. from a previous attempt; custom comparison routine)
       4. for each file, find a certain data element and fetch some additional file from, say, a URL, but only if that hasn't been downloaded yet (decide on existence of the file and file "freshness")
       5. finally compute something (e.g. a word count for something identifiable and store it in the database, but only if the DB does not have an entry for that exact ID yet)
     Observations: there are different stages; each stage is usually simple to compute or implement in isolation; each stage may be simple, but the data volume may be large; each stage may produce a few errors; each stage may have different signals on when (re)processing is needed. Requirements: builds should be interruptible and idempotent (== robust); when interrupted, already processed objects should be reused to speed up the next run; data paths should be easy to adjust (simple syntax, nothing new to learn, an internal DSL would be ok); some form of dependency graph that describes the process would be nice for later visualizations; it should leverage existing programs, if possible. I've done some research on make alternatives like rake and have worked a lot with ant and maven in the past. All these tools naturally focus on code and software builds, not on data builds. A system we have in place now for a task similar to the above is pretty much just shell scripts, which are compact (and are an ok glue for a variety of other programs written in other languages), so I wonder if worse is better?

    Read the article

  • Help implementing virtual d-pad

    - by Moshe
     Short Version: I am trying to move a player around on a tilemap, keeping it centered on its tile, while smoothly controlling it with the SneakyInput virtual joystick. My movement is jumpy and hard to control. What's a good way to implement this? Long Version: I'm trying to get a tilemap-based RPG "layer" working on top of cocos2d-iphone. I'm using SneakyInput as the input right now, but I've run into a bit of a snag. Initially, I followed Steffen Itterheim's book and Ray Wenderlich's tutorial, and I got jumpy movement working. My player now moves from tile to tile, without any animation whatsoever. So, I took it a step further. I changed my player.position to a CCMoveTo action. Combined with CCFollow, my player moves pretty smoothly. Here's the problem, though: between each CCMoveTo, the movement stops, so there's a bit of jumpiness introduced between movements. To deal with that, I changed my CCMoveTo into a CCMoveBy, and instead of running it once, I decided to have it run inside a CCRepeatForever. My plan was to stop the repeating action whenever the player changed directions or released the d-pad. However, when the movement stops, the player is not necessarily centered along the tiles, as it should be. To correctly position the player, I use a CCMoveTo and get the closest position that would put the player back into the proper position. This reintroduces an earlier problem of jumpiness between actions. What is the correct way to implement a smooth joystick while smoothly animating the player and keeping it on the "grid" of tiles? Edit: It turns out that this was caused by a "Bug Fix" in the cocos2d engine.

    Read the article

  • UEFI/GPT Win 7 Load Failure in Dual Boot and no GRUB2 [Ubuntu 12.04]

    - by cristian_jordache
     Configuration: MBB: ASRock X79 Extreme6; Win 7 installed on an INTEL 40GB SSD (GPT partitioned); Ubuntu 14.04 on a CORSAIR 30GB SSD (Ext4 and SWAP). I had Windows 7 installed previously in UEFI mode, using 3 partitions (GPT), and it works fine if left alone. In UEFI BIOS settings I can sometimes see a "Windows Boot Manager" and other times (?) a "UEFI Intel" entry for the INTEL HDD, and Windows will boot properly selecting whichever one is available at that time. I installed Ubuntu 14.04 after Win 7 w/o changing any UEFI BIOS settings and it works fine only if the BIOS is set w/ the Ubuntu partition as the first drive to boot, in AHCI mode. If both SSD drives are connected, the Win7 Intel boot drive can be chosen as first boot device but only as an "AHCI Intel drive" (no "Windows Boot Manager" nor "UEFI Intel device" options available in the BIOS Boot menu) and Win7 will not load properly as long as the Ubuntu SSD is NOT PHYSICALLY DISCONNECTED. Windows will try, start booting for a few seconds, but will fail, replacing the Win7 logo and startup animation with the "old" white progress bar, and will then notify that there is an issue and prompt the user to try to load Win 7 in Normal Mode again or try a Recovery Mode to fix it. If I let the Windows INTEL HDD boot via the BIOS/UEFI "Windows Boot Manager" selection, I may see the purple screen of Grub2 loaded for a while, but there's no selection for Ubuntu or Windows, and/or the machine does not boot, showing a black screen and a small command prompt cursor blinking on top. So far the only option I see to have Ubuntu boot side by side w/ Win 7 is to reformat the Win7 SSD and set it to boot in legacy BIOS mode with a MBR instead of GPT. Per my understanding this is a quite complex issue to fix (Rod Smith's answer was pretty helpful: UEFI boot on my Asus k52f) but any other suggestions are welcome. I find it a bit odd that I can properly boot the Windows 7 SSD or an Ubuntu DVD using a DVD drive set in the UEFI-BIOS in "AHCI mode" and using the "UEFI/Windows Boot Manager" booting option, but I cannot boot a secondary SSD-HDD w/ Ubuntu having the same BIOS/UEFI Boot configuration. Looks like plugging in the second SSD [the Ubuntu partition] is interfering with boot options in the UEFI-BIOS.

    Read the article

  • Twin Cities Code Camp 8 Retrospective

    - by Lee Brandt
     I just got back (a few hours ago) from Minneapolis, where I was speaking at the Twin Cities Code Camp 8. I’d never been to a Twin Cities Code Camp, and I have always heard such great things, so I submitted and got accepted to speak. The conference (what I got to see) was great. My talk was pretty short on people, but there are many reasons for that. First, I spoke opposite Donn Felker (speaking about developing for Android) and Keith Dahlby (speaking about Dynamic .NET). So of course, my talk is going to be empty. How could I compete with that? Plus, my talk was about software process improvement, specifically about how our process has evolved. Maybe not the smartest idea to submit to talk about software process at a developer’s conference. The people who DID attend however, seemed to really enjoy the talk. There was good interaction and good, thoughtful questions. So the attendees seemed engaged. I actually did get a chance to go to one session. I went and saw Javier Lozano talk about open source tools for ASP.NET MVC. I am hip-deep in MVC stuff right now and getting up to speed on MVC 2 as well. I learned about MVC Turbine, Javier’s open source project. I will definitely be adding it to my MVC arsenal. Thanks Javier! I did forget my AC adapter for my laptop and got a little lost in Minneapolis on my way to get one from MicroCenter Saturday morning, but other than that, it was a great trip. It’s a long drive, but seeing all the guys and getting two Nut & Honey rolls from Roly Poly in Eden Prairie for lunch on Saturday made the trip totally worth it. I look forward to seeing what Jason & Chris come up with for next year! Thanks for having me guys!

    Read the article

  • How to explain my 5 burnt-out years off to a new employer?

    - by user17332
     Five years ago, I lost my ability to concentrate long-term, and therefore my ability to code with professional efficiency. I know why it happened, I understood how it happened, and on top of being able to re-create my calm and thus relaxed focus, I overcame the original (rooted in childhood) reason why my mind tilted on the overall situation back then; my understanding isn't rooted in words that a psychologist told me, I actually grokked them first-hand. I'm pretty confident that I can churn out productivity, possibly even more so than pre-burnout. I also never lost my interest in code, nor did I stray from trying to get my abilities back; I kept my knowledge up to date (I could always relatively painlessly learn things coding-related, just not apply them) and thus can say that I'm a better developer than before, even if my average LOC count over those years is abysmally low. On the other hand, now I have a biography that includes more time on the dole than in a job. What would convince you, as an employer, to give my application a chance? I don't believe I should just keep the whole topic out of it.

    Read the article

  • How do I install the newest Flash beta for Minefield on 64-bit Ubuntu?

    - by Øsse
     Hi, I'm using a fully updated Ubuntu 10.10 64-bit which is pretty much bog standard except I'm using Minefield from the Mozilla daily PPA in addition to Firefox as provided by Ubuntu. I want to try the newest beta of Flash (10.3 as of writing). The installation instructions simply say "drop libflashplayer.so into the plugin folder of your browser". This is the 32-bit version. Currently I'm using Flash as provided by the package flashplugin-installer (ver. 10.2.152.27ubuntu0.10.10.1). Going to about:plugins in Minefield/Firefox says the version of Flash I'm running is 10.2 r152 and the file responsible is npwrapper.libflashplayer.so. I have two files with that name on my system. One is /usr/share/ubufox/plugins/npwrapper.libflashplayer.so which is a broken link to /usr/lib/flashplugin-installer/npwrapper.libflashplayer.so. The other is /var/lib/flashplugin-installer/npwrapper.libflashplayer.so (note var instead of usr). I also have a file simply called libflashplayer.so in /usr/lib/flashplugin-installer/. So it seems Firefox/Minefield gets its Flash plugin from a file that doesn't exist, and replacing libflashplayer.so with the one in the archive from Adobe has no effect. Since I want to try the 32-bit version I have to use the wrapper. The only way I know how is through the flashplugin-installer package. How would I go about installing the newest 32-bit beta Flash if possible at all? And where is "the plugin folder of my browser"?

    Read the article

  • Caching in the .NET Stack: Inside-Out

    - by Elton Stoneman
     Originally posted on: http://geekswithblogs.net/EltonStoneman/archive/2013/06/28/caching-in-the-.net-stack-inside-out.aspx  I'm delighted to have my first course published on Pluralsight - Caching in the .NET Stack: Inside-out.   It's a pretty comprehensive look at caching in .NET solutions. The first half covers using local, remote and persistent cache stores inside the solution, including the .NET MemoryCache, NCache Express, AppFabric Caching, memcached, Azure Table Storage and local disk stores. The second half covers caching outside the solution in HTTP clients and proxies, and how to set up ASP.NET WebForms, MVC, Web API and WCF projects to use HTTP validation and expiration caching.   The course takes a hands-on approach, starting with a distributed solution that has no caching, analysing key points which can benefit from caching, and adding different types of cache. At the end of the course I run through a set of before and after performance tests, stressing the solution under load. Without caching and with 60 concurrent users the page response time maxes out at 18 seconds - with caching that falls to 2 seconds, so it's a huge improvement from very little effort. I’d be glad to hear feedback if you watch the course, especially if it’s as positive as my editor’s.

    Read the article
