Search Results

Search found 4853 results on 195 pages for 'continuous integration'.


  • Is it possible to start (and stop) a thread inside a DLL?

    - by Jerry Dodge
    I'm pondering some ideas for building a DLL for some common stuff I do. One thing I'd like to check is whether it's possible to run a thread inside a DLL. I'm sure I would at least be able to start it and have it free itself on terminate (and forcefully terminate itself) - I can see that wouldn't be much of a problem. But once I start it, I don't see how I can keep communicating with it (especially to stop it), mainly because, as far as I know, each call to the DLL is independent - though I also know very little about the subject.
    I've seen that on some occasions a DLL can be loaded at the beginning and released at the end when it's no longer needed. I have no knowledge or experience with this method beyond having seen something related to it - I couldn't even tell you what or how. But is this even possible? I know about ActiveX/COM, but that is not what I want - I'd like just a basic DLL that can be used across languages (specifically C#). Also, if it is possible, how would I go about doing callbacks from the DLL to the app? For example, when I start the thread, I will most probably assign a function (which is inside the EXE) to be the handler for the events (which are triggered from the DLL).
    So I guess what I'm asking is: how do I load a DLL for continuous work and release it when I'm done, as opposed to the simple method of calling individual functions in the DLL as needed? In the same case, I might assign variables or create objects inside the DLL. How can I make sure that once I assign that variable (or create the object), it will still be available the next time I call the DLL? Obviously this would require a mechanism to initialize/finalize the DLL (i.e. create the objects inside the DLL when the DLL is loaded, and free them when the DLL is unloaded).
    EDIT: In the end, I will wrap the DLL inside a component, so that when an instance of the component is created, the DLL is loaded and a corresponding thread is created inside the DLL; when the component is freed, the DLL is unloaded. I also need to make sure that if there are, for example, 2 of these components, there will be 2 instances of the DLL loaded, one for each component. Is this in any way related to the use of an IInterface? Because I also have no experience with that. No need to answer directly with sample source code - a link to a good tutorial would be great.
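    To make the calling pattern I'm after a bit more concrete (not asking anyone to write this for me), here is roughly the host-side sequence I imagine, sketched in Python with ctypes purely for illustration - the DLL name and its exports (Initialize, StartWorker, StopWorker, Finalize) are made-up placeholders, not an existing library:
    # Illustrative sketch only: "worker.dll" and its exported functions are hypothetical.
    # The idea: load the DLL once, register a callback, let the DLL run its own thread,
    # and keep the DLL loaded until the host explicitly stops the thread and unloads it.
    import ctypes

    # Callback signature the DLL's worker thread would invoke: void __stdcall OnProgress(int)
    ON_PROGRESS = ctypes.WINFUNCTYPE(None, ctypes.c_int)

    def on_progress(value):
        print("progress reported from the DLL's thread:", value)

    callback = ON_PROGRESS(on_progress)   # keep a reference so it is not garbage-collected

    dll = ctypes.WinDLL("worker.dll")     # the DLL stays loaded from this point on
    dll.Initialize()                      # create objects that must survive between calls
    dll.StartWorker(callback)             # spawn the worker thread inside the DLL

    # ... the host keeps running; the DLL thread calls on_progress as it works ...

    dll.StopWorker()                      # signal the thread to stop and wait for it
    dll.Finalize()                        # free whatever Initialize created
    ctypes.windll.kernel32.FreeLibrary(dll._handle)   # finally unload the DLL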

    Read the article

  • Webcast Q&A: Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter

    - by Kellsey Ruppel
    This week we had the fifth webcast in our WebCenter in Action webcast series, "Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter", where customers Giovani Dacumos and Minh Ong from the Los Angeles Department of Building & Safety (LADBS), and Sheetal Paranjpye and Rajiv Desai from Oracle Partner 3Di, shared how Oracle WebCenter is powering LADBS' externally facing website and providing a superior self-service experience for their customers. We asked the speakers to provide some dialogue for Q&A.
    Giovani Dacumos, Director of Systems, and Minh Ong, LADBS
    Q: Did you run into any issues when integrating all of the different applications together?
    A: Yes. We did have issues integrating secure sign-on between the portal and other legacy applications. We used portlets and iframes to overcome those. This is a new technology for us and we are also learning as we go, so there were a lot of challenges in developing and implementing our vision.
    Q: What has been the biggest benefit your end users have seen?
    A: The biggest benefit for our end users is ease of use. We've given them a system that provides a new and improved source of information, as well as a very organized flow of transaction processing. It has made our online service very user friendly.
    Q: Was there any resistance internally when implementing the solution? If so, how did you overcome that?
    A: There was no internal resistance during the implementation, only challenges. As mentioned earlier, this is a new technology for us. We came across issues that needed assistance from Oracle. Working with 3Di and Oracle has helped us tremendously to find solutions to our implementation issues.
    Q: Given the performance, what do you estimate to be the top-end capacity of the system?
    A: With the current performance and architecture we have, we are able to support approximately 300-400 concurrent users. We would need more hardware to support additional user load.
    Q: What's the overview or summary of feedback from the users interacting with the site?
    A: LADBS has a wide spectrum of customers, from simple users like homeowners to large construction firms. Anything new that we offer can be a little bit challenging for some, but overall, the customers liked it. They saw a huge improvement in usability.
    Q: Can you describe the impressions about the site before and after the project within LADBS?
    A: The old site was using old technology, and it was hard for us to keep building on it as we got more business requirements. It made our application seem a bit complicated. It was confusing for our new customers to use, and we've improved on this with the new site. It's now easier for them to complete their transactions and, at the same time, it has allowed us to provide more useful information.
    Sheetal Paranjpye and Rajiv Desai, 3Di
    Q: Did you run into any obstacles when implementing the solution?
    A: Yes, we did run into some obstacles. One of the key showstoppers was an issue with portlet-to-portal communication. The GIS viewer (portlet) needed information to be passed to and from Permit LA (the portal), but we were able to get everything configured and up and working quickly!
    Q: Was there a lot of custom work that needed to be done for this particular solution?
    A: We have done some customizations where workflows/task flows are involved.
    Q: What do you think were the keys to success for rolling out WebCenter?
    A: Having a service-oriented architecture and using portlets have been the key areas for rolling out Oracle WebCenter at LADBS. The Oracle WebCenter Content integration gives business users the flexibility to maintain the content, which has really cut down on the reliance on IT, and employee productivity has increased as a result.
    If you missed the webcast, be sure to catch the replay to see a live demonstration of WebCenter in action! Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter from Oracle WebCenter

    Read the article

  • SOA 10g: Developing a Simple Hello World Process

    - by [email protected]
    Software & Hardware Needed
    - Intel Pentium D CPU 3 GHz, 2 GB RAM, Windows XP system (that's what I am using). You could use Linux as well, but please choose high-end RAM.
    - 10g SOA Suite from Oracle(TM); read the installation documents at www.Oracle.com
    - JDeveloper 10.1.3.3; official documents at http://www.oracle.com/technology/products/ias/bpel/index.html
    - java -version: Java HotSpot(TM) Client VM (build 1.5.0_06-b05, mixed mode)
    BPEL Introduction - Developing a Simple Hello World Process (Synchronous BPEL Process)
    This exercise focuses on developing a synchronous process, which means you give input to the BPEL process and get output immediately, with no waiting at all. The objective of this exercise is to take a name as input and return "Hello" followed by that name; for example, if I give the input "James", the BPEL process returns "Hello James".
    1. Open Oracle JDeveloper and click File -> New Application. Give the name "JamesApp" (you can give your own name if it pleases you), select the folder where you want to place the application, and click "OK".
    2. Right-click "JamesApp" in the Application Navigator and select the New menu.
    3. Select "Projects" under "General" and "BPEL Process Project", then click "OK". These steps remain the same for all BPEL projects.
    4. The Project Settings wizard appears. Give the "Process Name" as "MyBPELProc" and the Namespace as http://xmlns.james.com/MyBPELProc, select the template "Synchronous BPEL Process" and click "Next".
    5. Accept the input and output schema names as they are and click "Finish".
    6. You will see the BPEL Process Designer; some folders such as Integration Content and Resources are created, along with a few more files.
    7. Assign Activity: allows assigning values to variables or copying values from one variable to another, as well as some string manipulation or mathematical operations. In the component palette at the extreme right, select Process Activities from the drop-down, and drag and drop "Assign" between "receiveInput" and "replyOutput".
    8. You can right-click and edit the Assign activity and give it any suitable name, such as "AssignHello".
    9. Select the "Copy Operation" tab and create a "Copy Operation".
    10. In the From variables, click on the expression builder, select input under "input variable", click on insert into expression bar, and complete the concat syntax. Note: use "Ctrl+space bar" inside the expression window to auto-populate the expression as shown in the figure below. What we are actually doing here is concatenating the string "Hello " with the variable value received through the variable named "input".
    11. Observe that once the expression is completed, the "To Variable" is assigned to a variable named "result".
    12. Finally, the copy operation looks as below.
    13. It's time to deploy; start the SOA Suite.
    14. Establish a connection to the server from JDeveloper. This can be done by adding a new Application Server under Connections; give the server name, username and password and test the connection.
    15. Deploy "MyBPELProc" to the "default domain".
    16. http://localhost:8080/ allows connecting to the SOA Suite web portal. Click on "BPEL Control" and log in with the username "oc4jadmin" and whatever password you gave during installation.
    17. "MyBPELProc" is visible under "Deployed BPEL Processes" in the "Dashboard" tab; click on it.
    18. The Initiate tab opens to accept input. Enter "James" as the input and click the "Post XML" button.
    19. Click on Visual Flow.
    20. Click on receiveInput; it shows "James" as the input received.
    21. Click on replyOutput; it shows "Hello James", so the BPEL process executed successfully.
    22. It may be worth seeing all the instances created every time a BPEL process is executed with some input. The Purge All button allows you to delete all the unwanted previous instances of a BPEL process; don't worry, it won't delete the BPEL process itself :-)
    23. It may also be of some importance to understand the XSD file which holds the input & output variable names & data types.
    24. You can drag and drop variables as elements over the sequence in the designer, or directly edit the XML source file.

    Read the article

  • Blogging: MacJournal & Windows Live Writer

    - by Jeff Julian
    One thing I have learned about using a Mac is that Apple does not produce very many free applications. The ones they do are typically not full featured and to get the full feature you need to buy their upgraded version. For example, when it comes to Photo editing and cataloging, iPhoto is not a solution for large files or RAW processing, you need Aperture which is a couple hundred dollars. I am not complaining because I like it when an application has a product team who generates revenue with it, because the chance of them being around longer seems to be higher. What is my point in all of this? Apple does not produce a product for blogging/journaling like Microsoft does with Windows Live Writer. I love Windows Live Writer. If you are on a Windows box, it is a required tool in your toolbox if you publish to a blog. The cleanness of the interface, integration with most blog APIs and ability to Save Local or Publish as a Draft make capturing your thoughts for publishing now or later a very easy task. My hope is that Microsoft will port it to the Mac, but I don’t believe that will ever happen as it is not a revenue generating product and Microsoft doesn’t often port to a Mac besides Remote Desktop Connection and MSN Messenger. For my configuration I used to use only Boot Camp on my two MacBook Pros I have owned in the past three years because I’m a PC, but after four different rebuilds (not typically due to Windows, but Boot Camp or Parallels) I decided to move off the Boot Camp platform and to VMWare Fusion. This is a complete separate blog post that I should spec out in MacJournal, but I now always boot into the Mac OS and use Fusion for my AJI Software VM or my client’s VMs. It just seems to work better for me and I have a very nice way to backup my Windows environments with VMWare.Needless to say, there was need in my new laptop configuration for a blogging tool that worked natively on a Mac. I don’t like to power up my machine for writing a document or working on an image and need to boot up a VM just so I can use Windows. Some would say why not just use a Windows laptop and put the MBP on eBay? It is just a preference and right now, I like the Mac OS for day to day work. So in comes MacJournal, part of the current MacHeist package for $19.95 (MacJournal is normally $39.95). This product is definitely not WLW, but WLW is missing some features I like in MacJournal. I hope the price point comes down on MacJournal cause I could see paying $19.95 for it, but it is always hard for me to buy a piece of software for $39.95 when I can use something else. But I am a cheapskate when it comes to software packages. I suggest if you are using a Mac to drop what you are doing pick up the MacHeist bundle today before it is over, but if you are reading this later, than download the trial and see if MacJournal is a solution for you. If you have any other suggestions that are as nice or cheaper, please comment.Product LinksMacJournal by Mariners Software $39.95 (part of MacHeist bundle for $19.95 with only one day left)Windows Live Writer by MicrosoftThis post was created using MacJournal.[Update: The joys of formatting. Make sure if you are a Geekswithblogs.net member that you use this configuration to setup the Metablog formatting of paragraphs correctly]

    Read the article

  • Interview questions about ASP.NET Web services.

    - by Jalpesh P. Vadgama
    I have seen that there are lots of myths about ASP.NET web services among fresher-level ASP.NET developers, so I decided to write a blog post about ASP.NET web services interview questions, because I think this is the best way to reach fresher ASP.NET developers. Following are a few questions about ASP.NET web services.
    1) What are ASP.NET web services?
    Ans: Web services support HTTP requests formatted using XML, HTTP and SOAP syntax. They interact through standard XML messages via SOAP and are used to support interoperability. A web service has an .asmx extension, and the .NET Framework contains HTTP handlers so that web services can serve HTTP requests directly.
    2) What kind of data can be returned by web service web methods?
    Ans: They support all the primitive data types, plus custom data types that can be encoded and serialized as XML. You can find more information at the following link: http://msdn.microsoft.com/en-us/library/bb552900.aspx
    3) Can web services only be written in ASP.NET?
    Ans: No, they can be written in Java and PHP as well.
    4) Explain the WebMethod attribute in web services.
    Ans: The WebMethod attribute is added to a public class method to indicate that the method is exposed as part of an XML web service. You can have multiple web methods in a class, but they must be public as they will be exposed as part of the XML web service. You can find more information about the WebMethod attribute at the following link: http://msdn.microsoft.com/en-us/library/byxd99hx(v=vs.71).aspx
    5) What is SOA?
    Ans: SOA stands for "Service-Oriented Architecture". It is an architectural style used to support different kinds of computing platforms and applications. ASP.NET web services are one of the technologies that support this kind of architecture; you can call ASP.NET web services from any computing platform or application.
    6) What are SOAP, WSDL and UDDI?
    Ans: SOAP stands for "Simple Object Access Protocol". Web services interact via SOAP messages written in XML. SOAP is sometimes referred to as a "data wrapper" or "data envelope"; it contains the different XML tags that make up a whole SOAP message. WSDL stands for "Web Services Description Language". It is an XML document written according to a standard specified by the W3C; it is a kind of manual that describes how to use and consume a web service. Web services development software processes the WSDL document and generates the SOAP messages that are needed for a specific web service. UDDI stands for "Universal Description, Discovery and Integration"; it is used for web service registries. You can find the addresses of web services through UDDI.
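    To make the SOAP part concrete, here is a small sketch that posts a minimal SOAP 1.1 envelope to an .asmx endpoint using only Python's standard library. The endpoint URL and the HelloWorld web method are assumptions for illustration only; http://tempuri.org/ is simply the default namespace ASP.NET assigns to a new web service.
    # Sketch only: the endpoint and the HelloWorld web method are hypothetical.
    # It shows the raw SOAP 1.1 message that an .asmx web service exchanges over HTTP.
    import urllib.request

    ENDPOINT = "http://example.com/HelloService.asmx"   # hypothetical .asmx endpoint
    SOAP_ACTION = "http://tempuri.org/HelloWorld"        # default ASP.NET namespace + method

    envelope = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <HelloWorld xmlns="http://tempuri.org/">
          <name>Jalpesh</name>
        </HelloWorld>
      </soap:Body>
    </soap:Envelope>"""

    request = urllib.request.Request(
        ENDPOINT,
        data=envelope.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",   # SOAP 1.1 over HTTP POST
            "SOAPAction": SOAP_ACTION,
        },
    )

    with urllib.request.urlopen(request) as response:
        print(response.read().decode("utf-8"))   # SOAP envelope containing HelloWorldResult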

    Read the article

  • Start Learning Ruby with IronRuby – Setting up the Environment

    - by kazimanzurrashid
    Recently I have decided to learn Ruby, and for the last few days I have been playing with IronRuby. Learning a new thing has always been fun, and when it comes to an adorable language like Ruby it becomes even more entertaining. Like any other language, first we have to set up the development environment. In order to run IronRuby we have to download the binaries from the IronRuby CodePlex project. IronRuby supports both .NET 2.0 and .NET 4, but .NET 4 is the recommended version; you can download either the installer or the zip file. If you download the zip file, make sure you add the bin directory to the PATH environment variable. Once you are done, open up the command prompt and type:
    ir -v
    It should print a message like:
    IronRuby 1.0.0.0 on .NET 4.0.30319.1
    ir is the 32-bit version of IronRuby; if you want to use 64-bit you can try ir64.
    Next, we have to find an editor where we can write our Ruby code, as there is currently no integration story for IronRuby with Visual Studio like its twin IronPython. Among the free IDEs only SharpDevelop has IronRuby support, but it does not have auto-complete or debugging built in; the only thing it supports is syntax highlighting, so using a text editor with the same features is no different. To play with IronRuby we will be using Notepad++, which can be downloaded from its SourceForge download page. Notepad++ does have nice syntax highlighting support; I am using Vibrant Ink with some little modifications.
    The next thing we have to do is configure Notepad++ so that we can run Ruby scripts in IronRuby from inside Notepad++. Let's create a batch (.bat) file in the IronRuby bin directory with the following content:
    @echo off
    cls
    call ir %1
    pause
    This will make sure that the console is paused once we run the script. Now click Run->Run in Notepad++; it will bring up the run dialog. Put the following command in the textbox:
    riir.bat "$(FULL_CURRENT_PATH)"
    Click Save, which will bring up another dialog. Type Iron Ruby, assign the shortcut ctrl + f5 (same as Visual Studio's Start Without Debugging) and click OK. Once you are done you will find Iron Ruby in the Run menu. Now press ctrl + f5, and we will find the Ruby script running in IronRuby.
    There is one last thing we would like to add: a poor man's context-sensitive help. First, download the Ruby language help file from the Ruby Installer site and extract it into a directory. Next we have to install the Language Help plug-in for Notepad++: click Plugins->Plugin Manager->Show Plugin Manager, scroll down until you find the plug-in in the list, check it and click Install. Once it is installed it will prompt you to restart Notepad++; click Yes. When Notepad++ restarts, click Plugins->Language Help->Options->Add, enter the following details and click OK. The chm file location can be different depending upon where you extracted it. Now when you put your caret on any Ruby keyword and press ctrl + f1 it will take you to the help topic for that keyword. For example, when my caret is on the each in the following code and I press ctrl + f1, it takes me to the each API doc of Array.
    def loop_demo
      (1..10).each { |n| puts n }
    end
    loop_demo
    That's it for today. Happy Ruby coding.

    Read the article

  • Announcing the new Oracle Retail Workspace, A Configuration of Oracle WebCenter Spaces 11.1.1.5 for Oracle Retail

    - by Oracle Retail Documentation Team
    For the Oracle Retail 13.2.x enterprise, Oracle Retail Workspace 13.2.4 replaces previous versions of Oracle Retail Workspace. Oracle Retail Workspace 13.2.4 is a supported configuration of Oracle WebCenter Spaces 11.1.1.5 for Oracle Retail. Supported Product Overview In order to provide a next-generation Oracle user engagement platform for the retail industry, Oracle Retail Workspace leverages WebCenter Spaces. Oracle Retail Workspace is not a licensed retail application with any code. Instead, retailers purchase the underlying technology and then leverage the Oracle Retail Workspace Implementation Guide to configure a portal utilizing Oracle WebCenter Spaces. Oracle Retail Workspace has been repositioned as a configuration of Oracle WebCenter Spaces for the following reasons: The Oracle Retail Workspace configuration utilizes the external application functionality and the application navigator taskflow of the Oracle WebCenter Framework to configure Oracle Retail applications in Oracle WebCenter Spaces. The Oracle WebCenter Framework improves IT development cycle times by blending Web 2.0 services, processes, business intelligence, and transactions in an integrated JSF framework. The Oracle WebCenter Spaces 11g offers features provided by the previous versions of Oracle Retail Workspace that enable retailers to leverage a productive portal-based environment. List of Documents The following are included in Workspace 13.2.4, A Configuration of WebCenter Spaces 11.1.1.5 for Oracle Retail Oracle Retail Workspace Release Notes Oracle Retail Workspace Implementation Guide Workspace Retail Library—Unsupported The Oracle Retail Workspace Retail Library is comprised of previously-published accelerator documents and sample code downloads hosted on My Oracle Support. They are not supported, nor are they associated with the support lifecycle of the Workspace application. Doc ID: 1461281.1: Oracle Retail Workspace Retail Library Oracle Retail Workspace Retail Library Reference GuideA set of Micro-Applications that can be used to perform some of the operations of Oracle Retail Merchandising System (RMS) from outside the application. This document describes the functional and technical design details of the Micro-Applications available in this release, including the following and more: Create Regular Item Create Purchase Order Item Transfer Update Vendor Oracle Retail Fashion Planning Bundle Reports documentationThe Oracle Retail Fashion Planning Bundle Reports package includes role-based Oracle Business Intelligence (BI) Enterprise Edition (EE) reports and dashboards that provide an illustrative overview highlighting the Fashion Planning Bundle solutions. These dashboards can be leveraged out-of-the-box or can be used along with the other dashboards and reports that may have already been created to support a specific solution or organizational needs. This package includes dashboards for the Assortment Planning, Item Planning, Item Planning Configured for COE, Merchandise Financial Planning Retail Accounting, and Merchandise Financial Planning Cost Accounting applications. Oracle Retail Accelerators for WebLogic Server 11g Micro-Applications Development TutorialThis tutorial describes how you can create a Micro-Application for the Create a Regular Item task in the Retail Merchandising System (RMS) application using Oracle JDeveloper and ADF. 
Retail Accelerators: Developing ADF Reports for RPASThis document illustrates how you can use the Oracle Application Development Framework 11g (ADF) to generate reports that provide insights from the Oracle Retail Predictive Application Server (RPAS) based applications. Oracle Retail Accelerators Guide for WebCenter 11gOracle Retail Accelerators Guide for WebCenter 11g describes how you can integrate Oracle Retail applications with Oracle WebCenter Spaces and customize WebCenter Spaces to include custom-developed content. Oracle Retail Accelerators, Developing Oracle BI EE reports on RPAS Domain DataThis document illustrates how you can set up the integration between BI EE and RPAS domains to generate BI EE reports and dashboards for RPAS. Oracle Retail Accelerators, Developing Oracle BI EE Reports on RPAS WorkbooksThis document outlines a process to create real-time Oracle Business Intelligence (BI) Enterprise Edition reports against RPAS workbooks dynamically, as opposed to directly going against the RPAS domain for the data. 

    Read the article

  • SQLAuthority News – Author Visit – SQL Server 2008 R2 Launch

    - by pinaldave
    June 11, 2010 was a wonderful day because I attended the very first SQL Server 2008 R2 launch event held by Microsoft in Mumbai. I traveled to Mumbai from my home town, Ahmedabad. The event was held at one of the best hotels in Mumbai, "The Leela". The SQL Server R2 launch was an evening event with a few interesting talks. SQL PASS was associated with this event as one of the partners, and its goal is to increase the community's awareness of SQL Server. I met many interesting people and had a great networking opportunity at the event. The event was kicked off with an awesome laser show and a "Welcome" video, followed by a Microsoft executive session with several interesting demos. The very first demo was about PowerPivot. I knew beforehand that there would be PowerPivot demos because it is a very popular subject; however, I was really hoping to see other interesting demos from SQL Server 2008 R2, and believe me, I was happier to see the later demos. There were demos of SQL Server Utility Control Point, as well as an integration of Bing Maps with Reporting Services. I really enjoyed the interactive and informative session by Shivaram Venkatesh. He had excellent presentation skills as well as ample technical knowledge to keep the audience attentive. I really liked that he did not read the whole slide deck; rather, he picked one point and used it to tell the story of the whole deck. I also enjoyed my conversation with Afaq Choonawala, who is one of the "gem guys" at Microsoft. I also want to acknowledge Ashwin Kini and Mohit Panchal for their excellent support for this event. Mumbai IT Pro is a user group which you can really count on for any kind of help. After excellent demos and a vibrant start to the event, the whole audience was jazzed up. There were two vendor sessions right after the first session. Intel had 15 minutes to present; however, Intel's representative, who had good knowledge of the subject, had nearly 30+ slides in his presentation, so he had to rush a bit to cover the whole deck. The Intel presentation was followed by another vendor presentation, from NetApp. I had previously heard about this tool. After I saw the demo, which did not work the first time the NetApp presenter demonstrated it, I started to have doubts about this product. I personally went to the demo booth after the presentation was over to clarify my doubt, but I realized the NetApp presenter or booth owner had absolutely poor knowledge of SQL Server and even of their own NetApp product. The NetApp people tried to misguide us, and when we argued, they started to say things contradicting what they had said earlier. At one point in their presentation, they claimed their application does something very fast, which did not actually happen in front of the audience. They blamed the SQL Server R2 DBCC CHECKDB command for their product's failed demonstration. I know that NetApp has many great products; however, this one was not conveyed clearly and even created a negative impression on all of us. Well, let us not judge the potential, fun, education and enigma of the launch event through a small glitch. The event was jam-packed and extremely well received by everybody who attended it. As I said, average demos and good presentations by the MS folks were really something to cheer about.
    Any launch event is considered successful if it achieves its goal of exciting users with cutting-edge technology, just like this event, which left a very deep impression on me. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology Tagged: PASS, SQLPASS

    Read the article

  • Redaction in AutoVue

    - by [email protected]
    As the trend to digitize all paper assets continues, so does the push to digitize all the processes around these assets. One such process is redaction - removing sensitive or classified information from documents. While for some this may conjure up thoughts of old CIA documents filled with nothing but blacked out pages, there are actually many uses for redaction today beyond military and government. Many companies have a need to remove names, phone numbers, social security numbers, credit card numbers, etc. from documents that are being scanned in and/or released to the public or less privileged users - insurance companies, banks and legal firms are a few examples. The process of digital redaction actually isn't that far from the old paper method: Step 1. Find a folder with a big red stamp on it labeled "TOP SECRET" Step 2. Make a copy of that document, since some folks still need to access the original contents Step 3. Black out the text or pages you want to hide Step 4. Release or distribute this new 'redacted' copy So where does a solution like AutoVue come in? Well, we've really been doing all of these things for years! 1. With AutoVue's VueLink integration and iSDK, we can integrate to virtually any content management system and view documents of almost any format with a single click. Finding the document and opening it in AutoVue: CHECK! 2. With AutoVue's markup capabilities, adding filled boxes (or other shapes) around certain text is a no-brainer. You can even leverage AutoVue's powerful APIs to automate the addition of markups over certain text or pre-defined regions using our APIs. Black out the text you want to hide: CHECK! 3. With AutoVue's conversion capabilities, you can 'burn-in' the comments into a new file, either as a TIFF, JPEG or PDF document. Burning-in the redactions avoids slip-ups like the recent (well-publicized) TSA one. Through our tight integrations, the newly created copies can be directly checked into the content management system with no manual intervention. Make a copy of that document: CHECK! 4. Again, leveraging AutoVue's integrations, we can now define rules in the system based on a user's privileges. An 'authorized' user wishing to view the document from the repository will get exactly that - no redactions. An 'unauthorized' user, when requesting to view that same document, can get redirected to open the redacted copy of the same document. Release or distribute the new 'redacted' copy: CHECK! See this movie (WMV format, 2mins, 20secs, no audio) for a quick illustration of AutoVue's redaction capabilities. It shows how redactions can be added based on text searches, manual input or pre-defined templates/regions. Let us know what you think in the comments. And remember - this is all in our flagship AutoVue product - no additional software required!
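    AutoVue's own APIs aren't shown in this post, but as a generic illustration of the same search, mark and burn-in flow, here is a short sketch using the open-source PyMuPDF library (an assumption for illustration; the file names and the search text are placeholders, and this is not how AutoVue itself is driven):
    # Generic illustration of the search -> black out -> burn in -> save-a-copy flow,
    # using PyMuPDF (not AutoVue). File names and the search text are placeholders.
    import fitz  # PyMuPDF

    doc = fitz.open("original.pdf")                      # open a copy of the source document

    for page in doc:
        for rect in page.search_for("555-0100"):         # find the sensitive text on the page
            page.add_redact_annot(rect, fill=(0, 0, 0))  # add a filled black box over it
        page.apply_redactions()                          # burn the boxes in, removing the text underneath

    doc.save("redacted.pdf")                             # release/distribute this redacted copy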

    Read the article

  • Solaris 11 Launch Blog Carnival Roundup

    - by constant
    Solaris 11 is here! And together with the official launch activities, a lot of Oracle and non-Oracle bloggers contributed helpful and informative blog articles to help your datacenter go to eleven. Here are some notable blog postings, sorted by category for your Solaris 11 blog-reading pleasure: Getting Started/Overview A lot of people speculated that the official launch of Solaris 11 would be on 11/11 (whatever way you want to turn it), but it actually happened two days earlier. Larry Wake himself offers 11 Reasons Why Oracle Solaris 11 11/11 Isn't Being Released on 11/11/11. Then, Larry goes on with a summary: Oracle Solaris 11: The First Cloud OS gives you a short and sweet rundown of what the major new features of Solaris 11 are. Jeff Victor has his own list of What's New in Oracle Solaris 11. A popular Solaris 11 meme is to write a blog post about 11 favourite features: Jim Laurent's 11 Reasons to Love Solaris 11, Darren Moffat's 11 Favourite Solaris 11 Features, Mike Gerdt's 11 of My Favourite Things! are just three examples of "11 Favourite Things..." type blog posts, I'm sure many more will follow... More official overview content for Solaris 11 is available from the Oracle Tech Network Solaris 11 Portal. Also, check out Rick Ramsey's blog post Solaris 11 Resources for System Administrators on the OTN Blog and his secret 5 Commands That Make Solaris Administration Easier post from the OTN Garage. (Automatic) Installation and the Image Packaging System (IPS) The brand new Image Packaging System (IPS) and the Automatic Installer (IPS), together with numerous other install/packaging/boot/patching features are among the most significant improvements in Solaris 11. But before installing, you may wonder whether Solaris 11 will support your particular set of hardware devices. Again, the OTN Garage comes to the rescue with Rick Ramsey's post How to Find Out Which Devices Are Supported By Solaris 11. Included is a useful guide to all the first steps to get your Solaris 11 system up and running. Tim Foster had a whole handful of blog posts lined up for the launch, teaching you everything you need to know about IPS but didn't dare to ask: The IPS System Repository, IPS Self-assembly - Part 1: Overlays and Part 2: Multiple Packages Delivering Configuration. Watch out for more IPS posts from Tim! If installing packages or upgrading your system from the net makes you uneasy, then you're not alone: Jim Laurent will tech you how Building a Solaris 11 Repository Without Network Connection will make your life easier. Many of you have already peeked into the future by installing Solaris 11 Express. If you're now wondering whether you can upgrade or whether a fresh install is necessary, then check out Alan Hargreaves's post Upgrading Solaris 11 Express b151a with support to Solaris 11. The trick is in upgrading your pkg(1M) first. Networking One of the first things to do after installing Solaris 11 (or any operating system for that matter), is to set it up for networking. Solaris 11 comes with the brand new "Network Auto-Magic" feature which can figure out everything by itself. For those cases where you want to exercise a little more control, Solaris 11 left a few people scratching their heads. Fortunately, Tschokko wrote up this cool blog post: Solaris 11 manual IPv4 & IPv6 configuration right after the launch ceremony. Thanks, Tschokko! 
And Milek points out a long awaited networking feature in Solaris 11 called Solaris 11 - hostmodel, which I know for a fact that many customers have looked forward to: How to "bind" a Solaris 11 system to a specific gateway for specific IP address it is using. Steffen Weiberle teaches us how to tune the Solaris 11 networking stack the proper way: ipadm(1M). No more fiddling with ndd(1M)! Check out his tutorial on Solaris 11 Network Tunables. And if you want to get even deeper into the networking stack, there's nothing better than DTrace. Alan Maguire teaches you in: DTracing TCP Congestion Control how to probe deeply into the Solaris 11 TCP/IP stack, the TCP congestion control part in particular. Don't miss his other DTrace and TCP related blog posts! DTrace And there we are: DTrace, the king of all observability tools. Long time DTrace veteran and co-author of The DTrace book*, Brendan Gregg blogged about Solaris 11 DTrace syscall provider changes. BTW, after you install Solaris 11, check out the DTrace toolkit which is installed by default in /usr/dtrace/DTT. It is chock full of handy DTrace scripts, many of which contributed by Brendan himself! Security Another big theme in Solaris 11, and one that is crucial for the success of any operating system in the Cloud is Security. Here are some notable posts in this category: Darren Moffat starts by showing us how to completely get rid of root: Completely Disabling Root Logins on Solaris 11. With no root user, there's one major entry point less to worry about. But that's only the start. In Immutable Zones on Encrypted ZFS, Darren shows us how to double the security of your services: First by locking them into the new Immutable Zones feature, then by encrypting their data using the new ZFS encryption feature. And if you're still missing sudo from your Linux days, Darren again has a solution: Password (PAM) caching for Solaris su - "a la sudo". If you're wondering how much compute power all this encryption will cost you, you're in luck: The Solaris X86 AESNI OpenSSL Engine will make sure you'll use your Intel's embedded crypto support to its fullest. And if you own a brand new SPARC T4 machine you're even luckier: It comes with its own SPARC T4 OpenSSL Engine. Dan Anderson's posts show how there really is now excuse not to encrypt any more... Developers Solaris 11 has a lot to offer to developers as well. Ali Bahrami has a series of blog posts that cover diverse developer topics: elffile: ELF Specific File Identification Utility, Using Stub Objects and The Stub Proto: Not Just For Stub Objects Anymore to name a few. BTW, if you're a developer and want to shape the future of Solaris 11, then Vijay Tatkar has a hint for you: Oracle (Sun Systems Group) is hiring! Desktop and Graphics Yes, Solaris 11 is a 100% server OS, but it can also offer a decent desktop environment, especially if you are a developer. Alan Coopersmith starts by discussing S11 X11: ye olde window system in today's new operating system, then Calum Benson shows us around What's new on the Solaris 11 Desktop. Even accessibility is a first-class citizen in the Solaris 11 user interface. Peter Korn celebrates: Accessible Oracle Solaris 11 - released! Performance Gone are the days of "Slowaris", when Solaris was among the few OSes that "did the right thing" while others cut corners just to win benchmarks. Today, Solaris continues doing the right thing, and it delivers the right performance at the same time. Need proof? 
    Check out Brian's BestPerf blog with continuous updates from the benchmarking lab, including Recent Benchmarks Using Oracle Solaris 11! Send Me More Solaris 11 Launch Articles! These are just a few of the more interesting blog articles that came out around the Solaris 11 launch, I'm sure there are many more! Feel free to post a comment below if you find a particularly interesting blog post that hasn't been listed so far and share your enthusiasm for Solaris 11! *Affiliate link: Buy cool stuff and support this blog at no extra cost. We both win!

    Read the article

  • SQLAuthority News – Memories at Anniversary of SQL Wait Stats Book

    - by pinaldave
    SQL Wait Stats
    About a year ago, I had a very proud moment: I published my second book, SQL Server Wait Stats, with me as the primary author. It has been a long journey since then. The book got a great response and was widely accepted in the community. It was the first book of its kind written specifically on wait stats and performance. The book was based on my earlier month-long series written on the same subject, SQL Server Wait Stats. Today, on the anniversary of the book, lots of things come to my mind; let me share a few here.
    Idea behind Blog Series
    A very common question I often receive is why I wrote a 30-day series on wait stats. There were two reasons for it. 1) I have been working with SQL Server for a long time and have troubleshot hundreds of SQL Servers for performance tuning. It was a great experience and it taught me a lot of new things. I always documented my experience. After a while I found that I was able to rely completely on my own notes when I was troubleshooting any server. Right then I decided to document my experience for the community. 2) While working with wait stats there were a few things which I thought I knew well, as they were working. However, there was always a fear in the back of my mind: what happens if what I believed was incorrect and I was on the wrong path all the time? There was only one way to get it validated: put my understanding out in front of the community and request further help to improve it. It worked, and it worked beautifully. I received plenty of conversations, emails and comments. I refined my content based on those conversations and made it more relevant and nearly accurate. I guess the above two are the major reasons for beginning my journey of writing the Wait Stats blog series.
    Idea behind Book
    After writing the blog series, I kept receiving a good number of requests that I should convert it into an eBook or a proper book, as reading blog posts is great but does not give a comprehensive understanding of the subject. The very common feedback from users who were beginning with the subject was that they would prefer to read it in a structured manner. After hearing the feedback for more than 4 months, I decided to write a book based on the blog posts. When I envisioned the book, I wanted to make sure it addresses the wait stats concepts from the fundamentals and fills the gaps in the blogs I wrote earlier.
    Rick Morelan and Joes 2 Pros Team
    I must acknowledge my co-author Rick Morelan for his unconditional support in writing this book. I had already authored one book before I published this one. The experience of writing the book was out of this world. Writing blog posts is much, much easier than writing books. The effort it takes to write a book is 100 times more, even though the content is ready. I could not have done it myself without the tremendous support of my co-author and the editor's team. We spent days and days researching and discussing various concepts covered in the book. When we were in doubt we reached out to experts, as well as doing practical reproductions of the scenarios to validate the concepts and claims. After 3 months of continuous hard work we were able to get this book out to the community.
    September 1st - the lucky day
    Well, we had to select a day to publish the book. When the book was completed in the last week of August we felt very glad. We all had worked hard, and having a sample draft book in hand felt like having a newborn baby in our hands.
    Every time my books are published I feel the same joy which I had when my daughter was born. The feeling of holding a new book in hand is (almost) the same feeling as holding a newborn baby. I am excited. For me September 1st has been the luckiest day in my life. My daughter Shaivi was born on September 1st. Since then every September 1st has been an excellent day and has taken me to the next step in life. I believe anything and everything I do on September 1st turns out to be successful and blessed. Rick and I had finished the book in the last week of August. We sent it to the publisher (printer) and asked him to take the book live as soon as possible. We did not decide on any date as we wanted the book to get out as fast as it could. Interestingly enough, the publisher/printer selected September 1st for publishing the book. He published the book on 1st September, and I knew at the same time that this book would go to the next level.
    Book Model - The Most Beautiful Girl
    We were done with the book. We had no budget left for marketing. Rick and I had a long conversation regarding how to spread the word about the book so it could reach many people. While we were talking about marketing, Rick came up with the idea that we should hire the most beautiful girl around who would acknowledge our book and genuinely care for it. It was a difficult task, and Rick asked me to find the most beautiful girl. I am a father, and the most beautiful girl for me is my daughter. This was not a difficult task for me. Rick had given me the task of finding the most beautiful girl, and I just could not think of anyone other than my own daughter. I still do not know what Rick thought about this idea, but I had already made up my mind. You can see the detailed blog post here.
    The Fun Experiments, Book Signing Events
    We had lots of fun moments along the way with this book. We have actually given away more books to people for free than we have sold. We have done book signing events, contests, and plain giveaways when we found people who could benefit from this book. There was never an intention to make money and get rich. We just wanted more and more people to know about this new concept and learn from it. Today when I look back at the earnings, there is nothing much we have earned if you talk about dollars. However, the best reward we have received is the satisfaction and love of the community. The number of emails and conversations we have received for this book so far is in the thousands. We had fun writing this book; it was indeed a very satisfying journey. I have earned lots of friends while learning and exploring.
    Availability
    The book is one year old but still very relevant when it comes to performance tuning. It is available at various online book stores. If you have read the book, do let me know what you think of it. Amazon | Kindle | Flipkart | Indiaplaza
    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority, SQLAuthority Book Review, T SQL, Technology

    Read the article

  • Agile Awakenings and the Rules of Agile

    - by Robert May
    For those that care, you can read my history of management and technology to understand why I think I'm qualified to talk about this at all. It's boring, so feel free to skip it.
    Awakenings
    I first started to play around with the idea of "agile" in 2004 or 2005. I found a book on the Rational Unified Process that I thought was good, and attempted to implement parts of it. I thought I was agile, but really, I wasn't. I still didn't understand the concept of a team. I still wanted to tell the team what to do and how to get it done. I still thought I was smarter than the team.
    After that job, I started work on another project and began helping that team. The first few months were really rough. We were implementing Scrum, which was relatively new to everyone on the team, and, quite frankly, I was doing a poor job of it. I was trying to micro-manage every aspect of the team's work, and we were all miserable.
    The moment of change came when the senior architect bailed on the project. His comment to me was: "This isn't Agile. Where are the stand-ups? Where are the stories?" He was dead on, and I finally woke up. I finally realized that I was the problem! I wasn't trusting the team. I wasn't helping the team. I was being a manager.
    Like many (most?), I was claiming to be Agile and to use Scrum, but I wasn't in fact following the rules of Scrum. Since then, I've done a lot of studying, hands-on practice, coaching of many different teams, and other learning around Scrum, and I have discovered that Scrum has some rules that must be followed for success, even though the process is about continuous improvement. I've been practicing Scrum right for about 4 years now and have helped multiple teams implement it successfully, so what you're about to get is based on experience, rather than just theory.
    The Rules of Scrum
    In my experience, what I've found is that most companies that claim to be doing Scrum or Agile are actually NOT doing either. This stems largely from the belief that they can "adopt the rules of Agile that fit their organization." Sadly, many of them think that this means they can adopt iterations (sprints) and not much else. Either that, or they think they can do whatever they want, or were doing before, and call it Scrum. This is simply not true. Here are some rules that must be followed for you to really be doing Scrum. I'll go into detail on each one of these in future blog posts and update the links here. My intent is that this will help other teams implementing Scrum to see more success.
    - Agile does not allow you to do whatever you want
    - A Product Owner is required
    - A ScrumMaster is required
    - The team must function as a Team, and QA must be part of the team
    - Support from upper management is required
    - A prioritized product backlog is required
    - A prioritized sprint backlog is required
    - Release planning is required
    - Complete sprint planning is required
    - Showcases are required
    - Velocity must be measured
    - Retrospectives are required
    - Daily stand-ups are required
    - Visibility is absolutely required
    For now, I think that's enough, although I reserve the right to add more. If you're breaking any of these rules, you're probably not doing Scrum. There are exceptions to these rules, but until you have practiced Scrum for a while, you don't know what those exceptions are.
    Breaking the Rules
    Many teams break these rules because they are the ones that expose the most pain. Scrum is not Advil. It's not intended to mask the pain, it's intended to cure it.
Let me explain that analogy a bit more.  Recently, my 7 year old son broke his arm, quite severely (see the X-Ray to the right).  That caused him a great deal of pain.  We went first to one doctor, and after viewing the X-Ray, they determined that there was no way that they’d cast the arm at their location.  It was simply too bad of a break for them to deal with.  They did, however, give him some Advil for the pain and put a splint on his arm to stabilize the broken bones.  Within minutes, he was feeling much better.  Had we been stupid, we could have gone home and he’d have been just as happy as ever . . . until the pain medication wore off or one of his siblings touched the splint.  Then, all of that pain would come right back to the top.  Sure, he could make it go away by just taking more Advil and moving the splint out of the way, but that wasn’t going to fix the problem permanently. We ended up in an emergency room with a doctor who could fix his arm.  However, we were warned that the fix was going to be VERY painful, and it was.  Even with heavy sedation (Propofol), my son was in enough pain that he squirmed and wiggled trying to get his arm away from the doctor.  He had to endure this pain in order to have a functional arm. But the setting wasn’t the end.  He had to have several casts, had to have it re-broken once, since the first setting didn’t take and finally was given a clean bill of health. Agile implementation is much like this story.  Agile was developed as a result of people recognizing that the development methodologies that were currently in place simply were ineffective.  However, the fix to the broken development that’s been festering for many years is not painless.  Many people start Agile thinking that things will be wonderful.  They won’t!  Agile is about visibility, and often, it brings great pain to surface.  It causes all of the missed deadlines, the cowboy coders, the coasters, the micro-managers, the lazy, and all of the other problems that are really part of your development process now to become painfully visible to EVERYONE.  Many people don’t like this exposure.  Agile will make the pain better, but not if you remove the cast (the rules above) prematurely and start breaking the rules that expose the most pain.  The healing will take time and is not instant (like Advil).  Figuring out what the true source of pain and fixing it is very valuable to you, your team, and your company.  Remember as you’re doing this that Agile isn’t the source of the pain, it’s really just exposing it.  Find the source. My recommendation is that ALL of these rules are followed for a minimum of six months, and preferably for an entire year, before you decide to break any of these rules.  Get a few good releases under your belt.  Figure out what your velocity is and start firing as a team.  Chances are, after you see agile really in action, you won’t want to break the rules because you’ll see their value. More Reading Jean Tabaka recently published a list of 78 Things I Have Learned in 6 Years of Agile Coaching.  Highly recommended. Technorati Tags: Agile,Scrum,Rules

    Read the article

  • Fan running continuously on HP Pavilion G6 notebook with 12.04.1 LTS, help please?

    - by Ankit
    Fan is running continously on my HP Pavillion G6 notebook with 12.04.1 LTS. My system specifications are:- Ram: 6Gb Graphics Card:- 1 GB (AMD Raedon 64XX). HDD: 540 GB. Please find a list of ACPI errors logs from dmesg as follows:- buffer@ankit:~$ dmesg | grep ACPI -i [ 0.000000] BIOS-e820: 000000009cebf000 - 000000009cfbf000 (ACPI NVS) [ 0.000000] BIOS-e820: 000000009cfbf000 - 000000009cfff000 (ACPI data) [ 0.000000] ACPI: RSDP 00000000000fe020 00024 (v02 HPQOEM) [ 0.000000] ACPI: XSDT 000000009cffe120 00084 (v01 HPQOEM SLIC-MPC 00000001 01000013) [ 0.000000] ACPI: FACP 000000009cffc000 000F4 (v04 HPQOEM SLIC-MPC 00000001 MSFT 01000013) [ 0.000000] ACPI: DSDT 000000009cfec000 0C132 (v01 HP 1670 00000000 MSFT 01000013) [ 0.000000] ACPI: FACS 000000009cf6c000 00040 [ 0.000000] ACPI: ASF! 000000009cffd000 000A5 (v32 HP 1670 00000001 MSFT 01000013) [ 0.000000] ACPI: HPET 000000009cffb000 00038 (v01 HP 1670 00000001 MSFT 01000013) [ 0.000000] ACPI: APIC 000000009cffa000 0008C (v02 HP 1670 00000001 MSFT 01000013) [ 0.000000] ACPI: MCFG 000000009cff9000 0003C (v01 HP 1670 00000001 MSFT 01000013) [ 0.000000] ACPI: SLIC 000000009cfeb000 00176 (v01 HPQOEM SLIC-MPC 00000001 MSFT 01000013) [ 0.000000] ACPI: SSDT 000000009cfea000 00D52 (v01 HP 1670 00001000 MSFT 01000013) [ 0.000000] ACPI: BOOT 000000009cfe8000 00028 (v01 HP 1670 00000001 MSFT 01000013) [ 0.000000] ACPI: ASPT 000000009cfe5000 00034 (v07 HP 1670 00000001 MSFT 01000013) [ 0.000000] ACPI: SSDT 000000009cfe4000 00780 (v01 HP 1670 00003000 INTL 20100121) [ 0.000000] ACPI: SSDT 000000009cfe3000 00996 (v01 HP 1670 00003000 INTL 20100121) [ 0.000000] ACPI: SSDT 000000009cfdd000 0219F (v01 HP 1670 00001000 INTL 20100121) [ 0.000000] ACPI: Local APIC address 0xfee00000 [ 0.000000] ACPI: PM-Timer IO Port: 0x408 [ 0.000000] ACPI: Local APIC address 0xfee00000 [ 0.000000] ACPI: LAPIC (acpi_id[0x01] lapic_id[0x00] enabled) [ 0.000000] ACPI: LAPIC (acpi_id[0x02] lapic_id[0x01] enabled) [ 0.000000] ACPI: LAPIC (acpi_id[0x03] lapic_id[0x02] enabled) [ 0.000000] ACPI: LAPIC (acpi_id[0x04] lapic_id[0x03] enabled) [ 0.000000] ACPI: LAPIC (acpi_id[0x05] lapic_id[0x00] disabled) [ 0.000000] ACPI: LAPIC (acpi_id[0x06] lapic_id[0x00] disabled) [ 0.000000] ACPI: LAPIC (acpi_id[0x07] lapic_id[0x00] disabled) [ 0.000000] ACPI: LAPIC (acpi_id[0x08] lapic_id[0x00] disabled) [ 0.000000] ACPI: IOAPIC (id[0x00] address[0xfec00000] gsi_base[0]) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) [ 0.000000] ACPI: IRQ0 used by override. [ 0.000000] ACPI: IRQ2 used by override. [ 0.000000] ACPI: IRQ9 used by override. 
[ 0.000000] Using ACPI (MADT) for SMP configuration information [ 0.000000] ACPI: HPET id: 0x8086a201 base: 0xfed00000 [ 0.005902] ACPI: Core revision 20110623 [ 0.536006] PM: Registering ACPI NVS region at 9cebf000 (1048576 bytes) [ 0.538423] ACPI FADT declares the system doesn't support PCIe ASPM, so disable it [ 0.538429] ACPI: bus type pci registered [ 0.656088] ACPI: Added _OSI(Module Device) [ 0.656094] ACPI: Added _OSI(Processor Device) [ 0.656098] ACPI: Added _OSI(3.0 _SCP Extensions) [ 0.656103] ACPI: Added _OSI(Processor Aggregator Device) [ 0.660335] ACPI: EC: Look up EC in DSDT [ 0.664416] ACPI: Executed 1 blocks of module-level executable AML code [ 0.728303] [Firmware Bug]: ACPI: BIOS _OSI(Linux) query ignored [ 0.729536] ACPI: SSDT 000000009ce70798 00727 (v01 PmRef Cpu0Cst 00003001 INTL 20100121) [ 0.730622] ACPI: Dynamic OEM Table Load: [ 0.730630] ACPI: SSDT (null) 00727 (v01 PmRef Cpu0Cst 00003001 INTL 20100121) [ 0.760829] ACPI: SSDT 000000009ce71a98 00303 (v01 PmRef ApIst 00003000 INTL 20100121) [ 0.761992] ACPI: Dynamic OEM Table Load: [ 0.761998] ACPI: SSDT (null) 00303 (v01 PmRef ApIst 00003000 INTL 20100121) [ 0.792451] ACPI: SSDT 000000009ce6fd98 00119 (v01 PmRef ApCst 00003000 INTL 20100121) [ 0.793521] ACPI: Dynamic OEM Table Load: [ 0.793528] ACPI: SSDT (null) 00119 (v01 PmRef ApCst 00003000 INTL 20100121) [ 0.872981] ACPI: Interpreter enabled [ 0.872992] ACPI: (supports S0 S3 S4 S5) [ 0.873064] ACPI: Using IOAPIC for interrupt routing [ 0.882723] ACPI: EC: GPE = 0x16, I/O: command/status = 0x66, data = 0x62 [ 0.883072] ACPI: No dock devices found. [ 0.883084] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [ 0.883882] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe]) [ 0.924187] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0._PRT] [ 0.924509] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.RP01._PRT] [ 0.924581] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.RP02._PRT] [ 0.924659] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.RP03._PRT] [ 0.924758] ACPI: PCI Interrupt Routing Table [\_SB_.PCI0.PEG0._PRT] [ 0.924973] pci0000:00: Requesting ACPI _OSC control (0x1d) [ 0.925064] pci0000:00: ACPI _OSC request failed (AE_ERROR), returned control mask: 0x1d [ 0.925069] ACPI _OSC control for PCIe not granted, disabling ASPM [ 0.930212] ACPI: PCI Interrupt Link [LNKA] (IRQs 1 3 4 5 6 10 *11 12 14 15) [ 0.930327] ACPI: PCI Interrupt Link [LNKB] (IRQs 1 3 4 5 6 10 *11 12 14 15) [ 0.930436] ACPI: PCI Interrupt Link [LNKC] (IRQs 1 3 4 5 6 10 *11 12 14 15) [ 0.930547] ACPI: PCI Interrupt Link [LNKD] (IRQs 1 3 4 5 6 *10 11 12 14 15) [ 0.930655] ACPI: PCI Interrupt Link [LNKE] (IRQs 1 3 4 5 6 10 11 12 14 15) *0, disabled. [ 0.930764] ACPI: PCI Interrupt Link [LNKF] (IRQs 1 3 4 5 6 10 11 12 14 15) *0, disabled. [ 0.930873] ACPI: PCI Interrupt Link [LNKG] (IRQs 1 3 4 5 6 10 *11 12 14 15) [ 0.930979] ACPI: PCI Interrupt Link [LNKH] (IRQs 1 3 4 5 6 10 11 12 14 15) *0, disabled. 
[ 0.932142] PCI: Using ACPI for IRQ routing [ 0.967119] pnp: PnP ACPI init [ 0.967151] ACPI: bus type pnp registered [ 0.968356] pnp 00:00: Plug and Play ACPI device, IDs PNP0a08 PNP0a03 (active) [ 0.968516] pnp 00:01: Plug and Play ACPI device, IDs PNP0200 (active) [ 0.968586] pnp 00:02: Plug and Play ACPI device, IDs INT0800 (active) [ 0.968818] pnp 00:03: Plug and Play ACPI device, IDs PNP0103 (active) [ 0.968915] pnp 00:04: Plug and Play ACPI device, IDs PNP0c04 (active) [ 0.969206] system 00:05: Plug and Play ACPI device, IDs PNP0c02 (active) [ 0.969293] pnp 00:06: Plug and Play ACPI device, IDs PNP0b00 (active) [ 0.969418] pnp 00:07: Plug and Play ACPI device, IDs PNP0303 (active) [ 0.969528] pnp 00:08: Plug and Play ACPI device, IDs SYN1e3f SYN1e00 SYN0002 PNP0f13 (active) [ 0.969969] system 00:09: Plug and Play ACPI device, IDs PNP0c02 (active) [ 0.970574] system 00:0a: Plug and Play ACPI device, IDs PNP0c01 (active) [ 0.970617] pnp: PnP ACPI: found 11 devices [ 0.970622] ACPI: ACPI bus type pnp unregistered [ 1.138064] ACPI: Deprecated procfs I/F for AC is loaded, please retry with CONFIG_ACPI_PROCFS_POWER cleared [ 1.138331] ACPI: AC Adapter [ACAD] (off-line) [ 1.139068] ACPI: Lid Switch [LID0] [ 1.139176] ACPI: Power Button [PWRB] [ 1.139286] ACPI: Power Button [PWRF] [ 1.144637] ACPI: Thermal Zone [TZ01] (0 C) [ 1.144677] ACPI: Deprecated procfs I/F for battery is loaded, please retry with CONFIG_ACPI_PROCFS_POWER cleared [ 1.144693] ACPI: Battery Slot [BAT0] (battery present) [ 1.206926] ACPI: Battery Slot [BAT0] (battery present) [ 13.176993] acpi device:1a: registered as cooling_device4 [ 13.179931] acpi device:1b: registered as cooling_device5 [ 13.180221] ACPI: Video Device [VGA] (multi-head: yes rom: no post: no) [ 13.219589] acpi device:20: registered as cooling_device6 [ 13.220851] ACPI: Video Device [GFX0] (multi-head: yes rom: no post: no) [ 1649.915134] i8042 aux 00:08: wake-up capability disabled by ACPI [ 1649.915147] i8042 kbd 00:07: wake-up capability enabled by ACPI [ 1650.931028] r8169 0000:03:00.0: wake-up capability enabled by ACPI [ 1650.954743] ehci_hcd 0000:00:1d.0: wake-up capability enabled by ACPI [ 1650.978733] ehci_hcd 0000:00:1a.0: wake-up capability enabled by ACPI [ 1651.010950] ACPI: Preparing to enter system sleep state S3 [ 1652.251505] ACPI: Low-level resume complete [ 1652.360953] ACPI: Waking up from system sleep state S3 [ 1652.427581] ehci_hcd 0000:00:1a.0: wake-up capability disabled by ACPI [ 1652.435579] ehci_hcd 0000:00:1d.0: wake-up capability disabled by ACPI [ 1652.437887] r8169 0000:03:00.0: wake-up capability disabled by ACPI [ 1652.506660] i8042 kbd 00:07: wake-up capability disabled by ACPI [ 1661.238234] ACPI Error: No handler for Region [CMS0] (ffff8801d5035558) [SystemCMOS] (20110623/evregion-373) [ 1661.238253] ACPI Error: Region SystemCMOS (ID=5) has no handler (20110623/exfldio-292) [ 1661.238268] ACPI Error: Method parse/execution failed [\_SB_.PCI0.LPCB.EC0_._Q33] (Node ffff8801d5054de8), AE_NOT_EXIST (20110623/psparse-536) [ 3151.784288] i8042 aux 00:08: wake-up capability disabled by ACPI [ 3151.784301] i8042 kbd 00:07: wake-up capability enabled by ACPI [ 3152.797676] r8169 0000:03:00.0: wake-up capability enabled by ACPI [ 3152.821379] ehci_hcd 0000:00:1d.0: wake-up capability enabled by ACPI [ 3152.845367] ehci_hcd 0000:00:1a.0: wake-up capability enabled by ACPI [ 3152.877600] ACPI: Preparing to enter system sleep state S3 [ 3154.313213] ACPI: Low-level resume complete [ 3154.422297] ACPI: Waking up from system 
sleep state S3 [ 3154.489692] ehci_hcd 0000:00:1a.0: wake-up capability disabled by ACPI [ 3154.497667] ehci_hcd 0000:00:1d.0: wake-up capability disabled by ACPI [ 3154.505947] r8169 0000:03:00.0: wake-up capability disabled by ACPI [ 3154.568985] i8042 kbd 00:07: wake-up capability disabled by ACPI [ 3162.745149] ACPI Error: No handler for Region [CMS0] (ffff8801d5035558) [SystemCMOS] (20110623/evregion-373) [ 3162.745168] ACPI Error: Region SystemCMOS (ID=5) has no handler (20110623/exfldio-292) [ 3162.745183] ACPI Error: Method parse/execution failed [\_SB_.PCI0.LPCB.EC0_._Q33] (Node ffff8801d5054de8), AE_NOT_EXIST (20110623/psparse-536) [ 6775.723501] ACPI Error: No handler for Region [CMS0] (ffff8801d5035558) [SystemCMOS] (20110623/evregion-373) [ 6775.723519] ACPI Error: Region SystemCMOS (ID=5) has no handler (20110623/exfldio-292) [ 6775.723535] ACPI Error: Method parse/execution failed [\_SB_.PCI0.LPCB.EC0_._Q33] (Node ffff8801d5054de8), AE_NOT_EXIST (20110623/psparse-536) [10388.004760] ACPI Error: No handler for Region [CMS0] (ffff8801d5035558) [SystemCMOS] (20110623/evregion-373) [10388.004778] ACPI Error: Region SystemCMOS (ID=5) has no handler (20110623/exfldio-292) [10388.004801] ACPI Error: Method parse/execution failed [\_SB_.PCI0.LPCB.EC0_._Q33] (Node ffff8801d5054de8), AE_NOT_EXIST (20110623/psparse-536) [10723.591930] i8042 aux 00:08: wake-up capability disabled by ACPI [10723.591942] i8042 kbd 00:07: wake-up capability enabled by ACPI [10724.607624] r8169 0000:03:00.0: wake-up capability enabled by ACPI [10724.631349] ehci_hcd 0000:00:1d.0: wake-up capability enabled by ACPI [10724.655339] ehci_hcd 0000:00:1a.0: wake-up capability enabled by ACPI [10724.687572] ACPI: Preparing to enter system sleep state S3 [10726.123176] ACPI: Low-level resume complete [10726.232181] ACPI: Waking up from system sleep state S3 [10726.303653] ehci_hcd 0000:00:1a.0: wake-up capability disabled by ACPI [10726.311648] ehci_hcd 0000:00:1d.0: wake-up capability disabled by ACPI [10726.315734] r8169 0000:03:00.0: wake-up capability disabled by ACPI [10726.379287] i8042 kbd 00:07: wake-up capability disabled by ACPI [10734.393523] ACPI Error: No handler for Region [CMS0] (ffff8801d5035558) [SystemCMOS] (20110623/evregion-373) [10734.393542] ACPI Error: Region SystemCMOS (ID=5) has no handler (20110623/exfldio-292) [10734.393557] ACPI Error: Method parse/execution failed [\_SB_.PCI0.LPCB.EC0_._Q33] (Node ffff8801d5054de8), AE_NOT_EXIST (20110623/ps The continuous sound from the fan is very annoying; any help would be highly appreciated.

    Read the article

  • SQLAuthority News – Pluralsight Course Review – Practices for Software Startups – Part 2 of 2

    - by pinaldave
    This is the second part of the two-part series on the Practices for Software Startup Pluralsight Course. Please read the first part of this series over here. The course is written by Stephen Forte (Blog | Twitter). Stephen Forte is the Chief Strategy Officer of the venture-backed company, Telerik. Personal Learning Schedule After these three sessions it was 6:30 am and time to do my own blog. But for the rest of the day, I kept thinking about the course, and wanted to go back and finish. I was wishing that I had woken up at 3 am so I could finish all at one go. All day long I was digesting what I had learned. At 10 pm, after my daughter had gone to bed, I signed on again. I was not disappointed by the long wait. As I mentioned before, Stephen has started four to six companies, and all of them are very successful today. Here is the video I promised yesterday – it discusses the importance of Right Sizing Your Startup. The Heartbeat of Startup – Technology Stephen has combined all technology knowledge into one 30 minute session. He discussed how to start your project, how to deal with opinions, and how to deal with multiple ideas – every start-up has multiple directions it can go. He spent a lot of time emphasizing how to decide which direction to go and which will be the best for you. He called it a continuous development cycle. One of the biggest hazards for a start-up company is one person deciding the direction the company will go, until down the road another team member announces that there is a glitch in their part of the work and that everyone will have to start over. Even though a team of two or five people can move quickly, often the decision has gone too long and cannot be easily fixed. Stephen used an example from his own life: he was biased for one type of technology, and his teammate for another. In the end they opted for his teammate's choice, and in the end it was a good decision, even though he was unfamiliar with that particular program. He argues that technology should not be a barrier to progress, and that you cannot rely on your experience only. This really spoke to me because I am a big fan of SQL, but I know there is more out there, and I should be more open to it. I give my thanks to Stephen; I learned something in this module besides startups. Money, Success and Epic Win! The longest, but most interesting, module was funding your start-up. You need to fund the start-up right at the very beginning; if not done right you will run into trouble. The good news is that a few years ago start-ups required a lot more money – think millions of dollars – but now start-ups can get off the ground for thousands. Stephen used an example of a company that years ago would have needed a million dollars, but today could be started for $600. It is true that things have changed, but you still need money. For $600 you can start small and add dynamically, as needed. But the truth is that if you have $600, $6000, or $6 million, it will be spent. Don't think of it as trying to save money, think of it as investing in your future. You will need money, and you will need to (quickly) decide what you do with the money: shares, stakeholders, investing in a team, hiring a CEO. This is so important because once you have money and start the company, the company IS your money. It is your biggest currency – having a percentage of ownership in the company.
Investors will want percentages as repayment for their investment, and they will want a say in the business as well. You will have to decide how far you will dilute your shares, and how the company will be divided, if at all. If you don't plan in advance, you will find that after gaining three or four investors, suddenly you are the minority owner in your own dream. You need to understand funding carefully. This single module is worth all the money you would have spent on the whole course alone. I encourage everyone to listen to this single module even if they don't watch any of the others. Press End to Start the Game – Exits! The final module is exit strategies. You did all this work, dealt with all the political and legal issues. What are you going to get out of it? The answer is simple: money. Maybe you want your company to be bought out, for your talent to bring you a profit. You can sell the company to someone and still head it. Many options are available. You could sell and still work as an employee but no longer own the company. There are many exit strategies. This is where all your hard work comes into play. It is important not to feel fooled at any step. So many good ideas end up in the garbage because of poor planning, so if you find yourself successful, don't blow it at this step! The exit is important. I thought that this aspect of the course was completely unique, and I loved Stephen's point of view. I was lost deep in thought after this module ended. I actually took two hours' worth of notes on this section alone – and it was only a three-hour course. I am planning on attending this course one more time next week, just to catch up on all the small bits of wisdom I'm sure I missed. Thank you, Stephen, for sharing your real-world experience with us! I recommend that everyone attend this course, even if they don't want to begin their own start-up company. It was indeed a long day for me. Do not forget to read part 1 of this story and attend the Practices for Software Startup Pluralsight course. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Best Practices, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – 2012 – All Download Links in Single Page – SQL Server 2012

    - by pinaldave
    SQL Server 2012 RTM was just announced, and recently I wrote about all the SQL Server 2012 Certifications on a single page. As feedback, I received suggestions to have a single page where everything about SQL Server 2012 is listed. I will keep this page updated as new updates are announced.
    Microsoft SQL Server 2012 Evaluation – Microsoft SQL Server 2012 enables a cloud-ready information platform that will help organizations unlock breakthrough insights across the organization.
    Microsoft SQL Server 2012 Express – Microsoft SQL Server 2012 Express is a powerful and reliable free data management system that delivers a rich and reliable data store for lightweight Web Sites and desktop applications.
    Microsoft SQL Server 2012 Feature Pack – The Microsoft SQL Server 2012 Feature Pack is a collection of stand-alone packages which provide additional value for Microsoft SQL Server 2012.
    Microsoft SQL Server 2012 Report Builder – Report Builder provides a productive report-authoring environment for IT professionals and power users. It supports the full capabilities of SQL Server 2012 Reporting Services.
    Microsoft SQL Server 2012 Master Data Services Add-in For Microsoft Excel – The Master Data Services Add-in for Excel gives multiple users the ability to update master data in a familiar tool without compromising the data's integrity in Master Data Services.
    Microsoft SQL Server 2012 Performance Dashboard Reports – The SQL Server 2012 Performance Dashboard Reports are Reporting Services report files designed to be used with the Custom Reports feature of SQL Server Management Studio.
    Microsoft SQL Server 2012 PowerPivot for Microsoft Excel® 2010 – Microsoft PowerPivot for Microsoft Excel 2010 provides ground-breaking technology: fast manipulation of large data sets, streamlined integration of data, and the ability to effortlessly share your analysis through Microsoft SharePoint.
    Microsoft SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint Technologies 2010 – The SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint 2010 technologies allows you to integrate your reporting environment with the collaborative SharePoint 2010 experience.
    Microsoft SQL Server 2012 Semantic Language Statistics – The Semantic Language Statistics Database is a required component for the Statistical Semantic Search feature in Microsoft SQL Server 2012.
    Microsoft® SQL Server 2012 FileStream Driver – Windows Logo Certification – Catalog file for the Microsoft SQL Server 2012 FileStream Driver that is certified for Windows Server 2008 R2. It meets Microsoft standards for compatibility and recommended practices with the Windows Server 2008 R2 operating systems.
    Microsoft SQL Server StreamInsight 2.0 – Microsoft StreamInsight is Microsoft's Complex Event Processing technology to help businesses create event-driven applications and derive better insights by correlating event streams from multiple sources with near-zero latency.
    Microsoft JDBC Driver 4.0 for SQL Server – Download the Microsoft JDBC Driver 4.0 for SQL Server, a Type 4 JDBC driver that provides database connectivity through the standard JDBC application program interfaces (APIs) available in Java Platform, Enterprise Edition 5 and 6.
    Data Quality Services Performance Best Practices Guide – This guide focuses on a set of best practices for optimizing the performance of Data Quality Services (DQS).
    Microsoft Drivers 3.0 for SQL Server for PHP – The Microsoft Drivers 3.0 for SQL Server for PHP provide connectivity to Microsoft SQL Server from PHP applications.
    Product Documentation for Microsoft SQL Server 2012 for firewall and proxy restricted environments – The Microsoft SQL Server 2012 setup installs only the Help Viewer…install any documentation. All of the SQL Server documentation is available online.
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • 5 Steps to getting started with IronRuby

    - by Eric Nelson
    IronRuby is an Open Source implementation of the Ruby programming language for .NET, heavily relying on Microsoft's Dynamic Language Runtime. The project's #1 goal is to be a true Ruby implementation, meaning it runs existing Ruby code. Check out this summary of using the Ruby standard library and 3rd party libraries in IronRuby. IronRuby has tight integration with .NET, so any .NET types can be used from IronRuby and the IronRuby runtime can be embedded into any .NET application (see the short hosting sketch below). These 5 steps should get you nicely up and running on IronRuby – OR … you could just watch a video session from the lead developer which took place earlier this month (March 2010 - 60mins). But the 5 steps will be quicker :-)
    Step 1 – Install IronRuby. You can install IronRuby automatically using an MSI or manually. For simplicity I would recommend the MSI install. TIP: As of the 25th of March IronRuby has not quite shipped. The download above is a Release Candidate (RC) which means it is still undergoing final testing by the team. You will need to uninstall this version (RC3) once the final release is available. The good news is that uninstalling IronRuby RC3 will work without a hitch as the MSI does relatively little.
    Step 2 – Install an IronRuby friendly editor. You will need to install an editor to work with IronRuby as there is no designer support for IronRuby inside Visual Studio. There are many editors to choose from, but I would recommend you go with either: SciTE (Download the MSI): a lightweight text editor which is simple to get up and running. SciTE understands Ruby syntax and allows you to easily run IronRuby code within the editor with a small change to the config file. Or SharpDevelop 3.2 (Download the MSI): an open source development environment for C#, VB, Boo and now IronRuby. IronRuby support is new but it does include integrated debugging. You might also want to check out the main site for SharpDevelop. TIP: There are commercial tools for Ruby development which offer richer support such as intellisense. They can be coerced into working with IronRuby. A good one to start with is RubyMine, which needs some small changes to make it work with IronRuby.
    Step 3 – Run the IronRuby Tutorial. Run through the IronRuby tutorial which is included in the IronRuby download. It covers the basics of the Ruby language and how IronRuby integrates with .NET. In a typical install it will end up at C:\Program Files\IronRuby 0.9.4.0\Samples\Tutorial, which will give you the tutorial implemented in .NET and Ruby. TIP: You might also want to check out these two introductory posts: Using IronRuby and .NET to produce the 'Hello World of WPF' and What's IronRuby, and how do I put it on Rails?
    Step 4 – Get some good books to read. Get a great book on Ruby and IronRuby. There are several free ebooks on Ruby which will help you learn the language. The Little Book of Ruby is a good place to start. I would also recommend you purchase IronRuby Unleashed (Buy on Amazon UK | Buy on Amazon USA). You might also want to check out this mini-review. Other books are due out soon, including IronRuby in Action. TIP: Also check out the official documentation for using .NET from IronRuby.
    Step 5 – Keep an eye on the team blogs. Keep an eye on the IronRuby team blogs, including Jimmy Schementi, Jim Deville and Tomas Matousek (full list). TIP: And keep a watch out for the final release of IronRuby – due anytime soon!
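    To make the ".NET integration" point above concrete, here is a minimal, hedged sketch of hosting the IronRuby engine from C# via the DLR hosting API. It assumes references to IronRuby.dll, IronRuby.Libraries.dll and Microsoft.Scripting.dll from the IronRuby install; the exact assembly versions depend on the release you installed.
    using System;
    using IronRuby;                      // Ruby.CreateEngine lives here
    using Microsoft.Scripting.Hosting;   // ScriptEngine

    class HostingSketch
    {
        static void Main()
        {
            // Spin up a Ruby engine inside this .NET process
            ScriptEngine engine = Ruby.CreateEngine();

            // Execute Ruby snippets and marshal the results back as CLR types
            int sum = engine.Execute<int>("[1, 2, 3].inject { |a, b| a + b }");
            string shout = engine.Execute<string>("'ironruby'.upcase");

            Console.WriteLine("{0} / {1}", sum, shout); // 6 / IRONRUBY
        }
    }
    The same engine instance can be reused for many scripts, which is the usual pattern when embedding a scripting layer in a larger .NET application.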

    Read the article

  • Visual Studio & TFS – List of addins, extensions, patches and hotfixes – Latest and Greatest

    - by terje
    This post is a list of the addins and extensions we (I) recommend for use in Inmeta. It comes up all the time – what to install, where the download sites are, etc. – and thus I thought it better to post it here and keep it updated. The basics are Visual Studio 2010 connected to a Team Foundation Server 2010. The edition of Visual Studio I use is the Ultimate Edition, but as many stay with the Premium Edition I've marked the extensions which only work with the Ultimate. I've also split the group into Recommended (which means Required), Optional (which means Recommended) and Nice to Have (which means Optional). The focus is to get a setup which can be used for a complete coding experience for the whole ALM process. The Code Gallery is found either through the Tools/Extension Manager menu in Visual Studio or through this link. The ones to really download are those in the Recommended category; then consider the Optional ones based on your needs. The list of course reflects what I use for my work, so it is by no means complete, and for some of the tools there are equally useful alternatives. The components directly associated with Visual Studio from Microsoft should be common, see the Microsoft column.
    Each entry below lists: Product – Available on Code Gallery – Latest Version – License – Recommendation – Applicable to – From Microsoft.
    TFS Power Tools Sept 2010 – Complete setup msi on link, split into parts on CG – Sept 2010 – Free – Recommended – TFS integration – Yes
    Productivity Power Tools – Yes – 10.0.11019.3 – Free – Recommended – Coding – Yes
    Code Contracts – No – 1.4.30903 – Free – Recommended – Coding & Quality – Yes
    Code Contracts Editor Extensions – Yes – 1.4.30903 – Free – Recommended – Coding & Quality – Yes
    VSCommands – Yes – 3.6.4.1 (Lite version) – Free (Good enough) – Nice to have – Coding – No
    Power Commands – Yes – 1.0.2.3 – Free – Recommended – Coding – Yes
    FeaturePack 2 – No, MSDN Subscriber download under Visual Studio 2010 FP2 – Part of MSDN Subscription – Recommended – Modeling & Testing – Yes
    ReSharper – No (Trial only) – 5.1.1 – Licensed – Recommended – Coding & Quality – No
    dotTrace – No – 4.0.1 – Licensed – Optional – Quality – No
    NDepends – No (Trial only) – Licensed – Optional – Quality – No
    tangible T4 editor – Yes – 1.950 (Lite version) – Free (Good enough) – Optional – Coding (T4 templates) – No
    Reflector – No (Trial of Pro version only) – 6.5 (Lite version) – Free (Good enough) – Recommended – Coding/Investigation – No
    LinqPad – No – 4.26.2 – Licensed – Nice to have – Coding – No
    Beyond Compare – No – 3.1.11 – Licensed – Recommended – Coding/Investigation – No
    Pex and Moles – No (Moles available alone on CG); complete on MSDN Subscriber download under Visual Studio 2010 – 0.94.51023 – Part of MSDN Subscription – Optional – Coding & Unit Testing – Yes
    ApexSQL – No – Licensed – Nice to have – SQL – No
    Some important patches, upgrades and fixes (Product – Date – Information – Recommendation – Applicable to):
    Scrolling context menu KB2345133 and KB2413613 – October 2010 – Here – Recommended – Visual Studio
    MTM Patch – October 2010 – Here and here, KB2387011 – Recommended (if you use MTM)
    MTM Data warehouse fix – June 2010 – Iteration dates fail with SQL 2008 R2 (KB2222312); affects Burndown chart in Agile workbook – Only for SQL 2008 R2 – Server
    Upgrade 2008 to 2010 issue and hotfix – August 2010 – Fixes problems with labels and branches which are lost during upgrade; apply before upgrade. Note: this has been fixed in the latest re-release of the TFS Server dated Aug 5th 2010 (see here); recommends downloading the latest bits – Only if upgrading from 2008 from earlier bits – Server

    Read the article

  • A Guided Tour of Complexity

    - by JoshReuben
    I just re-read Complexity – A Guided Tour by Melanie Mitchell, protégée of Douglas Hofstadter (author of "Gödel, Escher, Bach") http://www.amazon.com/Complexity-Guided-Tour-Melanie-Mitchell/dp/0199798109/ref=sr_1_1?ie=UTF8&qid=1339744329&sr=8-1 Here are some notes and links:   Complexity science evolved from Cybernetics, General Systems Theory and Synergetics; some interesting transdisciplinary fields to investigate: Chaos Theory - http://en.wikipedia.org/wiki/Chaos_theory – small differences in initial conditions (such as those due to rounding errors in numerical computation) yield widely diverging outcomes for chaotic systems, rendering long-term prediction impossible. System Dynamics / Cybernetics - http://en.wikipedia.org/wiki/System_Dynamics – study of how feedback changes system behavior. Network Theory - http://en.wikipedia.org/wiki/Network_theory – leverage Graph Theory to analyze symmetric / asymmetric relations between discrete objects. Algebraic Topology - http://en.wikipedia.org/wiki/Algebraic_topology – leverage abstract algebra to analyze topological spaces. There are limits to deterministic systems & to computation. Chaos Theory definitely applies to training an ANN (artificial neural network) – different weights will emerge depending upon the random selection of the training set. In recursive Non-Linear systems http://en.wikipedia.org/wiki/Nonlinear_system – output is not directly inferable from input. E.g. a Logistic map: X(t+1) = R · X(t) · (1 − X(t)) (a small code sketch of this map appears at the end of these notes). Different types of bifurcations, attractor states and oscillations may occur – e.g. a Lorenz Attractor http://en.wikipedia.org/wiki/Lorenz_system Feigenbaum Constants http://en.wikipedia.org/wiki/Feigenbaum_constants express ratios in a bifurcation diagram for a non-linear map – the convergent limit of R (the rate of period-doubling bifurcations) is 4.6692016 Maxwell's Demon - http://en.wikipedia.org/wiki/Maxwell%27s_demon - the Second Law of Thermodynamics has only a statistical certainty – the universe (and thus information) tends towards entropy. While any computation can theoretically be done without expending energy, with finite memory, the act of erasing memory is permanent and increases entropy. Life & thought are a counter-example to the universe's tendency towards entropy. Leo Szilard and later Claude Shannon came up with the Information Theory of Entropy - http://en.wikipedia.org/wiki/Entropy_(information_theory) whereby Shannon entropy quantifies the expected value of a message's information in bits in order to determine channel capacity and leverage Coding Theory (compression analysis). Ludwig Boltzmann came up with Statistical Mechanics - http://en.wikipedia.org/wiki/Statistical_mechanics – whereby our Newtonian perception of continuous reality is a probabilistic and statistical aggregate of many discrete quantum microstates. This is relevant for Quantum Information Theory http://en.wikipedia.org/wiki/Quantum_information and the Physics of Information - http://en.wikipedia.org/wiki/Physical_information. Hilbert's Problems http://en.wikipedia.org/wiki/Hilbert's_problems pondered whether mathematics is complete, consistent, and decidable (the Decision Problem – http://en.wikipedia.org/wiki/Entscheidungsproblem – is there always an algorithm that can determine whether a statement is true). Gödel's Incompleteness Theorems http://en.wikipedia.org/wiki/G%C3%B6del's_incompleteness_theorems proved that mathematics cannot be both complete and consistent (e.g. "This statement is not provable").
Turing, through the use of Turing Machines (http://en.wikipedia.org/wiki/Turing_machine symbol processors that can prove mathematical statements) and Universal Turing Machines (http://en.wikipedia.org/wiki/Universal_Turing_machine Turing Machines that can emulate any other Turing Machine by accepting programs as well as data as input symbols), showed that computation is limited by demonstrating the Halting Problem http://en.wikipedia.org/wiki/Halting_problem (it is not possible to know when a program will complete – you cannot build an infinite loop detector). You may be used to thinking of 1 / 2 / 3 dimensional systems, but Fractal http://en.wikipedia.org/wiki/Fractal systems are defined by self-similarity & have non-integer Hausdorff Dimensions !!!  http://en.wikipedia.org/wiki/List_of_fractals_by_Hausdorff_dimension – the fractal dimension quantifies the number of copies of a self-similar object at each level of detail – e.g. Koch Snowflake - http://en.wikipedia.org/wiki/Koch_snowflake Definitions of complexity: size, Shannon entropy, Algorithmic Information Content (http://en.wikipedia.org/wiki/Algorithmic_information_theory - size of shortest program that can generate a description of an object), Logical depth (amount of info processed), thermodynamic depth (resources required). Complexity is statistical and fractal. John Von Neumann's other machine was the Self-Reproducing Automaton http://en.wikipedia.org/wiki/Self-replicating_machine. Cellular Automata http://en.wikipedia.org/wiki/Cellular_automaton are an alternative form of Universal Turing Machine to traditional Von Neumann machines, where grid cells are locally synchronized with their neighbors according to a rule. Conway's Game of Life http://en.wikipedia.org/wiki/Conway's_Game_of_Life demonstrates various emergent constructs such as "Glider Guns" and "Spaceships". Cellular Automatons are not practical because logical ops require a large number of cells – wasteful & inefficient. There are no compilers or general program languages available for Cellular Automatons (as far as I am aware). Random Boolean Networks http://en.wikipedia.org/wiki/Boolean_network are extensions of cellular automata where nodes are connected at random (not to spatial neighbors) and each node has its own rule –> they demonstrate the emergence of complex & self-organized behavior. Stephen Wolfram's (creator of Mathematica, so give him the benefit of the doubt) New Kind of Science http://en.wikipedia.org/wiki/A_New_Kind_of_Science proposes the universe may be a discrete Finite State Automaton http://en.wikipedia.org/wiki/Finite-state_machine whereby reality emerges from simple rules. I am 2/3 through this book. It is feasible that the universe is quantum discrete at the Planck scale and that it computes itself – Digital Physics: http://en.wikipedia.org/wiki/Digital_physics – a simulated reality? Anyway, all behavior is supposedly derived from simple algorithmic rules & falls into 4 patterns: uniform, nested / cyclical, random (Rule 30 http://en.wikipedia.org/wiki/Rule_30) & mixed (Rule 110 - http://en.wikipedia.org/wiki/Rule_110 localized structures – it is this that is interesting). Interaction between colliding propagating signal inputs is then information processing. Wolfram proposes the Principle of Computational Equivalence - http://mathworld.wolfram.com/PrincipleofComputationalEquivalence.html - all processes that are not obviously simple can be viewed as computations of equivalent sophistication.
Meaning in information may emerge from analogy & conceptual slippages – see the CopyCat program: http://cognitrn.psych.indiana.edu/rgoldsto/courses/concepts/copycat.pdf Scale Free Networks http://en.wikipedia.org/wiki/Scale-free_network have a distribution governed by a Power Law (http://en.wikipedia.org/wiki/Power_law - much more common than Normal Distribution). They are characterized by hubs (resilience to random deletion of nodes), heterogeneity of degree values, self similarity, & small world structure. They grow via preferential attachment http://en.wikipedia.org/wiki/Preferential_attachment – tipping points triggered by positive feedback loops. 2 theories of cascading system failures in complex systems are Self-Organized Criticality http://en.wikipedia.org/wiki/Self-organized_criticality and Highly Optimized Tolerance http://en.wikipedia.org/wiki/Highly_optimized_tolerance. Computational Mechanics http://en.wikipedia.org/wiki/Computational_mechanics – use of computational methods to study phenomena governed by the principles of mechanics. This book is a great intuition pump, but does not cover the more mathematical subject of Computational Complexity Theory – http://en.wikipedia.org/wiki/Computational_complexity_theory I am currently reading this book on this subject: http://www.amazon.com/Computational-Complexity-Christos-H-Papadimitriou/dp/0201530821/ref=pd_sim_b_1   stay tuned for that review!
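    As a small illustration of the logistic map mentioned in these notes, here is a minimal C# sketch. The specific R values, starting point and iteration counts are arbitrary choices for demonstration, not anything from the book: for R around 2.5 the orbit settles to a fixed point, while for R = 4.0 it wanders chaotically and tiny changes in the initial value quickly diverge.
    using System;

    class LogisticMap
    {
        static void Main()
        {
            foreach (double r in new[] { 2.5, 3.2, 4.0 })
            {
                double x = 0.2;
                for (int t = 0; t < 50; t++)   // let transients die out
                    x = r * x * (1 - x);

                Console.Write("R = {0}: ", r);
                for (int t = 0; t < 5; t++)    // print a few values of the orbit
                {
                    x = r * x * (1 - x);
                    Console.Write("{0:F4} ", x);
                }
                Console.WriteLine();
            }
        }
    }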

    Read the article

  • OBIEE 11.1.1 - Introduction to OBIEE 11g Full Sample App

    - by user809526
    Isn't it nice to discover OBIEE 11g around a nice "How To" catalog of features? to observe OBI and Essbase relationships at work? to discover TimesTen? The OBIEE 11g Full Sample App (FSA) is a comprehensive collection of examples designed to demonstrate the latest Oracle BIEE 11g capabilities and design best practices: Enhanced visualizations as Geo-spacial maps and interactive dashboards, Action Framework,  BI Publisher, Scorecard and Strategy Management, Mobile style sheets, Semantic layer modeling, Multi-source federation, Integration with products such as Essbase, Oracle OLAP, ODM, TimesTen, ODI and more The FSA is intended to be comprehensive, it is big (see CAVEAT below). The FSA is not an Oracle product, it is a good will free deployment of OBIEE/Essbase designed to exemplify OBIEE features, infrastructure and security around the Fusion Middleware components. Its contents and code are distributed free for demonstrative purposes only. It is neither maintained nor supported by Oracle as a licensed product. The OBIEE Full Sample App is independent of the default Sample App that comes with the OBIEE product. BENEFITS The FSA helps as a demonstrator of OBIEE 11g best practices, a tutorial, an environment "Test & Scrap", a SR bench (regression, conflicts), a tuning bench, a quick ready made POC seed for projects, a security options environment, ... The FSA - Is organized around a catalog of functional features - Has been deployed over 1000 times, it should be stable RELEASE The Full Sample App (V107) is bound to OBIEE 11.1.1.5 and Essbase 11.1.2.1 (November 2011). The FSA release dates are independent of the Product GA date (OBIEE). In early December 2011, a new functional Patch (V110) is released. It is easily applied (in less than 15 mins) on top of OBIEE SampleApp 11.1.1.5 (V107). The patch (V110) includes additional functional examples:        1. Web Catalog Statistics Application: Provides detailed insight into your web catalog content, dormant catalog objects, webcat impact analysis for metadata changes and more        2. Data inflation Scripts: A set of simple SQL procedures to quickly inflate SampleApp Fact and Dimension data to millions of records in a few minutes        3. Public Content Extensions Framework: A patching framework for public examples and contributions leveraging SampleApp        4. Additional report examples (including bridge report, external chart integrations) and bug fixes DISTRIBUTION as VBox image (November 2011) The ready made VBox image is designed to run on Virtual Box. It can be converted to VMware (see another BLOG). 1/ http://www.oracle.com/technetwork/middleware/bi-foundation/obiee-samples-167534.html VBox Image Deployment Guide Sampleapp_v107_GA.ovf - VBox image key file The above http URL provides the user:password for the ftp URLs below. 2/ ftp://user:[email protected]/static/SampleAppV107/ 12 "7-zip" files Sampleapp_v107_GA_7_20.7z.001 -> .012 We recommend 7-zip file manager for unzipping (http://www.7-zip.org/). Select Unzip here option, it will create the contents under a directory named "SampleApp_10722". On Windows, it is important to download and save zip file under the root directory (e.g. C:\ or D:\) because of possible long pathnames. 3/ ftp://user:[email protected]/static/SampleAppV107/Unzipped_Version/ 4 files Sampleapp_v107_GA-disk[1234].vmdk Important note: Check the provided checksums (md5sum). Please do it! 
DISTRIBUTION as Installation files for existing OBI 11.1.1.5 (November 2011) http://www.oracle.com/technetwork/middleware/bi-foundation/obiee-samples-167534.html Install files Deployment Guide SampleApp_10722_1.zip - 198 MB CAVEAT Many computers have RAM chip problems that often stay silent ... until you manipulate big files. It is strongly advised you run some memory check program, e.g. MEMTEST in the GRUB boot manager. Running md5sum repeatedly on the very same big file must give a consistent result; otherwise a hardware memory problem is suspected. For VirtualBox, you should most likely enable VT-X (Vanderpool) hardware virtualization in the BIOS. 80 GB of free disk space is required to perform the VBox image installation safely. A virtual machine with a minimum of 6 to 7 GB of memory fits the needs of combined OBIEE and Essbase execution.
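    If you prefer to script the checksum verification mentioned above instead of running md5sum by hand, here is a minimal C# sketch. The local path is an illustrative assumption (the part file name is one of the downloads listed above); compare the printed hash against the published checksum.
    using System;
    using System.IO;
    using System.Security.Cryptography;

    class ChecksumCheck
    {
        static void Main()
        {
            // Hypothetical local path to one of the downloaded image parts
            string path = @"C:\Downloads\Sampleapp_v107_GA_7_20.7z.001";

            using (var md5 = MD5.Create())
            using (var stream = File.OpenRead(path))
            {
                byte[] hash = md5.ComputeHash(stream);
                // Print the hash as lowercase hex, the same format md5sum uses
                Console.WriteLine(BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant());
            }
        }
    }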

    Read the article

  • JustMock and Moles – A short overview for TDD alpha geeks

    - by RoyOsherove
    People have been lurking near my house, asking me to write something about Moles and JustMock, so I'll try to be as objective as possible, taking into account the fact that I work at Typemock. If I were NOT working at Typemock I'd write: JustMock JustMock tries to be Typemock at so many levels it's not even funny. Technically they work the same and the API almost looks like it's a search-and-replace job based on the Isolator API (awesome compliment!), but JustMock still has too many growing pains and bugs to be usable. Also, JustMock is missing a lot of the legacy abilities such as non-public faking, faking all types and various other things that are really needed in real legacy code. The biggest thing (in terms of isolation integration) is that it does not integrate with other profilers such as coverage tools, NCover etc. When JustMock comes out of beta, I feel that it should cost about half of what Isolator costs, as it currently provides about half the abilities. Moles Moles is an add-on of Pex and was originally only intended to work within the Pex environment. It started as a research project and now it's a power-tool for VS (so it's a separate install). Now it's its own little stubbing framework. It's not really an Isolation framework in the classic sense, because it does not provide any kind of built-in API to verify object interactions. You have to use manual flags all on your own to do that. It generates two types of classes per assembly: Manual Stubs (just like you'd hand-code them) and Mole classes. Each Mole class is a special API to change and break the behavior of the corresponding type, so MDateTime is how you change behavior for DateTime (see the short detour example below). In that sense the API is all over the place, and it can become highly unreadable and unmentionable over time in your test. Also, the Moles API isn't really designed to deal with real legacy code. It only deals with public types and methods. Anything internal or private is ignored and you can't change its behavior. You also can't control static constructors. That takes about 95% of legacy scenarios out of the picture if that's what you're trying to use it for. Personally, I found it hard to get used to the idea of two parallel APIs for different abilities, and when to choose which – and I know this stuff. I would expect more usability from the API to make it more widely used. I don't think that Moles is planning to go that route. Publishing it as an Isolation framework is really an afterthought of a tool that was designed with a specific task in mind, and generic Isolation isn't it. Its only hope is DEQ – a simple code example that shows a simple Isolation API built on the Moles generic engine. Moles can and should be used for very simple cases of detouring functionality such as simple static methods or interfaces and virtual functions (like Rhino Mocks and Moq do).   Oh, Wait. Ah, good thing I work at Typemock. I won't write all that. I'll just write: JustMock and Moles are great tools that enlarge the market space for isolation related technologies, and they prove that the idea of productivity and unit testing can go hand in hand and get people hooked. I look forward to competing with them in this growing market.
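    For readers who have not seen Moles, here is a hedged sketch of what the MDateTime detour mentioned above looks like in a test. It follows the standard Moles documentation example and assumes a mole assembly has been generated for mscorlib (by adding an mscorlib.moles file to the test project); the exact generated namespace can differ from System.Moles depending on your setup.
    using System;
    using System.Moles;                                  // generated moles for mscorlib (assumption)
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class MolesDetourSketch
    {
        [TestMethod]
        [HostType("Moles")]   // run the test under the Moles detour host
        public void Year_IsDetoured()
        {
            // Replace the static DateTime.Now getter for the duration of this test
            MDateTime.NowGet = () => new DateTime(2000, 1, 1);

            Assert.AreEqual(2000, DateTime.Now.Year);
        }
    }
    The point of the sketch is the style: behavior is swapped by assigning delegates to generated M* classes, which is where the "API all over the place" criticism above comes from.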

    Read the article

  • Getting a Database into Source Control

    - by Grant Fritchey
    For any number of reasons, from simple auditing, to change tracking, to automated deployment, to integration with application development processes, you’re going to want to place your database into source control. Using Red Gate SQL Source Control this process is extremely simple. SQL Source Control works within your SQL Server Management Studio (SSMS) interface.  This means you can work with your databases in any way that you’re used to working with them. If you prefer scripts to using the GUI, not a problem. If you prefer using the GUI to having to learn T-SQL, again, that’s fine. After installing SQL Source Control, this is what you’ll see when you open SSMS:   SQL Source Control is now a direct piece of the SSMS environment. The key point initially is that I currently don’t have a database selected. You can even see that in the SQL Source Control window where it shows, in red, “No database selected – select a database in Object Explorer.” If I expand my Databases list in the Object Explorer, you’ll be able to immediately see which databases have been integrated with source control and which have not. There are visible differences between the databases as you can see here:   To add a database to source control, I first have to select it. For this example, I’m going to add the AdventureWorks2012 database to an instance of the SVN source control software (I’m using uberSVN). When I click on the AdventureWorks2012 database, the SQL Source Control screen changes:   I’m going to need to click on the “Link database to source control” text which will open up a window for connecting this database to the source control system of my choice.  You can pick from the default source control systems on the left, or define one of your own. I also have to provide the connection string for the location within the source control system where I’ll be storing my database code. I set these up in advance. You’ll need two. One for the main set of scripts and one for special scripts called Migrations that deal with different kinds of changes between versions of the code. Migrations help you solve problems like having to create or modify data in columns as part of a structural change. I’ll talk more about them another day. Finally, I have to determine if this is an isolated environment that I’m going to be the only one use, a dedicated database. Or, if I’m sharing the database in a shared environment with other developers, a shared database.  The main difference is, under a dedicated database, I will need to regularly get any changes that other developers have made from source control and integrate it into my database. While, under a shared database, all changes for all developers are made at the same time, which means you could commit other peoples work without proper testing. It all depends on the type of environment you work within. But, when it’s all set, it will look like this: SQL Source Control will compare the results between the empty folders in source control and the database, AdventureWorks2012. You’ll get a report showing exactly the list of differences and you can choose which ones will get checked into source control. Each of the database objects is scripted individually. You’ll be able to modify them later in the same way. Here’s the list of differences for my new database:   You can select/deselect all the objects or each object individually. You also get a report showing the differences between what’s in the database and what’s in source control. 
If there was already a database in source control, you’d only see changes to database objects rather than every single object. You can see that the database objects can be sorted by name, by type, or other choices. I’m going to add a comment such as “Initial creation of database in source control.” And then click on the Commit button which will put all the objects in my database into the source control system. That’s all it takes to get the objects into source control initially. Now is when things can get fun with breaking changes to code, automated deployments, unit testing and all the rest.

    Read the article

  • jQuery Context Menu Plugin and Capturing Right-Click

    - by Ben Griswold
    I was thrilled to find Cory LaViska's jQuery Context Menu Plugin a few months ago. In very little time, I was able to integrate the context menu with the jQuery Treeview. I quickly had a really pretty user interface which took full advantage of limited real estate. And guess what. As promised, the plugin worked in Chrome, Safari 3, IE 6/7/8, Firefox 2/3 and Opera 9.5. Everything was perfect and I shipped to the Integration Environment.
    One thing kept bugging me though – right clicks aren't the standard in a web environment. Sure, when one hovers over the treeview node, the mouse changes from an arrow to a pointer, but without help text most users will certainly left-click rather than right. As I was already doubting the design decision, we did some Mac testing. The context menu worked in Firefox but not Safari. Damn. That's when I started digging into the Madness of Javascript Mouse Events. Don't tell, but it's complicated. About as close as one can get to capturing the right-click mouse event on all major browsers on Windows and Mac is this:
    if (event.which == null)
        /* IE case */
        button = (event.button < 2) ? "LEFT" : ((event.button == 4) ? "MIDDLE" : "RIGHT");
    else
        /* All others */
        button = (event.which < 2) ? "LEFT" : ((event.which == 2) ? "MIDDLE" : "RIGHT");
    Yikes. The context menu code was simply checking if event.button == 2. No problem. Cory offers a jQuery Right Click Plugin which I'm sure works for Windows but probably not the Mac either. (Please note I haven't verified this.) Anyway, I decided to address my UI design concern and the Safari Mac issue in one swoop. I decided to make the context menu respond to any mouse click event. This didn't take much – especially after seeing how Bill Beckelman updated the library to recognize the left click.
    First, I added an AnyClick option to the library defaults:
    // Any click may trigger the dropdown and that's okay
    // See Javascript Madness: Mouse Events – http://unixpapa.com/js/mouse.html
    if (o.anyClick == undefined) o.anyClick = false;
    And then I trigger the context menu dropdown based on the following conditional:
    if (evt.button == 2 || o.anyClick) {
    Nothing tricky about that, right? Finally, I updated my menu setup to include the AnyClick value, if true:
    $('.member').contextMenu({ menu: 'memberContextMenu', anyClick: true },
        function (action, el, pos) {
            …
    Now the context menu works in "all" environments if you left, right or even middle click. Download jQuery Context Menu Plugin for Any Click
    *Opera 9.5 has an option to allow scripts to detect right-clicks, but it is disabled by default. Furthermore, Opera still doesn't allow JavaScript to disable the browser's default context menu, which causes a usability conflict.

    Read the article

  • Routing Issue in ASP.NET MVC 3 RC 2

    - by imran_ku07
         Introduction: Two weeks ago, the ASP.NET MVC team shipped the ASP.NET MVC 3 RC 2 release. This release includes some new features and some performance optimizations, and it fixes most of the known bugs, but some minor issues are still present. Some of these issues are already discussed by Scott Guthrie at Update on ASP.NET MVC 3 RC2 (and a workaround for a bug in it). In addition to these issues, I have found another issue in this release regarding routing. In this article, I will show you the routing issue and a simple workaround for it.
    Description: The easiest way to understand an issue is to reproduce it in an application. So create an MVC 2 application and an MVC 3 RC 2 application. Then, in both applications, open the global.asax file and update the default route as below:
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
    routes.MapRoute(
        "Default", // Route name
        "{controller}/{action}/{id1}/{id2}", // URL with parameters
        new { controller = "Home", action = "Index", id1 = UrlParameter.Optional, id2 = UrlParameter.Optional } // Parameter defaults
    );
    Then open the Index view and add the following lines:
    <%@ Page Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage" %>
    <asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
        Home Page
    </asp:Content>
    <asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
        <% Html.RenderAction("About"); %>
    </asp:Content>
    The above view will issue a child request to the About action method. Now run both applications. The ASP.NET MVC 2 application will run just fine, but the ASP.NET MVC 3 RC 2 application will throw an exception as shown below.
    You may think that this is a routing issue, but this is not the case here, as both the ASP.NET MVC 2 and ASP.NET MVC 3 RC 2 applications (created above) are built with .NET Framework 4.0 and both will use the same routing defined in System.Web. Something is wrong in ASP.NET MVC 3 RC 2. After digging into the ASP.NET MVC source code, I found that the UrlParameter class in ASP.NET MVC 3 RC 2 overrides the ToString method, which simply returns an empty string:
    public sealed class UrlParameter
    {
        public static readonly UrlParameter Optional = new UrlParameter();
        private UrlParameter() { }
        public override string ToString() { return string.Empty; }
    }
    In MVC 2 the ToString method was not overridden. So to quickly fix the above problem, just replace the UrlParameter.Optional default value with a different value other than null or empty (for example, a single white space), or replace the UrlParameter.Optional default value with a new class object containing the same code as the UrlParameter class, except that the ToString method is not overridden (or with an overridden ToString method that returns a string value other than null or empty). But by doing this you will lose the benefit of ASP.NET MVC 2 Optional URL Parameters. There may be many different ways to fix the above problem without losing the benefit of optional parameters. Here I will create a new class, MyUrlParameter, with the same code as the UrlParameter class except that the ToString method is not overridden. Then I will create a base controller class whose constructor removes all MyUrlParameter route data parameters, just like ASP.NET MVC does with UrlParameter route data parameters early in the request.
public class BaseController : Controller
    {
        public BaseController()
        {
            if (System.Web.HttpContext.Current.CurrentHandler is MvcHandler)
            {
                RouteValueDictionary rvd = ((MvcHandler)System.Web.HttpContext.Current.CurrentHandler).RequestContext.RouteData.Values;
                string[] matchingKeys = (from entry in rvd where entry.Value == MyUrlParameter.Optional select entry.Key).ToArray();
                foreach (string key in matchingKeys)
                {
                    rvd.Remove(key);
                }
            }
        }
    }
    public class HomeController : BaseController
    {
        public ActionResult Index(string id1)
        {
            ViewBag.Message = "Welcome to ASP.NET MVC!";
            return View();
        }
        public ActionResult About()
        {
            return Content("Child Request Contents");
        }
    }
    public sealed class MyUrlParameter
    {
        public static readonly MyUrlParameter Optional = new MyUrlParameter();
        private MyUrlParameter() { }
    }
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
    routes.MapRoute(
        "Default", // Route name
        "{controller}/{action}/{id1}/{id2}", // URL with parameters
        new { controller = "Home", action = "Index", id1 = MyUrlParameter.Optional, id2 = MyUrlParameter.Optional } // Parameter defaults
    );
    The MyUrlParameter class is a copy of the UrlParameter class, except that MyUrlParameter does not override the ToString method. Note that the default route is modified to use MyUrlParameter.Optional instead of UrlParameter.Optional. Also note that the BaseController class constructor removes MyUrlParameter parameters from the current request's route data so that the model binder will not bind these parameters to action method parameters. Now just run the ASP.NET MVC 3 RC 2 application again; you will find that it runs just fine.
    In case you are curious why the ASP.NET MVC 3 RC 2 application throws an exception if the UrlParameter class contains a ToString method which returns an empty string, you need to know something about a feature of routing for URL generation. During URL generation, routing will call the ParsedRoute.Bind method internally. This method includes logic to match the route and build the URL. While building the URL, the ParsedRoute.Bind method will call the ToString method of the route values (in our case this will call the UrlParameter.ToString method) and then append the returned value to the URL. After appending the returned value, this method includes logic saying that if two consecutive returned values are empty, the current route is not matched; otherwise an incorrect URL would be generated. Here is the snippet from the ParsedRoute.Bind method which proves this statement:
    if ((builder2.Length > 0) && (builder2[builder2.Length - 1] == '/'))
    {
        return null;
    }
    builder2.Append("/");
    ...........................................................
    ...........................................................
    ...........................................................
    ...........................................................
    if (RoutePartsEqual(obj3, obj4))
    {
        builder2.Append(UrlEncode(Convert.ToString(obj3, CultureInfo.InvariantCulture)));
        continue;
    }
    In the above example, both the id1 and id2 parameters' default values are set to a UrlParameter object, and the UrlParameter class includes a ToString method that returns an empty string. That's why this route will not be matched.
    Summary: In this article I showed you the routing issue and also showed you how to work around this problem. I explained the issue with an example by creating an ASP.NET MVC 2 and an ASP.NET MVC 3 RC 2 application. Finally, I also explained the reason for this issue.
Hopefully you will enjoy this article too.

    Read the article

  • How to remove window applet from Gnome3?

    - by Filip Nowak
    Today I installed the window applet for Gnome3 from this webupd8 post. The effect of the installation is shown in this picture: http://i.stack.imgur.com/D1s9b.jpg I tried apt-get remove --purge, but nothing happens. How do I remove this window applet? When I try metacity --replace & unity, I get the following output:
    [1] 3171
    Checking if settings need to be migrated ...no
    Checking if internal files need to be migrated ...no
    Backend : gconf
    Integration : true
    Profile : default
    Adding plugins
    Skipping upgrade com.canonical.unity.unity.01.upgrade
    Skipping upgrade com.canonical.unity.unity.02.upgrade
    Initializing core options...done
    Initializing bailer options...done
    Initializing detection options...done
    Initializing composite options...done
    Initializing opengl options...done
    Initializing decor options...done
    Initializing move options...done
    Initializing vpswitch options...done
    Initializing gnomecompat options...done
    Initializing grid options...done
    Initializing mousepoll options...done
    Initializing place options...done
    Initializing resize options...done
    Initializing animation options...done
    Initializing wall options...done
    Initializing session options...done
    Initializing workarounds options...done
    Initializing wobbly options...done
    compiz (expo) - Warn: failed to bind image to texture
    Initializing expo options...done
    Initializing ezoom options...done
    Initializing staticswitcher options...done
    Initializing fade options...done
    Initializing scale options...done
    Screen geometry changed: 0x0x1920x1080
    Initializing unityshell options...done
    DEBUG 2012-02-19 21:22:40 glib <unknown>:0 Setting to primary screen rect: x=0 y=0 w=1920 h=1080
    WARN 2012-02-19 21:22:40 unity.favorites FavoriteStoreGSettings.cpp:138 Unable to load GDesktopAppInfo for 'bluefish.desktop'
    WARN 2012-02-19 21:22:40 unity.favorites FavoriteStoreGSettings.cpp:138 Unable to load GDesktopAppInfo for 'filezilla.desktop'
    WARN 2012-02-19 21:22:40 unity.favorites FavoriteStoreGSettings.cpp:138 Unable to load GDesktopAppInfo for 'gimp.desktop'
    WARN 2012-02-19 21:22:40 glib.glib-gobject <unknown>:0 invalid cast from `BamfWindow' to `BamfApplication' (this warning repeats about twenty more times)
    Setting Update "texture_filter"
    Setting Update "sync_to_vblank"
    Setting Update "fullscreen_visual_bell"
    Setting Update "panel_opacity"
    Setting Update "launcher_opacity"
    Setting Update "icon_size"
    WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method InfoRequest proxy /com/canonical/unity/lens/applications does not exist
    WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method SetActive proxy /com/canonical/unity/lens/applications does not exist
    WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method InfoRequest proxy /com/canonical/unity/lens/commands does not exist
    WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method SetActive proxy /com/canonical/unity/lens/commands does not exist
    WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method InfoRequest proxy /com/canonical/unity/lens/files does not exist
    WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method SetActive proxy /com/canonical/unity/lens/files does not exist
    WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method InfoRequest proxy /com/canonical/unity/lens/music does not exist
    WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method SetActive proxy /com/canonical/unity/lens/music does not exist
    WARN 2012-02-19 21:23:33 unity.iconloader IconLoader.cpp:509 Unable to load contents of file:///usr/share/icons/unity-icon-theme/places/svg/category-available.svg: Error opening file: No such file or directory
    WARN 2012-02-19 21:23:33 unity.iconloader IconLoader.cpp:509 Unable to load contents of file:///usr/share/icons/unity-icon-theme/places/svg/category-installed.svg: Error opening file: No such file or directory
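    As a minimal sketch only (this is not taken from the question or any posted answer): if the applet was installed as an ordinary Debian package from the PPA, one common approach is to ask dpkg which package owns one of the applet's files and then purge that package. The file path probed below is a hypothetical placeholder and would need to be replaced with the real path of the applet's files.

    import subprocess
    from typing import Optional

    def owning_package(path: str) -> Optional[str]:
        """Ask dpkg which installed package owns the given file, if any."""
        result = subprocess.run(["dpkg", "-S", path], capture_output=True, text=True)
        if result.returncode != 0:
            return None
        # "dpkg -S" prints lines of the form "package-name: /path/to/file".
        return result.stdout.split(":", 1)[0].strip()

    def purge_package(package: str) -> None:
        """Remove the package together with its configuration files."""
        subprocess.run(["sudo", "apt-get", "purge", "-y", package], check=True)

    if __name__ == "__main__":
        # Hypothetical location of one of the applet's files (placeholder).
        pkg = owning_package("/usr/share/gnome-shell/extensions/window-applet")
        if pkg:
            purge_package(pkg)
        else:
            print("No package owns that file; it may have been installed manually.")

    If no package owns the applet's files, it was probably installed outside the package manager, in which case its files would have to be removed by hand.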

    Read the article

  • Markus Zirn, "Big Data with CEP and SOA" @ SOA, Cloud & Service Technology Symposium 2012

    - by JuergenKress
    ORACLE PROMOTIONAL DISCOUNT: FOR AN EXCLUSIVE ORACLE DISCOUNT, ENTER PROMO CODE DJMXZ370. Early-Bird Registration is Now Open with Special Pricing! Register before July 1, 2012 to qualify for discounts. Visit the Registration page for details.
    The International SOA, Cloud + Service Technology Symposium is a yearly event that features the top experts and authors from around the world, providing a series of keynotes, talks, demonstrations, and panels, as well as training and certification workshops, all dedicated to empowering IT professionals to realize modern service technologies and practices in the real world. Click here for a two-page printable conference overview (PDF).
    Big Data with CEP and SOA - September 25, 2012 - 14:15
    Speakers: Markus Zirn, Oracle, and Baz Kuthi, Avocent
    The "Big Data" trend is driving new kinds of IT projects that process machine-generated data. Such projects store and mine data using Hadoop/MapReduce, but they also analyze streaming data via event-driven patterns, which can be called "Fast Data", complementary to "Big Data". This session highlights how "Big Data" and "Fast Data" design patterns can be combined with SOA design principles into modern, event-driven architectures. We will describe specific architectures that combine CEP, Distributed Caching, Event-Driven Network, SOA Composites, Application Development Framework, and Hadoop. Architecture patterns include pre-processing and filtering event streams as close as possible to the event source (a minimal code sketch of this edge-filtering idea follows at the end of this entry), in-memory master data for event pattern matching, event-driven user interfaces, and distributed event processing. The focus is on how "Fast Data" requirements are elegantly integrated into a traditional SOA architecture.
    Markus Zirn is Vice President of Product Management covering Oracle SOA Suite, SOA Governance, Application Integration Architecture, BPM, BPM Solutions, Complex Event Processing, and UPK, an end-user learning solution. He is the author of “The BPEL Cookbook” (rated best book on Service-Oriented Architecture in 2007) as well as “Fusion Middleware Patterns”. Previously, he was a management consultant with Booz Allen & Hamilton’s High Tech practice in Duesseldorf and San Francisco, and Vice President of Product Marketing at QUIQ. Mr. Zirn holds a Master's degree in Electrical Engineering from the University of Karlsruhe and is an alumnus of the Tripartite program, a joint European degree from the University of Karlsruhe, Germany, the University of Southampton, UK, and ESIEE, France.
    KEYNOTES & SPEAKERS: More than 80 international subject matter experts will be speaking at the Symposium. Below are the keynotes and speakers confirmed so far; over 50% of the agenda has not yet been finalized, and many more speakers are to come. View the partial program calendars on the Conference Agenda page.
    CONFERENCE THEMES & TRACKS: Cloud Computing Architecture & Patterns; New SOA & Service-Orientation Practices & Models; Emerging Service Technology Innovation; Service Modeling & Analysis Techniques; Service Infrastructure & Virtualization; Cloud-based Enterprise Architecture; Business Planning for Cloud Computing Projects; Real World Case Studies; Semantic Web Technologies (with & without the Cloud); Governance Frameworks for SOA and/or Cloud Computing Projects; Service Engineering & Service Programming Techniques; Interactive Services & the Human Factor; New REST & Web Services Tools & Techniques.
    Oracle Specialized SOA & BPM Partners: Oracle Specialized partners have proven their skills through certifications and customer references. To find a local Specialized partner, please visit http://solutions.oracle.com. For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; to register, please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.
    Technorati Tags: Markus Zirn,SOA Symposium,Thomas Erl,SOA Community,Oracle SOA,Oracle BPM,BPM Community,OPN,Jürgen Kress
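    Below is a minimal, self-contained Python sketch of the "pre-processing and filtering event streams as close as possible to the event source" pattern referenced in the session abstract above. The SensorEvent type, the 75.0 threshold, and the in-process queue standing in for the downstream CEP engine are illustrative assumptions and not part of any Oracle product API.

    from dataclasses import dataclass
    from queue import Queue
    from typing import Callable

    @dataclass
    class SensorEvent:
        """A hypothetical machine-generated event (illustrative only)."""
        device_id: str
        reading: float
        timestamp_ms: int

    class EdgeEventFilter:
        """Drops uninteresting events before they reach the CEP / SOA layer."""

        def __init__(self, keep: Callable[[SensorEvent], bool]) -> None:
            self.keep = keep
            # Stand-in for whatever transport feeds the downstream CEP engine.
            self.downstream: "Queue[SensorEvent]" = Queue()

        def on_event(self, event: SensorEvent) -> None:
            # Apply the predicate at the edge so only relevant events travel onward.
            if self.keep(event):
                self.downstream.put(event)

    if __name__ == "__main__":
        # Assumption: only readings above 75.0 are worth forwarding for pattern matching.
        edge = EdgeEventFilter(lambda e: e.reading > 75.0)
        edge.on_event(SensorEvent("rack-01", 42.0, 1348574100000))
        edge.on_event(SensorEvent("rack-01", 81.5, 1348574101000))
        edge.on_event(SensorEvent("rack-02", 90.2, 1348574102000))

        # Only the two high readings survive the edge filter.
        while not edge.downstream.empty():
            print(edge.downstream.get())

    In a real deployment the queue would be replaced by the actual transport that feeds the CEP engine or SOA composite; the point of the sketch is simply that the filtering predicate runs at the edge, before events travel anywhere else.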

    Read the article

< Previous Page | 161 162 163 164 165 166 167 168 169 170 171 172  | Next Page >