Search Results

Search found 9542 results on 382 pages for 'operating systems'.


  • Laissez les bon temps rouler! (Microsoft BI Conference 2010)

    - by smisner
    "Laissez les bons temps rouler" is a Cajun phrase that I heard frequently when I lived in New Orleans in the mid-1990s. It means "Let the good times roll!" and encapsulates a feeling of happy expectation. As I met with many of my peers and new acquaintances at the Microsoft BI Conference last week, this phrase kept running through my mind as people spoke about their plans in their respective businesses, the benefits and opportunities that the recent releases in the BI stack are providing, and their expectations about the future of the BI stack. Notwithstanding some jabs here and there to point out the platform is neither perfect now nor will be anytime soon (along with admissions that the competitors are also not perfect), and notwithstanding several missteps by the event organizers (which I don't care to enumerate), the overarching mood at the conference was positive. It was a refreshing change from the doom and gloom hovering over several conferences that I attended in 2009. Although many people expect economic hardships to continue over the coming year or so, everyone I know in the BI field is busier than ever and expects to stay busy for quite a while. Self-Service BI Self-service was definitely a theme of the BI conference. In the keynote, Ted Kummert opened with a look back to a fairy tale vision of self-service BI that he told in 2008. At that time, the fairy tale future was a time when "every end user was able to use BI technologies within their job in order to move forward more effectively" and transitioned to the present time in which SQL Server 2008 R2, Office 2010, and SharePoint 2010 are available to deliver managed self-service BI. This set of technologies is presumably poised to address the needs of the 80% of users that Kummert said do not use BI today. He proceeded to outline a series of activities that users ought to be able to do themselves--from simple changes to a report like formatting or an addtional data visualization to integration of an additional data source. The keynote then continued with a series of demonstrations of both current and future technology in support of self-service BI. Some highlights that interested me: PowerPivot, of course, is the flagship product for self-service BI in the Microsoft BI stack. In the TechEd keynote, which was open to the BI conference attendees, Amir Netz (twitter) impressed the audience by demonstrating interactivity with a workbook containing 100 million rows. He upped the ante at the BI keynote with his demonstration of a future-state PowerPivot workbook containing over 2 billion records. It's important to note that this volume of data is being processed by a server engine, and not in the PowerPivot client engine. (Yes, I think it's impressive, but none of my clients are typically wrangling with 2 billion records at a time. Maybe they're thinking too small. This ability to work quickly with large data sets has greater implications for BI solutions than for self-service BI, in my opinion.) Amir also demonstrated KPIs for the future PowerPivot, which appeared to be easier to implement than in any other Microsoft product that supports KPIs, apart from simple KPIs in SharePoint. (My initial reaction is that we have one more place to build KPIs. Great. It's confusing enough. I haven't seen how well those KPIs integrate with other BI tools, which will be important for adoption.) One more PowerPivot feature that Amir showed was a graphical display of the lineage for calculations. 
(This is hugely practical, especially if you build up calculations incrementally. You can more easily follow the logic from calculation to calculation. Furthermore, if you need to make a change to one calculation, you can assess the impact on other calculations.) Another product demonstration will be available within the next 30 days--Pivot for Reporting Services. If you haven't seen this technology yet, check it out at www.getpivot.com. (It definitely has a wow factor, but I'm skeptical about its practicality. However, I'm looking forward to trying it out with data that I understand.) Michael Tejedor (twitter) demonstrated a feature that I think is really interesting and not emphasized nearly enough--overshadowed by PowerPivot, no doubt. That feature is the Microsoft Business Intelligence Indexing Connector, which enables search of the content of Excel workbooks and Reporting Services reports. (This capability existed in MOSS 2007, but was more cumbersome to implement. The search results in SharePoint 2010 are not only cooler, but more useful by describing whether the content is found in a table or a chart, for example.) This may yet be the dawning of the age of self-service BI - a phrase I've heard repeated from time to time over the last decade - but I think BI professionals are likely to stay busy for a long while, and need not start looking for a new line of work. Kummert repeatedly referenced strategic BI solutions in contrast to self-service BI to emphasize that self-service BI is not a replacement for the services that BI professionals provide. After all, self-service BI does not appear magically on user desktops (or whatever device they want to use). A supporting infrastructure is necessary, and grows in complexity in proportion to the need to simplify BI for users. It's one thing to hear the party line touted by Microsoft employees at the BI keynote, but it's another to hear from the people who are responsible for implementing and supporting it within an organization. Rob Collie (blog | twitter), Kasper de Jonge (blog | twitter), Vidas Matelis (site | twitter), and I were invited to join Andrew Brust (blog | twitter) as he led a Birds of a Feather session at TechEd entitled "PowerPivot: Is It the BI Deal-Changer for Developers and IT Pros?" I would single out the prevailing concern in this session as the issue of control. On one side of this issue were those who were concerned that they would lose control once PowerPivot is implemented. On the other side were those who believed that data should be freely accessible to users in PowerPivot, and even acknowledgment that users would get the data they want even if it meant they would have to manually enter into a workbook to have it ready for analysis. For another viewpoint on how PowerPivot played out at the conference, see Rob Collie's observations. Collaborative BI I have been intrigued by the notion of collaborative BI for a very long time. Before I discovered BI, I was a Lotus Notes developer and later a manager of developers, working in a software company that enabled collaboration in the legal industry. Not only did I help create collaborative systems for our clients, I created a complete project management from the ground up to collaboratively manage our custom development work. In that case, collaboration involved my team, my client contacts, and me. I was also able to produce my own BI from that system as well, but didn't know that's what I was doing at the time. 
Only in recent years has SharePoint begun to catch up with the capabilities that I had with Lotus Notes more than a decade ago. Eventually, I had the opportunity at that job to formally investigate BI as another product offering for our software, and the rest - as they say - is history. I built my first data warehouse with Scott Cameron (who has also ventured into the authoring world by writing Analysis Services 2008 Step by Step and was at the BI Conference last week where I got to reminisce with him for a bit) and that began a career that I never imagined at the time. Fast forward to 2010, and I'm still lauding the virtues of collaborative BI, if only the tools will catch up to my vision! Thus, I was anxious to see what Donald Farmer (blog | twitter) and Rita Sallam of Gartner had to say on the subject in their session "Collaborative Decision Making." As I suspected, the tools aren't quite there yet, but the vendors are moving in the right direction. One thing I liked about this session was a non-Microsoft perspective of the state of the industry with regard to collaborative BI. In addition, this session included a better demonstration of SharePoint collaborative BI capabilities than appeared in the BI keynote. Check out the video in the link to the session to see the demonstration. One of the use cases that was demonstrated was linking from information to a person, because, as Donald put it, "People don't trust data, they trust people." The Microsoft BI Stack in General A question I hear all the time from students when I'm teaching is how to know what tools to use when there is overlap between products in the BI stack. I've never taken the time to codify my thoughts on the subject, but saw that my friend Dan Bulos provided good insight on this topic from a variety of perspectives in his session, "So Many BI Tools, So Little Time." I thought one of his best points was that ideally you should be able to design in your tool of choice, and then deploy to your tool of choice. Unfortunately, the ideal is yet to become real across the platform. The closest we come is with the RDL in Reporting Services which can be produced from two different tools (Report Builder or Business Intelligence Development Studio's Report Designer), manually, or by a third-party or custom application. I have touted the idea for years (and publicly said so about 5 years ago) that eventually more products would be RDL producers or consumers, but we aren't there yet. Maybe in another 5 years. Another interesting session that covered the BI stack against a backdrop of competitive products was delivered by Andrew Brust. Andrew did a marvelous job of consolidating a lot of information in a way that clearly communicated how various vendors' offerings compared to the Microsoft BI stack. He also made a particularly compelling argument about how the existence of an ecosystem around the Microsoft BI stack provided innovation and opportunities lacking for other vendors. Check out his presentation, "How Does the Microsoft BI Stack...Stack Up?" Expo Hall I had planned to spend more time in the Expo Hall to see who was doing new things with the BI stack, but didn't manage to get very far. Each time I set out on an exploratory mission, I got caught up in some fascinating conversations with one or more of my peers. I find interacting with people that I meet at conferences just as important as attending sessions to learn something new. There were a couple of items that really caught me eye, however, that I'll share here. 
Pragmatic Works. Whether you develop SSIS packages, build SSAS cubes, or author SSRS reports (or all of the above), you really must take a look at BI Documenter. Brian Knight (twitter) walked me through the key features, and I must say I was impressed. Once you've seen what this product can do, you won't want to document your BI projects any other way. You can download a free single-user database edition, or choose from more feature-rich standard or professional editions. Microsoft Press ebooks. I also stopped by the O'Reilly Media booth to meet some folks that one of my acquisitions editors at Microsoft Press recommended. In case you haven't heard, Microsoft Press has partnered with O'Reilly Media for distribution and publishing. Apart from my interest in learning more about O'Reilly Media as an author, an advertisement in their booth caught my eye which I think is a really great move. When you buy Microsoft Press ebooks through the O'Reilly web site, you can receive them in any (or all) of the following formats where possible: PDF, epub, .mobi for Kindle and .apk for Android. You also have lifetime DRM-free access to the ebooks. As someone who is an avid collector of books, I find myself running out of room for storage. In addition, I travel a lot, and it's hard to lug my reference library with me. Today's e-reader options make the move to digital books a more viable way to grow my library. Having a variety of formats means I am not limited to a single device, and lifetime access means I don't have to worry about keeping track of where I've stored my files. Because the e-books are DRM-free, I can copy and paste when I'm compiling notes, and I can print pages when necessary. That's a winning combination in my mind! Overall, I was pleased with the BI conference. There were many more sessions that I couldn't attend, either because the room was full when I got there or there were multiple sessions running concurrently that I wanted to see. Fortunately, many of the sessions are accessible for viewing online at http://www.msteched.com/2010/NorthAmerica along with the TechEd sessions. You can spot the BI sessions by the yellow skyline on the title slide of the presentation.

    Read the article

  • Oracle BI Server Modeling, Part 1- Designing a Query Factory

    - by bob.ertl(at)oracle.com
      Welcome to Oracle BI Development's BI Foundation blog, focused on helping you get the most value from your Oracle Business Intelligence Enterprise Edition (BI EE) platform deployments.  In my first series of posts, I plan to show developers the concepts and best practices for modeling in the Common Enterprise Information Model (CEIM), the semantic layer of Oracle BI EE.  In this segment, I will lay the groundwork for the modeling concepts.  First, I will cover the big picture of how the BI Server fits into the system, and how the CEIM controls the query processing. Oracle BI EE Query Cycle The purpose of the Oracle BI Server is to bridge the gap between the presentation services and the data sources.  There are typically a variety of data sources in a variety of technologies: relational, normalized transaction systems; relational star-schema data warehouses and marts; multidimensional analytic cubes and financial applications; flat files, Excel files, XML files, and so on. Business datasets can reside in a single type of source, or, most of the time, are spread across various types of sources. Presentation services users are generally business people who need to be able to query that set of sources without any knowledge of technologies, schemas, or how sources are organized in their company. They think of business analysis in terms of measures with specific calculations, hierarchical dimensions for breaking those measures down, and detailed reports of the business transactions themselves.  Most of them create queries without knowing it, by picking a dashboard page and some filters.  Others create their own analysis by selecting metrics and dimensional attributes, and possibly creating additional calculations. The BI Server bridges that gap from simple business terms to technical physical queries by exposing just the business focused measures and dimensional attributes that business people can use in their analyses and dashboards.   After they make their selections and start the analysis, the BI Server plans the best way to query the data sources, writes the optimized sequence of physical queries to those sources, post-processes the results, and presents them to the client as a single result set suitable for tables, pivots and charts. The CEIM is a model that controls the processing of the BI Server.  It provides the subject areas that presentation services exposes for business users to select simplified metrics and dimensional attributes for their analysis.  It models the mappings to the physical data access, the calculations and logical transformations, and the data access security rules.  The CEIM consists of metadata stored in the repository, authored by developers using the Administration Tool client.     Presentation services and other query clients create their queries in BI EE's SQL-92 language, called Logical SQL or LSQL.  The API simply uses ODBC or JDBC to pass the query to the BI Server.  Presentation services writes the LSQL query in terms of the simplified objects presented to the users.  The BI Server creates a query plan, and rewrites the LSQL into fully-detailed SQL or other languages suitable for querying the physical sources.  For example, the LSQL on the left below was rewritten into the physical SQL for an Oracle 11g database on the right. 
Logical SQL:

SELECT "D0 Time"."T02 Per Name Month" saw_0,
   "D4 Product"."P01  Product" saw_1,
   "F2 Units"."2-01  Billed Qty  (Sum All)" saw_2
FROM "Sample Sales"
ORDER BY saw_0, saw_1

Physical SQL:

WITH SAWITH0 AS (
   select T986.Per_Name_Month as c1, T879.Prod_Dsc as c2,
      sum(T835.Units) as c3, T879.Prod_Key as c4
   from
      Product T879 /* A05 Product */ ,
      Time_Mth T986 /* A08 Time Mth */ ,
      FactsRev T835 /* A11 Revenue (Billed Time Join) */
   where ( T835.Prod_Key = T879.Prod_Key and T835.Bill_Mth = T986.Row_Wid)
   group by T879.Prod_Dsc, T879.Prod_Key, T986.Per_Name_Month
)
select SAWITH0.c1 as c1, SAWITH0.c2 as c2, SAWITH0.c3 as c3
from SAWITH0
order by c1, c2

Probably everybody reading this blog can write SQL or MDX.  However, the trick in designing the CEIM is that you are modeling a query-generation factory.  Rather than hand-crafting individual queries, you model behavior and relationships, thus configuring the BI Server machinery to manufacture millions of different queries in response to random user requests.  This mass production requires a different mindset and approach than when you are designing individual SQL statements in tools such as Oracle SQL Developer, Oracle Hyperion Interactive Reporting (formerly Brio), or Oracle BI Publisher.   The Structure of the Common Enterprise Information Model (CEIM) The CEIM has a unique structure specifically for modeling the relationships and behaviors that fill the gap from logical user requests to physical data source queries and back to the result.  The model divides the functionality into three specialized layers, called Presentation, Business Model and Mapping, and Physical, as shown below. Presentation services clients can generally only see the presentation layer, and the objects in the presentation layer are normally the only ones used in the LSQL request.  When a request comes into the BI Server from presentation services or another client, the relationships and objects in the model allow the BI Server to select the appropriate data sources, create a query plan, and generate the physical queries.  That's the left to right flow in the diagram below.  When the results come back from the data source queries, the right to left relationships in the model show how to transform the results and perform any final calculations and functions that could not be pushed down to the databases.   Business Model Think of the business model as the heart of the CEIM you are designing.  This is where you define the analytic behavior seen by the users, and the superset library of metric and dimension objects available to the user community as a whole.  It also provides the baseline business-friendly names and user-readable dictionary.  For these reasons, it is often called the "logical" model--it is a virtual database schema that persists no data, but can be queried as if it is a database. The business model always has a dimensional shape (more on this in future posts), and its simple shape and terminology hides the complexity of the source data models. Besides hiding complexity and normalizing terminology, this layer adds most of the analytic value, as well.  This is where you define the rich, dimensional behavior of the metrics and complex business calculations, as well as the conformed dimensions and hierarchies.
It contributes to the ease of use for business users, since the dimensional metric definitions apply in any context of filters and drill-downs, and the conformed dimensions enable dashboard-wide filters and guided analysis links that bring context along from one page to the next.  The conformed dimensions also provide a key to hiding the complexity of many sources, including federation of different databases, behind the simple business model. Note that the expression language in this layer is LSQL, so that any expression can be rewritten into any data source's query language at run time.  This is important for federation, where a given logical object can map to several different physical objects in different databases.  It is also important to portability of the CEIM to different database brands, which is a key requirement for Oracle's BI Applications products. Your requirements process with your user community will mostly affect the business model.  This is where you will define most of the things they specifically ask for, such as metric definitions.  For this reason, many of the best-practice methodologies of our consulting partners start with the high-level definition of this layer. Physical Model The physical model connects the business model that meets your users' requirements to the reality of the data sources you have available. In the query factory analogy, think of the physical layer as the bill of materials for generating physical queries.  Every schema, table, column, join, cube, hierarchy, etc., that will appear in any physical query manufactured at run time must be modeled here at design time. Each physical data source will have its own physical model, or "database" object in the CEIM.  The shape of each physical model matches the shape of its physical source.  In other words, if the source is normalized relational, the physical model will mimic that normalized shape.  If it is a hypercube, the physical model will have a hypercube shape.  If it is a flat file, it will have a denormalized tabular shape. To aid in query optimization, the physical layer also tracks the specifics of the database brand and release.  This allows the BI Server to make the most of each physical source's distinct capabilities, writing queries in its syntax, and using its specific functions. This allows the BI Server to push processing work as deep as possible into the physical source, which minimizes data movement and takes full advantage of the database's own optimizer.  For most data sources, native APIs are used to further optimize performance and functionality. The value of having a distinct separation between the logical (business) and physical models is encapsulation of the physical characteristics.  This encapsulation is another enabler of packaged BI applications and federation.  It is also key to hiding the complex shapes and relationships in the physical sources from the end users.  Consider a routine drill-down in the business model: physically, it can require a drill-through where the first query is MDX to a multidimensional cube, followed by the drill-down query in SQL to a normalized relational database.  The only difference from the user's point of view is that the 2nd query added a more detailed dimension level column - everything else was the same. Mappings Within the Business Model and Mapping Layer, the mappings provide the binding from each logical column and join in the dimensional business model, to each of the objects that can provide its data in the physical layer.  
When there is more than one option for a physical source, rules in the mappings are applied to the query context to determine which of the data sources should be hit, and how to combine their results if more than one is used.  These rules specify aggregate navigation, vertical partitioning (fragmentation), and horizontal partitioning, any of which can be federated across multiple, heterogeneous sources.  These mappings are usually the most sophisticated part of the CEIM. Presentation You might think of the presentation layer as a set of very simple relational-like views into the business model.  Over ODBC/JDBC, they present a relational catalog consisting of databases, tables and columns.  For business users, presentation services interprets these as subject areas, folders and columns, respectively.  (Note that in 10g, subject areas were called presentation catalogs in the CEIM.  In this blog, I will stick to 11g terminology.)  Generally speaking, presentation services and other clients can query only these objects (there are exceptions for certain clients such as BI Publisher and Essbase Studio). The purpose of the presentation layer is to specialize the business model for different categories of users.  Based on a user's role, they will be restricted to specific subject areas, tables and columns for security.  The breakdown of the model into multiple subject areas organizes the content for users, and subjects superfluous to a particular business role can be hidden from that set of users.  Customized names and descriptions can be used to override the business model names for a specific audience.  Variables in the object names can be used for localization. For these reasons, you are better off thinking of the tables in the presentation layer as folders than as strict relational tables.  The real semantics of tables and how they function is in the business model, and any grouping of columns can be included in any table in the presentation layer.  In 11g, an LSQL query can also span multiple presentation subject areas, as long as they map to the same business model. Other Model Objects There are some objects that apply to multiple layers.  These include security-related objects, such as application roles, users, data filters, and query limits (governors).  There are also variables you can use in parameters and expressions, and initialization blocks for loading their initial values on a static or user session basis.  Finally, there are Multi-User Development (MUD) projects for developers to check out units of work, and objects for the marketing feature used by our packaged customer relationship management (CRM) software.   The Query Factory At this point, you should have a grasp on the query factory concept.  When developing the CEIM model, you are configuring the BI Server to automatically manufacture millions of queries in response to random user requests. You do this by defining the analytic behavior in the business model, mapping that to the physical data sources, and exposing it through the presentation layer's role-based subject areas. While configuring mass production requires a different mindset than when you hand-craft individual SQL or MDX statements, it builds on the modeling and query concepts you already understand. The following posts in this series will walk through the CEIM modeling concepts and best practices in detail.  
We will initially review dimensional concepts so you can understand the business model, and then present a pattern-based approach to learning the mappings from a variety of physical schema shapes and deployments to the dimensional model.  Along the way, we will also present the dimensional calculation template, and learn how to configure the many additivity patterns.

    Read the article

  • You should NOT be writing jQuery in SharePoint if…

    - by Mark Rackley
    Yes… another one of these posts. What can I say? I’m a pot stirrer.. a rabble rouser *rabble rabble* jQuery in SharePoint seems to be a fairly polarizing issue with one side thinking it is the most awesome thing since Princess Leia as the slave girl in Return of the Jedi and the other half thinking it is the worst idea since Mannequin 2: On the Move. The correct answer is OF COURSE “it depends”. But what are those deciding factors that make jQuery an awesome fit or leave a bad taste in your mouth? Let’s see if I can drive the discussion here with some polarizing comments of my own… I know some of you are getting ready to leave your comments even now before reading the rest of the blog, which is great! Iron sharpens iron… These discussions hopefully open us up to understanding the entire process better and think about things in a different way. You should not be writing jQuery in SharePoint if you are not a developer… Let’s start off with my most polarizing and rant filled portion of the blog post. If you don’t know what you are doing or you don’t have a background that helps you understand the implications of what you are writing then you should not be writing jQuery in SharePoint! I truly believe that one of the biggest reasons for the jQuery haters is because of all the bad jQuery out there. If you don’t know what you are doing you can do some NASTY things! One of the best stories I’ve heard about this is from my good friend John Ferringer (@ferringer). John tells this story during our Mythbusters session we do together. One of his clients was undergoing a Denial of Service attack and they couldn’t figure out what was going on! After much searching they found that some genius jQuery developer wrote some code for an image rotator, but did not take into account what happens when there are no images to load! The code just kept hitting the servers over and over and over again which prevented anything else from getting done! Now, I’m NOT saying that I have not done the same sort of thing in the past or am immune from such mistakes. My point is that if you don’t know what you are doing, there are very REAL consequences that can have a major impact on your organization AND they will be hard to track down.  Think how happy your boss will be after you copy and pasted some jQuery from a blog without understanding what it does, it brings down the farm, AND it takes them 3 days to track it back to you.  :/ Good times will not be had. Like it or not JavaScript/jQuery is a programming language. While you .NET people sit on your high horses because your code is compiled and “runs faster” (also debatable), the rest of us will be actually getting work done and delivering solutions while you are trying to figure out why your widget won’t deploy. I can pick at that scab because I write .NET code too and speak from experience. I can do both, and do both well. So, I am not speaking from ignorance here. In JavaScript/jQuery you have variables, loops, conditionals, functions, arrays, events, and built in methods. If you are not a developer you just aren’t going to take advantage of all of that and use it correctly. Ahhh.. but there is hope! There is a lot of jQuery resources out there to help you learn and learn well! There are many experts on the subject that will gladly tell you when you are smoking crack. I just this minute saw a tweet from @cquick with a link to: “jQuery Fundamentals”. I just glanced through it and this may be a great primer for you aspiring jQuery devs. 
Take advantage of all the resources and become a developer! Hey, it will look awesome on your resume right? You should not be writing jQuery in SharePoint if it depends too much on client resources for a good user experience I’ve said it once and I’ll say it over and over until you understand. jQuery is executed on the client’s computer. Got it? If you are looping through hundreds of rows of data, searching through an enormous DOM, or performing many calculations it is going to take some time! AND if your user happens to be sitting on some old PC somewhere that they picked up at a garage sale their experience will be that much worse! If you can’t give the user a good experience they will not use the site. So, if jQuery is causing the user to have a bad experience, don’t use it. I sometimes go as far to say that you should NOT go to jQuery as a first option for external facing web sites because you have ZERO control over what the end user’s computer will be. You just can’t guarantee an awesome user experience all of the time. Ahhh… but you have no choice? (where have I heard that before?). Well… if you really have no choice, here are some tips to help improve the experience: Avoid screen scraping This is not 1999 and SharePoint is not an old green screen from a mainframe… so why are you treating it like it is? Screen scraping is time consuming and client intensive. Take advantage of tools like SPServices to do your data retrieval when possible. Fine tune your DOM searches A lot of time can be eaten up just searching the DOM and ignoring table rows that you don’t need. Write better jQuery to only loop through tables rows that you need, or only access specific elements you need. Take advantage of Element ID’s to return the one element you are looking for instead of looping through all the DOM over and over again. Write better jQuery Remember this is development. Think about how you can write cleaner, faster jQuery. This directly relates to the previous point of improving your DOM searches, but also when using arrays, variables and loops. Do you REALLY need to loop through that array 3 times? How can you knock it down to 2 times or even 1? When you have lots of calculations and data that you are manipulating every operation adds up. Think about how you can streamline it. Back in the old days before RAM was abundant, Cores were plentiful and dinosaurs roamed the earth, us developers had to take performance into account in everything we did. It’s a lost art that really needs to be used here. You should not be writing jQuery in SharePoint if you are sending a lot of data over the wire… Developer:  “Awesome… you can easily call SharePoint’s web services to retrieve and write data using SPServices!” Administrator: “Crap! you can easily call SharePoint’s web services to retrieve and write data using SPServices!” SPServices may indeed be the best thing that happened to SharePoint since the invention of SharePoint Saturdays by Godfather Lotter… BUT you HAVE to use it wisely! (I REFUSE to make the Spiderman reference). If you do not know what you are doing your code will bring back EVERY field and EVERY row from a list and push that over the internet with all that lovely XML wrapped around it. That can be a HUGE amount of data and will GREATLY impact performance! Calling several web service methods at the same time can cause the same problem and can negatively impact your SharePoint servers. 
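To make that concrete before we get into the tips, here is a rough sketch of the kind of call I'm talking about, using Marc Anderson's SPServices library (the list name, field names, and target element are made up for illustration). The point is that it only asks for the rows and fields it actually needs instead of dragging the whole list across the wire:

// Assumes jQuery and the SPServices plugin are already loaded on the page.
// The list name, field names, and #announcements element are hypothetical.
$(document).ready(function () {
  $().SPServices({
    operation: "GetListItems",
    async: true,
    listName: "Announcements",
    CAMLRowLimit: 50, // only pull the rows you need
    CAMLViewFields: "<ViewFields><FieldRef Name='Title' /><FieldRef Name='Created' /></ViewFields>",
    CAMLQuery: "<Query><Where><Eq><FieldRef Name='Status' /><Value Type='Text'>Active</Value></Eq></Where></Query>",
    completefunc: function (xData, Status) {
      // SPFilterNode ships with SPServices and walks the returned item rows
      $(xData.responseXML).SPFilterNode("z:row").each(function () {
        $("#announcements").append("<li>" + $(this).attr("ows_Title") + "</li>");
      });
    }
  });
});

Leave off the row limit, the ViewFields, and the CAML filter and you really will get every field of every row, wrapped in all that lovely XML.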
These problems, thankfully, are not difficult to rectify if you are careful: Limit list data retrieved Use CAML to reduce the number of rows returned and limit the fields returned using ViewFields.  You should definitely be doing this regardless. If you aren’t I hope your admin thumps you upside the head. Batch large list updates You may or may not have noticed that if you try to do large updates (hundreds of rows) that the performance is either completely abysmal or it fails over half the time. You can greatly improve performance and avoid timeouts by breaking up your updates into several smaller updates. I don’t know if there is a magic number for best performance, it really depends on how much data you are sending back more than the number of rows. However, I have found that 200 rows generally works well.  Play around and find the right number for your situation. Delay Web Service calls when possible One of the cool things about jQuery and SPServices is that you can delay queries to the server until they are actually needed instead of doing them all at once. This can lead to performance improvements over DataViewWebParts and even .NET code in the right situations. So, don’t load the data until it’s needed. In some instances you may not need to retrieve the data at all, so why retrieve it ALL the time? You should not be writing jQuery in SharePoint if there is a better solution… jQuery is NOT the silver bullet in SharePoint, it is not the answer to every question, it is just another tool in the developers toolkit. I urge all developers to know what options exist out there and choose the right one! Sometimes it will be jQuery, sometimes it will be .NET,  sometimes it will be XSL, and sometimes it will be some other choice… So, when is there a better solution to jQuery? When you can’t get away from performance problems Sometimes jQuery will just give you horrible performance regardless of what you do because of unavoidable obstacles. In these situations you are going to have to figure out an alternative. Can I do it with a DVWP or do I have to crack open Visual Studio? When you need to do something that jQuery can’t do There are lots of things you can’t do in jQuery like elevate privileges, event handlers, workflows, or interact with back end systems that have no web service interface. It just can’t do everything. When it can be done faster and more efficiently another way Why are you spending time to write jQuery to do a DataViewWebPart that would take 5 minutes? Or why are you trying to implement complicated logic that would be simple to do in .NET? If your answer is that you don’t have the option, okay. BUT if you do have the option don’t reinvent the wheel! Take advantage of the other tools. The answer is not always jQuery… sorry… the kool-aid tastes good, but sweet tea is pretty awesome too. You should not be using jQuery in SharePoint if you are a moron… Let’s finish up the blog on a high note… Yes.. it’s true, I sometimes type things just to get a reaction… guess this section title might be a good example, but it feels good sometimes just to type the words that a lot of us think… So.. don’t be that guy! Another good buddy of mine that works for Microsoft told me. “I loved jQuery in SharePoint…. until I had to support it.”. He went on to explain that some user was making several web service calls on a page using jQuery and then was calling Microsoft and COMPLAINING because the page took so long to load… DUH! 
What do you expect to happen when you are pushing that much data over the wire and are making that many web service calls at once!! It’s one thing to write that kind of code and accept it’s just going to take a while, it’s COMPLETELY another issue to do that and then complain when it’s not lightning fast!  Someone’s gene pool needs some chlorine. So, I think this is a nice summary of the blog… DON’T be that guy… don’t be a moron. How can you stop yourself from being a moron? Ah.. glad you asked, here are some tips: Think Is jQuery the right solution to my problem? Is there a better approach? What are the implications and pitfalls of using jQuery in this situation? Search What are others doing? Does someone have a better solution? Is there a third party library that does the same thing I need? Plan Write good jQuery. Limit calculations and data sent over the wire and don’t reinvent the wheel when possible. Test Okay, it works well on your machine. Try it on others ESPECIALLY if this is for an external site. Test with empty data. Test with hundreds of rows of data. Test as many scenarios as possible. Monitor those server resources to see the impact there as well. Ask the experts As smart as you are, there are people smarter than you. Even the experts talk to each other to make sure they aren't doing something stupid. And for the MOST part they are pretty nice guys. Marc Anderson and Christophe Humbert are two guys who regularly keep me in line. Make sure you aren’t doing something stupid. Repeat So, when you think you have the best solution possible, repeat the steps above just to be safe.  Conclusion jQuery is an awesome tool and has come in handy on many occasions. I’m even teaching a 1/2 day SharePoint & jQuery workshop at the upcoming SPTechCon in Boston if you want to berate me in person. However, it’s only as awesome as the developer behind the keyboard. It IS development and has its pitfalls. Knowledge and experience are invaluable to giving the user the best experience possible.  Let’s face it, in the end, no matter our opinions, prejudices, or ego providing our clients, customers, and users with the best solution possible is what counts. Period… end of sentence…

    Read the article

  • Using Oracle Proxy Authentication with JPA (eclipselink-Style)

    - by olaf.heimburger
    Security is a very intriguing topic. You will find it everywhere and you need to implement it everywhere. Yes, you do. Unfortunately, one can easily forget it while implementing the last mile. The Last Mile In a multi-tier application it is a common practice to use connection pools between the business layer and the database layer. Connection pools are quite useful to speed database connection creation and to split the load. Another very common practice is to use a specific user, often called a technical user, to connect to the database. This user has authentication and authorization rules that apply to all application users. Imagine you've put every effort to define roles for different types of users that use your application. These roles are necessary to differentiate between normal users, premium users, and administrators (I bet you will find or already have more roles in your application). While these user roles are pretty well used within your application, once the flow of execution enters the database everything is gone. Each and every user just has one role and is the same database user. Issues? What Issues? As long as things go well, this is not a real issue. However, things do not go well all the time. Once your application becomes famous, performance decreases in certain situations or, more importantly, current and upcoming regulations and laws require that your application must be able to apply different security measures on a per user role basis at every stage of your application. If you only have a bunch of users with the same name and role you are not able to find the application usage profile that causes the performance issue, or which user has accessed data that he/she is not allowed to. Another threat to your role concept is that databases tend to be used by different applications and tools. These tools can be developer tools like SQL*Plus, SQL Developer, etc. or end user applications like BI Publisher, Oracle Forms and so on. These tools have no idea of your application's role concept and access the database the way they think is appropriate. A big oversight for your perfect role model and a big nightmare for your Chief Security Officer. Speaking of the CSO brings up another issue: password management. Once your technical user account is compromised, every user is able to do things that he/she is not expected to do from the design of your application. Counter Measures In the Oracle world a common counter measure is to use Virtual Private Database (VPD). This restricts the values a database user can see to the allowed minimum. However, it doesn't help in regard to a connection pool user, because this one is still not the real user. Oracle Proxy Authentication Another feature of the Oracle database is Proxy Authentication. First introduced with version 9i, it is a quite useful feature for nearly every situation. The main idea behind Proxy Authentication is to create a crippled database user who has only connect rights. Even if this user is compromised, the risks are well understood and fairly limited. This user can be used in every situation in which you need to connect to the database, no matter which tool or application (see above) you use. The proxy user is perfect for multi-tier connection pools. CREATE USER app_user IDENTIFIED BY abcd1234; GRANT CREATE SESSION TO app_user; But what if you need to access real data? Well, this is the primary use case, isn't it? Now is the time to bring the application's role concept into play.
You define database roles that hold the grants for your identified user groups. Once you have these groups you grant access through the proxy user with the application role to the specific user. CREATE ROLE app_role_a; GRANT app_role_a TO hr; ALTER USER hr GRANT CONNECT THROUGH app_user WITH ROLE app_role_a; Now, hr has permission to connect to the database through the proxy user. Through the role you can restrict hr's rights to those needed by the application only. If hr connects to the database directly, all assigned roles and permissions apply. Testing the Setup To test the setup you can use SQL*Plus and connect to your database: $ sqlplus app_user[hr]/abcd1234 Java Persistence API The Java Persistence API (JPA) is a fairly easy means to build applications that retrieve data from the database and put it into Java objects. You use plain old Java objects (POJOs) and mix in some Java annotations that define how the attributes of the object are used for storing data from the database into the Java object. Here is a sample for objects from the HR sample schema EMPLOYEES table. When using Java annotations you only specify what cannot be deduced from the code. If your Java class name is Employee but the table name is EMPLOYEES, you need to specify the table name, otherwise it will fail. package demo.proxy.ejb; import java.io.Serializable; import java.sql.Timestamp; import java.util.List; import javax.persistence.Column; import javax.persistence.Entity; import javax.persistence.Id; import javax.persistence.JoinColumn; import javax.persistence.ManyToOne; import javax.persistence.NamedQueries; import javax.persistence.NamedQuery; import javax.persistence.OneToMany; import javax.persistence.Table; @Entity @NamedQueries({ @NamedQuery(name = "Employee.findAll", query = "select o from Employee o") }) @Table(name = "EMPLOYEES") public class Employee implements Serializable { @Column(name="COMMISSION_PCT") private Double commissionPct; @Column(name="DEPARTMENT_ID") private Long departmentId; @Column(nullable = false, unique = true, length = 25) private String email; @Id @Column(name="EMPLOYEE_ID", nullable = false) private Long employeeId; @Column(name="FIRST_NAME", length = 20) private String firstName; @Column(name="HIRE_DATE", nullable = false) private Timestamp hireDate; @Column(name="JOB_ID", nullable = false, length = 10) private String jobId; @Column(name="LAST_NAME", nullable = false, length = 25) private String lastName; @Column(name="PHONE_NUMBER", length = 20) private String phoneNumber; private Double salary; @ManyToOne @JoinColumn(name = "MANAGER_ID") private Employee employee; @OneToMany(mappedBy = "employee") private List<Employee> employeeList; public Employee() { } public Employee(Double commissionPct, Long departmentId, String email, Long employeeId, String firstName, Timestamp hireDate, String jobId, String lastName, Employee employee, String phoneNumber, Double salary) { this.commissionPct = commissionPct; this.departmentId = departmentId; this.email = email; this.employeeId = employeeId; this.firstName = firstName; this.hireDate = hireDate; this.jobId = jobId; this.lastName = lastName; this.employee = employee; this.phoneNumber = phoneNumber; this.salary = salary; } public Double getCommissionPct() { return commissionPct; } public void setCommissionPct(Double commissionPct) { this.commissionPct = commissionPct; } public Long getDepartmentId() { return departmentId; } public void setDepartmentId(Long departmentId) { this.departmentId = departmentId; } public String getEmail() {
return email; } public void setEmail(String email) { this.email = email; } public Long getEmployeeId() { return employeeId; } public void setEmployeeId(Long employeeId) { this.employeeId = employeeId; } public String getFirstName() { return firstName; } public void setFirstName(String firstName) { this.firstName = firstName; } public Timestamp getHireDate() { return hireDate; } public void setHireDate(Timestamp hireDate) { this.hireDate = hireDate; } public String getJobId() { return jobId; } public void setJobId(String jobId) { this.jobId = jobId; } public String getLastName() { return lastName; } public void setLastName(String lastName) { this.lastName = lastName; } public String getPhoneNumber() { return phoneNumber; } public void setPhoneNumber(String phoneNumber) { this.phoneNumber = phoneNumber; } public Double getSalary() { return salary; } public void setSalary(Double salary) { this.salary = salary; } public Employee getEmployee() { return employee; } public void setEmployee(Employee employee) { this.employee = employee; } public List getEmployeeList() { return employeeList; } public void setEmployeeList(List employeeList) { this.employeeList = employeeList; } public Employee addEmployee(Employee employee) { getEmployeeList().add(employee); employee.setEmployee(this); return employee; } public Employee removeEmployee(Employee employee) { getEmployeeList().remove(employee); employee.setEmployee(null); return employee; } } JPA could be used in standalone applications and Java EE containers. In both worlds you normally create a Facade to retrieve or store the values of the Entities to or from the database. The Facade does this via an EntityManager which will be injected by the Java EE container. Here is sample Facade Session Bean for a Java EE container. 
package demo.proxy.ejb; import java.util.HashMap; import java.util.List; import javax.ejb.Local; import javax.ejb.Remote; import javax.ejb.Stateless; import javax.persistence.EntityManager; import javax.persistence.PersistenceContext; import javax.persistence.Query; import javax.interceptor.AroundInvoke; import javax.interceptor.InvocationContext; import oracle.jdbc.driver.OracleConnection; import org.eclipse.persistence.config.EntityManagerProperties; import org.eclipse.persistence.internal.jpa.EntityManagerImpl; @Stateless(name = "DataFacade", mappedName = "ProxyUser-TestEJB-DataFacade") @Remote @Local public class DataFacadeBean implements DataFacade, DataFacadeLocal { @PersistenceContext(unitName = "TestEJB") private EntityManager em; private String username; public Object queryByRange(String jpqlStmt, int firstResult, int maxResults) { // setSessionUser(); Query query = em.createQuery(jpqlStmt); if (firstResult > 0) { query = query.setFirstResult(firstResult); } if (maxResults > 0) { query = query.setMaxResults(maxResults); } return query.getResultList(); } public Employee persistEmployee(Employee employee) { // setSessionUser(); em.persist(employee); return employee; } public Employee mergeEmployee(Employee employee) { // setSessionUser(); return em.merge(employee); } public void removeEmployee(Employee employee) { // setSessionUser(); employee = em.find(Employee.class, employee.getEmployeeId()); em.remove(employee); } /** select o from Employee o */ public List getEmployeeFindAll() { Query q = em.createNamedQuery("Employee.findAll"); return q.getResultList(); } Putting Both Together To use Proxy Authentication with JPA and within a Java EE container you have to take care of the additional requirements: Use an OCI JDBC driver Provide the user name that connects through the proxy user Use an OCI JDBC driver To use the OCI JDBC driver you need to set up your JDBC data source file to use the correct JDBC URL. hr jdbc:oracle:oci8:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=localhost)(PORT=1521))(CONNECT_DATA=(SID=XE))) oracle.jdbc.OracleDriver user app_user 62C32F70E98297522AD97E15439FAC0E SQL SELECT 1 FROM DUAL jdbc/hrDS Application Additionally you need to make sure that the version of the shared libraries of the OCI driver match the version of the JDBC driver in your Java EE container or Java application and are within your PATH (on Windows) or LD_LIBRARY_PATH (on most Unix-based systems). Installing the Oracle Database Instant Client software works perfectly. Provide the user name that connects through the proxy user This part needs some modification of your application software and session facade. Session Facade Changes In the Session Facade we must ensure that every call that goes through the EntityManager is prepared correctly and uniquely assigned to this session. The second point is really important, as the EntityManager works with a connection pool and cannot guarantee that we set the proxy user on the connection that will be used for the database activities. To avoid changing every method call of the Session Facade we provide a method to set the username of the user that connects through the proxy user. This method needs to be called by the Facade client before doing anything else. public void setUsername(String name) { username = name; } Next we provide a means to instruct the TopLink EntityManager Delegate to use Oracle Proxy Authentication. (I love small helper methods to hide the nitty-gritty details and avoid repeating myself.)
private void setSessionUser() { setSessionUser(username); } private void setSessionUser(String user) { if (user != null && !user.isEmpty()) { EntityManagerImpl emDelegate = ((EntityManagerImpl)em.getDelegate()); emDelegate.setProperty(EntityManagerProperties.ORACLE_PROXY_TYPE, OracleConnection.PROXYTYPE_USER_NAME); emDelegate.setProperty(OracleConnection.PROXY_USER_NAME, user); emDelegate.setProperty(EntityManagerProperties.EXCLUSIVE_CONNECTION_MODE, "Always"); } } The final step is to use the EJB 3.0 AroundInvoke interceptor. This interceptor will be called around every method invocation. We therefore check whether the Facade methods will be called or not. If so, we set the user for proxy authentication and the normal method flow continues. @AroundInvoke public Object proxyInterceptor(InvocationContext invocationCtx) throws Exception { if (invocationCtx.getTarget() instanceof DataFacadeBean) { setSessionUser(); } return invocationCtx.proceed(); } Benefits Using Oracle Proxy Authentication has a number of additional benefits apart from implementing the role model of your application: Fine-grained access control for temporary users of the account, without compromising the original password. Enabling database auditing and logging. Better identification of performance bottlenecks. References Effective Oracle Database 10g Security by Design, David Knox TopLink Developer's Guide, Chapter 98

    Read the article

  • Microsoft Declares the Future of ASP.NET is Web API

    - by sbwalker
    Sitting on a plane on my way home from Tech Ed 2012 in Orlando, I thought it would be a good time to jot down some key takeaways from this year’s conference. Some of these items I have known since the Microsoft MVP Summit which occurred in Redmond in late February ( but due to NDA restrictions I could not share them with the developer community at large ) and some of them are a result of insightful conversations with a wide variety of industry insiders and Microsoft employees at the conference. First, let’s travel back in time 4 years to the Microsoft MVP Summit in 2008. Microsoft was facing some heat from market newcomer Ruby on Rails and responded with a new web development framework of its own, ASP.NET MVC. At the Summit they estimated that MVC would only be applicable for ~10% of all new web development projects. Based on that prediction I questioned why they were investing such considerable resources for such a relative edge case, but my guess is that they felt it was an important edge case at the time as some of the more vocal .NET evangelists as well as some very high profile start-ups ( ie. Twitter ) had publicly announced their intent to use Rails. Microsoft made a lot of noise about MVC. In fact, they focused so much of their messaging and marketing hype around MVC that it appeared that WebForms was essentially dead. Yes, it may have been true that Microsoft continued to invest in WebForms, but from an outside perspective it really appeared that MVC was the only framework getting any real attention. As a result, MVC started to gain market share. An inside source at Microsoft told me that MVC usage has grown at a rate of about 5% per year and now sits at ~30%. Essentially by focusing so much marketing effort on MVC, Microsoft actually created a larger market demand for it.  This is because in the Microsoft ecosystem there is somewhat of a bandwagon mentality amongst developers. If Microsoft spends a lot of time talking about a specific technology, developers get the perception that it must be really important. So rather than choosing the right tool for the job, they often choose the tool with the most marketing hype and then try to sell it to the customer. In 2010, I blogged about the fact that MVC did not make any business sense for the DotNetNuke platform. This was because our ecosystem relied on third party extensions which were dependent on the WebForms model. If we migrated the core to MVC it would mean that all of the third party extensions would no longer be compatible, which would be an irresponsible business decision for us to make at the expense of our users and customers. However, this did not stop the debate from continuing to occur in our ecosystem. Clearly some developers had drunk Microsoft’s Kool-Aid about MVC and were of the mindset, to paraphrase an old Scottish saying, “If its not MVC, it’s crap”. Now, this is a rather ignorant position to take as most of the benefits of MVC can be achieved in WebForms with solid architecture and responsible coding practices. Clean separation of concerns, unit testing, and direct control over page output are all possible in the WebForms model – it just requires diligence and discipline. So over the past few years some horror stories have begun to bubble to the surface of software development projects focused on ground-up rewrites of web applications for the sole purpose of migrating from WebForms to MVC. 
These large-scale rewrites were typically initiated by engineering teams with only a single argument driving the business decision: that Microsoft was promoting MVC as “the future”. These ill-fated rewrites offered no benefit to end users or customers and in fact resulted in less stable, less scalable, and more complicated systems – basically taking one step forward and two full steps back. A case in point is the announcement earlier this week that a popular open source .NET CMS provider has decided to pull the plug on their new MVC product, which has been under active development for more than 18 months, and revert to WebForms. The availability of multiple server-side development models has deeply fragmented the Microsoft developer community. Some folks like to compare it to the age-old VB vs. C# language debate. However, the VB vs. C# language debate was ultimately more of a religious war because at least the two dominant programming languages were compatible with one another and could be used interchangeably. The issue with WebForms vs. MVC is much more challenging. This is because the messaging from Microsoft has positioned the two solutions as being incompatible with one another, and as a result web developers feel like they are forced to choose one path or another. Yes, it is true that it has always been technically possible to use WebForms and MVC in the same project, but the tooling support has always made this feel “dirty”. The fragmentation has also made it difficult to attract newcomers, as the perceived barrier to entry for learning ASP.NET has become higher. As a result many new software developers entering the market are gravitating to environments where the development model seems simpler and more intuitive (e.g. PHP or Ruby). At the same time that the Web Platform team was busy promoting ASP.NET MVC, the Microsoft Office team has been promoting SharePoint as a platform for building internal enterprise web applications. SharePoint has great penetration in the enterprise and over time has been enhanced with improved extensibility capabilities for software developers. But, like many other mature enterprise ASP.NET web applications, it is built on the WebForms development model. Similar to DotNetNuke, SharePoint leverages a rich third party ecosystem for both generic web controls and more specialized WebParts – both of which rely on WebForms. So basically this resulted in a situation where the Web Platform group had headed off in one direction and the Office team had gone in another direction, and the end customer was stuck in the middle trying to figure out what to do with their existing investments in Microsoft technology. It really emphasized the perception that the left hand was not speaking to the right hand, as strategically speaking there did not seem to be any high-level plan from Microsoft to ensure consistency and continuity across the different product lines. The introduction of ASP.NET MVC also made some of the third party control vendors scratch their heads and wonder what the heck Microsoft was thinking. The original value proposition of ASP.NET over Classic ASP was the ability for web developers to emulate the highly productive desktop development model by using abstract components for creating rich, interactive web interfaces. Web control vendors like Telerik, Infragistics, DevExpress, and ComponentArt had all built sizable businesses offering powerful user interface components to WebForms developers. 
And even after MVC was introduced these vendors continued to improve their products, offering greater productivity and a superior user experience via AJAX to what was possible in MVC. And since many developers were comfortable and satisfied with these third party solutions, the demand remained strong and the third party web control market continued to prosper despite the availability of MVC. While all of this was going on in the Microsoft ecosystem, there has also been a fundamental shift in the general software development industry. Driven by the explosion of Internet-enabled devices, the focus has now centered on service-oriented architecture (SOA). Service-oriented architecture is all about defining a public API for your product that any client can consume; whether it’s a native application running on a smart phone or tablet, a web browser taking advantage of HTML5 and Javascript, or a rich desktop application running on a PC. REST-based services which utilize the less verbose characteristics of JSON as a transport mechanism, have become the preferred approach over older, more bloated SOAP-based techniques. SOA also has the benefit of producing a cross-platform API, as every major technology stack is able to interact with standard REST-based web services. And for web applications, more and more developers are turning to robust Javascript libraries like JQuery and Knockout for browser-based client-side development techniques for calling web services and rendering content to end users. In fact, traditional server-side page rendering has largely fallen out of favor, resulting in decreased demand for server-side frameworks like Ruby on Rails, WebForms, and (gasp) MVC. In response to these new industry trends, Microsoft did what it always does – it immediately poured some resources into developing a solution which will ensure they remain relevant and competitive in the web space. This work culminated in a new framework which was branded as Web API. It is convention-based and designed to embrace native HTTP standards without copious layers of abstraction. This framework is designed to be the ultimate replacement for both the REST aspects of WCF and ASP.NET MVC Web Services. And since it was developed out of band with a dependency only on ASP.NET 4.0, it means that it can be used immediately in a variety of production scenarios. So at Tech Ed 2012 it was made abundantly clear in numerous sessions that Microsoft views Web API as the “Future of ASP.NET”. In fact, one Microsoft PM even went as far as to say that if we look 3-4 years into the future, that all ASP.NET web applications will be developed using the Web API approach. This is a fairly bold prediction and clearly telegraphs where Microsoft plans to allocate its resources going forward. Currently Web API is being delivered as part of the MVC4 package, but this is only temporary for the sake of convenience. It also sounds like there are still internal discussions going on in terms of how to brand the various aspects of ASP.NET going forward – perhaps the moniker of “ASP.NET Web Stack” coined a couple years ago by Scott Hanselman and utilized as part of the open source release of ASP.NET bits on Codeplex a few months back will eventually stick. Web API is being positioned as the unification of ASP.NET – the glue that is able to pull this fragmented mess back together again. The  “One ASP.NET” strategy will promote the use of all frameworks - WebForms, MVC, and Web API, even within the same web project. 
Basically the message is to utilize the appropriate aspects of each framework to solve your business problems. Instead of navigating developers to a fork in the road, the plan is to educate them that “hybrid” applications are a great strategy for delivering solutions to customers. In addition, the service-oriented approach coupled with client-side development promoted by Web API can effectively be used in both WebForms and MVC applications. So this means it is also relevant to application platforms like DotNetNuke and SharePoint, which means that it starts to create a unified development strategy across all ASP.NET product lines once again. And so what about MVC? There have actually been rumors floated that MVC has reached a stage of maturity where, similar to WebForms, it will be treated more as a maintenance product line going forward (MVC4 may in fact be the last significant iteration of this framework). This may sound alarming to some folks who have recently adopted MVC, but it really shouldn’t, as both WebForms and MVC will continue to play a vital role in delivering solutions to customers. They will just not be the primary area where Microsoft is spending the majority of its R&D resources. That distinction will obviously go to Web API. And when the question comes up of why not enhance MVC to make it work with Web API, you must take a step back and look at this from a higher level to see that it really makes no sense. MVC is a server-side page compositing framework, whereas Web API promotes client-side page compositing with a heavy focus on web services. Making MVC work well with Web API would require a complete rewrite of MVC, and at the end of the day there would be no upgrade path for existing MVC applications. So it really does not make much business sense. So what does this have to do with DotNetNuke? Well, around 8-12 months ago we recognized the software industry trends towards web services and client-side development. We decided to utilize a “hybrid” model which would provide compatibility for existing modules while at the same time providing a bridge for developers who wanted to utilize more modern web techniques. Customers who like the productivity and familiarity of WebForms can continue to build custom modules using the traditional approach. However, in DotNetNuke 6.2 we also introduced a new Services Framework which is actually built on top of MVC2 (we chose to leverage MVC because it had the most intuitive, lightweight REST implementation in the .NET stack). The Services Framework allowed us to build some rich interactive features in DotNetNuke 6.2, including the Messaging and Notification Center and Activity Feed. But based on where we know Microsoft is heading, it makes sense for the next major version of DotNetNuke (which is expected to be released in Q4 2012) to migrate from MVC2 to Web API. This will likely result in some breaking changes in the Services Framework, but we feel it is the best approach for ensuring the platform remains highly modern and relevant. The fact that our development strategy is perfectly aligned with the “One ASP.NET” strategy from Microsoft means that our customers and developer community can be confident in their current and future investments in the DotNetNuke platform.
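The cross-platform point made above is easy to see in practice: any stack that can issue an HTTP request and parse JSON can consume such a service, which is exactly why a REST-based public API outlives any particular server-side page framework. A minimal Java sketch of such a client follows; the endpoint URL and the bare-bones response handling are assumptions for illustration only, since the article itself shows no client code.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Minimal sketch: consume a REST endpoint that returns JSON.
// The URL is a made-up example; nothing here is specific to ASP.NET on the client side.
public class RestClientSketch {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/api/tasks");          // hypothetical Web API style endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Accept", "application/json");      // ask for the JSON representation

        StringBuilder json = new StringBuilder();
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                json.append(line);
            }
        }
        System.out.println(json);   // a real client would parse this JSON and render it
    }
}

A browser-based client would do the same thing with a jQuery ajax call and a Knockout view model; the transport contract (HTTP plus JSON) is identical.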

    Read the article

  • SOA Community Newsletter: nouvelle lettre !

    - by mseika
SOA PARTNER COMMUNITY NEWSLETTER, AUGUST 2012

Dear SOA partner community member,

Have you submitted your feedback on the SOA Partner Community Survey 2012? This is the last chance to participate in the survey. We recommend you complete the survey and help us to improve our SOA Community. Thanks to all attendees and trainers for their participation in the excellent Fusion Middleware Summer Camps held in Lisbon and Munich. I would also like to thank you for the great feedback and the nice reports provided by AMIS Technology Blog & Middleware by Link Consulting. Most of our courses were overbooked; if you did not get a chance to attend or missed it, we offer a wide range of online training and the course material. A key take-away from the advanced BPM course is to become an expert in ADF. Here is the course from Grant Ronald: Learn Advanced ADF, available online. The Link Consulting Team became experts in SOA Governance with EAMS and Oracle Enterprise Repository! We always encourage our community members to share their best practices and are very keen to publish them. Please let us know if you want to share your best practices through this medium. We encourage you to make use of the Specialization benefits - this month we are giving an opportunity to Promote Your SOA & BPM Events.

Jürgen Kress, Oracle SOA & BPM Partner Adoption EMEA

NEW CONTENT

Presentations & Training material:
- OFM Summer Camps
- Promote Your SOA & BPM Events
- Advanced ADF Online, For Free By Grant
- BPM 11g Customer Stories & Solution Catalog & Process Accelerators
- Delivering SOA Governance with EAMS by Link Consulting Team
- WebLogic Server Provisioning and Patching

News from our Partners & Community: Updated material by Oracle Connect and Network, SOA Blogs, SOA on Facebook, SOA on LinkedIn, SOA on Twitter, Mix SOA Forum, SOA Workspace

PRESENTATIONS & TRAINING MATERIAL

OFM SUMMER CAMPS

Thanks to all attendees who invested their time and utilized the opportunity to attend the Summer Camps! Due to the high demand for most of the trainings, we had a long waiting list of partners who were keen to attend. We would like to give our special thanks to all trainers, who delivered excellent workshops! Most of the presentations and course material have been posted on our SOA Community Workspace and WebLogic Community Workspace. You can access the content only if you are a registered community member. To register for the SOA Community please click here. You can register for the WebLogic Community here. To find out the first impressions of the event please visit our Facebook pages: www.facebook.com/WebLogicCommunity & www.facebook.com/soacommunity or the Picasa Album. Thanks for the excellent blog posts from AMIS Technology Blog & Middleware by Link Consulting. Let us know if you published a twitter blog on @soacommunity & @wlscommunity. We will be pleased to publish it in our Newsletters.

BPM Course Quotes

“It's always easy, if you know what you are doing” - Torsten Winterberg, Opitz
“The best ideas are the ideas from the best” - Filipe Sequeria, Primesoft
“Best invest in the education in the last 12 months” - Richard Schaller, IPT
“Practice best practice with the best instructor” - Graham Lamond, Capgemini
“If you have basic BPM knowledge, this is the course to really master it” - Diogo Henriques, Link Consulting
“Very good trainers, lot of work.
Lot of fun as well” - Matthias Gris, Workflow Factory
“If you like to accelerate in Oracle come to the training to bring it all together” - Marcel van der Glind, Amis

ADF Course Quotes

"Excellent training, great opportunity to network!" - Frank Houweling, Amis
"Lots of fun and good ideas" - Ana Santiago, GFI
"Learn ADF, worth it Fusion Apps is the future" - Miguel Delgadillo, STO Consulting
"The best way to learn Fusion Middleware from the #1" - Alexandro Montantes, STO Consulting
"Be advanced to be the first” - Dimitar Petrov, Fadata
"Great opportunity to suck all the knowledge out of some very experienced product managers” - Wilfred von der Deijl, The Future Group

WebLogic Course Quotes

“Oracle trainings are the best” - Pedro Neto, Novobas
“Excellent training, well organized” - Pedro Antunh, Capgemini
“This course dives you into Oracle WebLogic giving you a quick start on benefiting from Fusion Apps” - Leonardo Fernandes, Outsystems

Additional Quotes

“Thanks a lot again for organizing such a great and informative Summer Camp. Both training and networking were organized very professionally. I have gained tons of very useful info, which will definitely help to increase quality of our future projects.” - Daniel Fasko, fss-group.com
“I didn’t get the chance yesterday to thank you for a most enjoyable and thoroughly educational time I had in Munich over the last few days.” - Jeroen Bakker, Ordina
“Just to congratulate you on a great event, not only today but also in the previous days of training. As we know, a very good organization and, as a native Portuguese that knows Lisbon very good, a nice choice of places to visit. Looking forward to come again next year.” - Pedro Miguel Neto, Novobase

PROMOTE YOUR SOA & BPM EVENTS

The Partner Event Publisher has just been made available to all SOA & BPM specialized partners in EMEA. Partners now have the opportunity to publish their events to the Oracle.com/events site and spread the word on their upcoming live in-person and/or live webcast events. See the demo below and click here to read more information.

ADVANCED ADF ONLINE, FOR FREE BY GRANT

The second part of the advanced ADF online eCourse is live now! This covers the advanced topics of region and region interaction as well as getting down and dirty with some of the layout features of ADF Faces, skinning and DVT components. The aim of this course is to give you a self-paced learning aid which covers the more advanced topics of ADF development. The content is developed by Product Management and our Curriculum development teams and is based on advanced training material we have been running internally for about 18 months. We will get started on the next chapter, but in the meantime, please have a look at chapters one and two.

BPM 11G CUSTOMER STORIES & SOLUTION CATALOG & PROCESS ACCELERATORS

Stories

Everyone loves a good story on planning or implementing a BPM strategy. Everyone wants to hear how it was done before, what worked, and what was achieved. If you have achieved success with BPM, we are very keen to hear your stories and examples of how your customers use it. We receive lots of requests from people who are thinking of using BPM to solve a specific problem or in combination with a specific technology to talk to someone who has done it before. These stories are invaluable. Drop us the details of anything you think is relevant and we will follow up on it. 
As one good deed deserves another, we will do our best to give you stories if you need them to show that where you are going, others have trodden before. Send your stories to us using this e-mail link and we will share them among other like-minded people.

Solution Catalogue

This summer, Oracle is launching a solution catalogue specifically intended for partners. If you have delivered a successful implementation in BPM and think it could be reused and applied again in a similar scenario in the same industry or in a similar environment, then we are keen to know about it and will add it to the solution catalogue. The solution catalogue will showcase successful BPM solutions both inside and outside Oracle. Be in touch with us on this e-mail link and we will make sure to add your solution.

Process Accelerators

Finally, if you have specific processes that you are an expert on, have implemented at a customer, and want to work with us on getting these productised, then we would love to know about it. The process accelerator programme is explained in the most recent SOA/BPM Community Newsletter, but again feel free to contact us if you want to get involved. Good luck with BPM and let us know how we can help.

Barry O'Reilly, Director BPM, [email protected]

DELIVERING SOA GOVERNANCE WITH EAMS BY LINK CONSULTING TEAM

In the last 12 years Link Consulting has been building its presence in specific areas such as Governance and Architecture, both in terms of practices and methodologies, and in terms of products, know-how and technological expertise. The Enterprise Architecture Management System - Oracle Enterprise Edition (EAMS - OER Edition) is the result of this experience: it combines the architecture management solution with OER in order to deliver a product specialized for SOA Governance that brings together the best of both worlds in a solution that enables SOA Governance projects, initiatives and programs.

Enterprise Architecture Management System

The Enterprise Architecture Management System (EAMS) is an automation-based solution that enables the efficient management of Enterprise Architectures. The solution uses configured enterprise repositories and takes advantage of their features to provide automation capabilities to the users. EAMS provides capabilities to create/customize/analyze repository data, architectural blueprints, reports and analytic charts.

Oracle Enterprise Repository

Oracle Enterprise Repository (OER) is one of the major and central elements of the Oracle SOA Governance solution. Oracle Enterprise Repository provides the tools to manage and govern the metadata for any type of software asset, from business processes and services to patterns, frameworks, applications, components, and models. OER maps the relationships and inter-dependencies that connect those assets to improve impact analysis, promote and optimize their reuse, and measure their impact on the bottom line. It provides the visibility, feedback, controls, and analytics to keep your SOA on track to deliver business value. The intense focus on automation helps to overcome barriers to SOA adoption and streamline governance throughout the lifecycle. Core capabilities of OER include: Asset Management, Asset Lifecycle Management, Usage Tracking, Service Discovery, Version Management, Dependency Analysis, and Portfolio Management.

EAMS - OER Edition

The solution takes the advantages and features from both products and combines them in a symbiotic tool that enhances the quality of SOA Governance initiatives and programs. 
EAMS is able to produce a vast number of outputs by combining its analytical engine, SOA-specific configurations, and the assets in OER and other related tools, catalogs and repositories. The configurations encompass not only the extendable parametrization of the metadata but also fully configurable blueprints, PowerPoint reports, charts and queries.

The SOA blueprints

The solution comes with a set of predefined architectural representations that help the organization better perceive their SOA landscape. More blueprints can easily be created in order to accommodate the organization's needs in terms of detail, audience and metadata.

Charts & Dashboards

The solution encompasses a set of predefined charts and dashboards that promote a more agile way to control and explore the assets.

Time Based Visualization

All representations are time bound, and with EAMS - OER you can truly govern SOA with a complete view of the Past, Present and Future. The solution delivers Gap Analysis and a project-oriented approach while taking into consideration the As-Was, As-Is and To-Be. Time based visualization differentiating factors:

- Extensive automation and maintenance of architectural representations
- Organization wide solution
- Easy access and navigation to and between all architectural artifacts and representations
- Flexible meta-model, customization and extensibility capabilities
- Lifecycle management and enforcement of the time dimension over all the repository content
- Profile based customization
- Comprehensive visibility
- Architectural alignment
- Friendly and striking user interfaces

For more information on EAMS visit us here. For more information on SOA visit us here.

WEBLOGIC SERVER PROVISIONING AND PATCHING

For access to the Oracle demo systems please visit OPN and talk to your Partner Expert. SOA Suite and BPM Suite run on WebLogic! We are pleased to announce the availability of a WebLogic Server Management demo that showcases some of the key provisioning and patching capabilities of WebLogic Server Management Pack Enterprise Edition (EE). To learn more about these features - as well as other features of the pack - please visit the pack's saleskit page.

Demo Highlights

The demo showcases the following capabilities:

- Patching Oracle WebLogic Servers
- Standardizing WebLogic Server Patch Rollouts
- Creating a WebLogic Domain Provisioning Profile
- Cloning a WebLogic Domain from a Provisioning Profile
- Deploying a Java EE Application
- Scaling Out an Oracle WebLogic Cluster

Demo Instructions

Go to the DSS website for Oracle Partners. On the Standard Demo Launchpad page, under the “Software Lifecycle Automation” section, click on the link “EM Cloud Control 12c WLS Provisioning and Patching” (tagged as “NEW”). The specific demo launchpad page contains a link to the detailed demo script with instructions on how to show the demo.

    Read the article

  • Stop Spinning Your Wheels&hellip; Sage Advice for Aspiring Developers

    - by Mark Rackley
    So… lately I’ve been tasked with helping bring some non-developers over the hump and become full-fledged, all around, SharePoint developers. Well, only time will tell if I’m successful or a complete failure. Good thing about failures though, you know what NOT to do next time! Anyway, I’ve been writing some sort of code since I was about 10 years old; so I sometimes take for granted the effort some people have to go through to learn a new technology. I guess if I had to say I was an “expert” in one thing it would be learning (and getting “stuff” done) in new technologies. Maybe that’s why I’ve embraced SharePoint and the SharePoint community. SharePoint is the first technology I haven’t been able to master or get everything done without help from other people. I KNOW I’ll never know it all and I learn something new every day.  It keeps it interesting, it keeps me motivated, and keeps me involved. So, what some people may consider a downside of SharePoint, I definitely consider a plus. Crap.. I’m rambling. Where was I? Oh yeah… me trying to be helpful. Like I said, I am able to quickly and effectively pick up new languages, technology, etc. and put it to good use. Am I just brilliant? Well, my mom thinks so.. but maybe not. Maybe I’ve just been doing it for a long time…. 25 years in some form or fashion… wow I’m old… Anyway, what I lack in depth I make up for in breadth and being the “go-to” guy wherever I work when someone needs to “get stuff done”.  Let’s see if I can take some of that experience and put it to practical use to help new people get up to speed faster, learn things more effectively, and become that go-to guy. First off…  make sure you… Know The Basics I don’t have the time to teach new developers the basics, but you gotta know them. I’ve only been “taught” two languages.. Fortran 77 and C… everything else I’ve picked up from “doing”. I HAD to know the basics though, and all new developers need to understand the very basics of development.  97.23% of all languages will have the following: Variables Functions Arrays If statements For loops / While loops If you think about it, most development is “if this, do this… or while this, do this…”.  “This” may be some unique method to your language or something you develop, but the basics are the basics. YES there are MANY other development topics you need to understand, but you shouldn’t be scratching your head trying to figure out what a ”for loop” is… (Also learn about classes and hashtables as quickly as possible). Once you have the basics down it makes it much easier to… Learn By Doing This may just apply to me and my warped brain.  I don’t learn a new technology by reading or hearing someone speak about it. I learn by doing. It does me no good to try and learn all of the intricacies of a new language or technology inside-and-out before getting my hands dirty. Just show me how to do one thing… let me get that working… then show me how to do the next thing.. let me get that working… Now, let’s see what I can figure out on my own. Okay.. now it starts to make sense. I see how the language works, I can step through the code, and before you know it.. I’m productive in a new technology. Be careful here though…. make sure you… Don’t Reinvent The Wheel People have been writing code for what… 50+ years now? So, why are you trying to tackle ANYTHING without first Googling it with Bing to see what others have done first? When I was first learning C# (I had come from a Java background) I had to call a web service.  Sure! No problem! 
I’d done this many times in Java. So, I proceeded to write an HTTP Handler, called the Web Service and it worked like a charm!!! Probably about 2.3 seconds after I got it working completely someone says to me “Why didn’t you just add a Web Reference?” Really? You can do that? oops… I just wasted a lot of time. Before undertaking the development of any sort of utility method in a new language, make sure it’s not already handled for you… Okay… you are starting to write some code and are curious about the possibilities? Well… don’t just sit there… Try It And See What Happens This is actually one of my biggest pet peeves. “So… ‘x++’ works in C#, but does it also work in JavaScript?” Really? Did you just ask me that? In the time it took you to type that email, press the send button, for me to receive the email, get around to reading it, and reply with “yes”, you could have tested it 47 times and known the answer! Just TRY it! See what happens! You aren’t doing brain surgery. You aren’t going to kill anyone, and you BETTER not be developing in production. So, you are not going to crash any production systems!! Seriously! Get off your butt and just try it yourself. The extra added benefit is that if it doesn’t work, the absolute best way to learn is to… Learn From Your Failures I don’t know about you… but if I screw up and something doesn’t work, I learn A LOT more debugging my problem than if everything magically worked. It’s okay that you aren’t perfect! Not everyone can be me? In the same vein… don’t ask someone else to debug your problem until you have made a valiant attempt to do so yourself. There’s nothing quite like stepping through code line by line to see what it’s REALLY doing… and you’ll never feel more stupid sometimes than when you realize WHY it’s not working.. but you realize... you learn... and you remember. There is nothing wrong with failure as long as you learn from it. As you start writing more and more and more code make sure that you ALWAYS… Develop for Production You will soon learn that the “prototype” you wrote last week to show as a “proof of concept” is going to go directly into production no matter how much you beg and plead and try to explain it’s not ready to go into production… it’s going to go straight there.. and it’s like herpes.. it doesn’t go away and there’s no fixing it once it’s in there. So, why not write ALL your code like it will be put in production? It MIGHT take a little longer, but in the long run it will be easier to maintain, get help on, and you won’t be embarrassed that it’s sitting on a production server for everyone to use and see. So, now that you are getting comfortable and writing code for production it is important to remember the… KISS Principle… Learn It… Love It… Keep It Simple Stupid Seriously.. don’t try to show how smart you are by writing the most complicated code in history. Break your problem up into discrete steps and write each step. If it turns out you have some redundancy, you can always go back and tweak your code later. How bad is it when you write code that LOOKS cocky? I’ve seen it before… some of the most abstract and complicated classes when a class wasn’t even needed! Or the most elaborate unreadable code jammed into one really long line when it could have been written in three lines, performed just as well, and been SOOO much easier to maintain. Keep it clear and simple.. baby steps people. 
This will help you learn the technology, debug problems, AND it will help others help you find your problems if they don’t have to decipher the Dead Sea Scrolls just to figure out what you are trying to do…. Really.. don’t be that guy… try to curb your ego and… Keep an Open Mind No matter how smart you are… how fast you type… or how much you get paid, don’t let your ego get in the way. There is probably a better way to do everything you’ve ever done. Don’t become so cocky that you can’t think someone knows more than you. There’s a lot of brilliant, helpful people out there willing to show you tricks if you just give them a chance. A very super-awesome developer once told me “So what if you’ve been writing code for 10 years or more! Does your code look basically the same? Are you not growing as a developer?” Those 10 years become pretty meaningless if you just “know” that you are right and have not picked up new tips, tricks, methods, and patterns along the way. Learn from others and find out what’s new in development land (you know you don’t have to specifically use pointers anymore??). Along those same lines… If it’s not working, first assume you are doing something wrong. You have no idea how much it annoys people who are trying to help you when you first assume that the help they are trying to give you is wrong. Just MAYBE… you… the person learning is making some small mistake? Maybe you didn’t describe your problem correctly? Maybe you are using the wrong terminology? “I did exactly what you said and it didn’t work.”  Oh really? Are you SURE about that? “Your solution doesn’t work.”  Well… I’m pretty sure it works, I’ve used it 200 times… What are you doing differently? First try some humility and appreciation.. it will go much further, especially when it turns out YOU are the one that is wrong. When all else fails…. Try Professional Training Some people just don’t have the mindset to go and figure stuff out. It’s a gift and not everyone has it. If everyone could do it I wouldn’t have a job and there wouldn’t be professional training available.  So, if you’ve tried everything else and no light bulbs are coming on, contact the experts who specialize in training. Be careful though, there is bad training out there. Want to know the names of some good places? Just shoot me a message and I’ll let you know. I’m boycotting endorsing Andrew Connell anymore until I get that free course dangit!! So… that’s it.. that’s all I got right now. Maybe you thought all of this is common sense, maybe you think I’m smoking crack. If so, don’t just sit there, there’s a comments section for a reason. Finally, what about you? What tips do you have to help this aspiring to learn the dark arts??

    Read the article

  • Configuring Oracle iPlanet WebServer / Oracle Traffic Director to use crypto accelerators on T4-1 servers

    - by mv
Configuring Oracle iPlanet Web Server / Oracle Traffic Director to use crypto accelerators on T4-1 servers

Jyri had written a technical article on Configuring Solaris Cryptographic Framework and Sun Java System Web Server 7 on Systems With UltraSPARC T1 Processors. I tried to find out what has changed since then in T4. I have used a T4-1 SPARC system with Solaris 10. Results vary slightly for Solaris 11: for Solaris 11, the T4 optimization was implemented in libsoftcrypto.so, while it was in pkcs11_softtoken_extra.so for Solaris 10. An overview of T4 processors is here in this blog. Many thanx to Chi-Chang Lin and Julien for their help.

1. Install Oracle iPlanet Web Server / Oracle Traffic Director. Go to the instance/config directory.

# cd /opt/oracle/webserver7/https-hostname.fqdn/config

2. List the default PKCS#11 modules

# ../../bin/modutil -dbdir . -list
Listing of PKCS #11 Modules
-----------------------------------------------------------
1. NSS Internal PKCS #11 Module
   slots: 2 slots attached
   status: loaded
   slot: NSS Internal Cryptographic Services
   token: NSS Generic Crypto Services
   slot: NSS User Private Key and Certificate Services
   token: NSS Certificate DB
2. Root Certs
   library name: libnssckbi.so
   slots: 1 slot attached
   status: loaded
   slot: NSS Builtin Objects
   token: Builtin Object Token
-----------------------------------------------------------

3. Initialize the soft token data store in the $HOME/.sunw/pkcs11_softtoken/ directory

# pktool setpin keystore=pkcs11
Enter token passphrase: olderpassword
Create new passphrase: password
Re-enter new passphrase: password
Passphrase changed.

4. Offload crypto operations to the Solaris Crypto Framework on T4

$ ../../bin/modutil -dbdir . -nocertdb -add SCF -libfile /usr/lib/libpkcs11.so -mechanisms RSA:AES:SHA1:MD5
Module "SCF" added to database.

Note that -nocertdb means modutil won't try to open the NSS softoken key database; it doesn't even have to be present. The PKCS#11 library used is /usr/lib/libpkcs11.so. If the server is running in 64-bit mode, we have to use /usr/lib/64/libpkcs11.so. Unlike T1 and T2, on T4 we do not have to disable mechanisms in the softtoken provider using cryptoadm.

5. List again to check that the new module SCF has been added

# ../../bin/modutil -dbdir . -list
Listing of PKCS #11 Modules
-----------------------------------------------------------
1. NSS Internal PKCS #11 Module
   slots: 2 slots attached
   status: loaded
   slot: NSS Internal Cryptographic Services
   token: NSS Generic Crypto Services
   slot: NSS User Private Key and Certificate Services
   token: NSS Certificate DB
2. SCF
   library name: /usr/lib/libpkcs11.so
   slots: 2 slots attached
   status: loaded
   slot: Sun Metaslot
   token: Sun Metaslot
   slot: n2rng/0 SUNW_N2_Random_Number_Generator
   token: n2rng/0 SUNW_N2_RNG
3. Root Certs
   library name: libnssckbi.so
   slots: 1 slot attached
   status: loaded
   slot: NSS Builtin Objects
   token: Builtin Object Token
-----------------------------------------------------------

6. Create a certificate in “Sun Metaslot”: I have used certutil, but you must use the Admin Server CLI / GUI

# ../../bin/certutil -S -x -n "Server-Cert" -t "CT,CT,CT" -s "CN=*.fqdn" -d . -h "Sun Metaslot"
Enter Password or Pin for "Sun Metaslot": password

7. Verify that the certificate was created properly in “Sun Metaslot”

# ../../bin/certutil -L -d . -h "Sun Metaslot"
Certificate Nickname                 Trust Attributes
                                     SSL,S/MIME,JAR/XPI
Enter Password or Pin for "Sun Metaslot": password
Sun Metaslot:Server-Cert             CTu,Cu,Cu
#

8. Associate this newly created certificate with the http listener using the Admin CLI/GUI. 
After that, server.xml should have:

<http-listener>
  ...
  <ssl>
    <server-cert-nickname>Sun Metaslot:Server-Cert</server-cert-nickname>
  </ssl>

Note the prefix "Sun Metaslot".

9. Disable PKCS#11 bypass

To use the accelerated AES algorithm, turn off PKCS#11 bypass, and configure modutil to have the AES mechanism go to the Metaslot. After you disable PKCS#11 bypass using the Admin GUI/CLI, check that server.xml has:

<server>
  ....
  <pkcs11>
    <enabled>1</enabled>
    <allow-bypass>0</allow-bypass>
  </pkcs11>

With PKCS#11 bypass enabled, Oracle iPlanet Web Server will only use the RSA capability of the T4, provided the certificate and key are stored in the T4 slot (Metaslot). Actually, the RSA op is never bypassed in NSS; it's always done with PKCS#11 calls, so the bypass settings won't affect the behavior of the probes for RSA at all. The only thing that matters is where the RSA key and certificate live, i.e. which PKCS#11 token, and thus which PKCS#11 module gets called to do the work. If your certificate/key are in the NSS certificate/key db, you will see the libsoftokn3/libfreebl libraries doing the RSA work. If they are in the Sun Metaslot, it should be the Solaris code.

10. Start the server instance

# ../bin/startserv
Oracle iPlanet Web Server 7.0.16 B09/14/2012 03:33
Please enter the PIN for the "Sun Metaslot" token: password
...
info: HTTP3072: http-listener-1: https://hostname.fqdn:80 ready to accept requests
info: CORE3274: successful server startup

11. Figure out which process to run this DTrace script on

# ps -eaf | grep webservd | grep -v dog
webservd 18224 18223 0 13:17:25 ? 0:07 webservd -d /opt/oracle/webserver7/https-hostname.fqdn/config -r /opt/
root     18225 18224 0 13:17:25 ? 0:00 webservd -d /opt/oracle/webserver7/https-hostname.fqdn/config -r /opt/

(For Oracle Traffic Director look for a process named "trafficd".) We see that the child process id is “18225”.

12. Clients for testing: You can use any browser. I used the NSS tool tstclnt for testing.

$ cat > req.txt
GET /index.html HTTP/1.0

For checking both RSA and AES, I used cipher “:0035”, which is TLS_RSA_WITH_AES_256_CBC_SHA.

$ ./tstclnt -h hostname -p 80 -d . -T -f -o -v -c “:0035” < req.txt

13. How do I make sure that the crypto accelerator is being used?

13.1 Create a DTrace script

The following D script should be able to uncover whether the T4-specific crypto routines are being called or not. It also displays stats per second.

# cat > t4crypto.d
#!/usr/sbin/dtrace -s

pid$target::*rsa*:entry,
pid$target::*yf*:entry
{
    @ops[probemod, probefunc] = count();
}

tick-1sec
{
    printa(@ops);
    trunc(@ops);
}

Invoke with './t4crypto.d -p <pid>'

13.2 EXPECTED PROBES FOR Solaris 10: If offloading to the T4 HW is correctly set up, the expected DTrace output would have these probes and libraries:

Library                      Operation  Probes
pkcs11_softtoken_extra.so    RSA        soft_decrypt_rsa_pkcs_decode, soft_encrypt_rsa_pkcs_encode,
                                        soft_rsa_crypt_init_common, soft_rsa_decrypt, soft_rsa_encrypt,
                                        soft_rsa_decrypt_common, soft_rsa_encrypt_common
                             AES        yf_aes_instructions_present, yf_aes_expand256, yf_aes256_cbc_decrypt,
                                        yf_aes256_cbc_encrypt, yf_aes256_load_keys_for_decrypt,
                                        yf_aes256_load_keys_for_encrypt
                                        (Note that these are for 256; the same applies for 128 and 192.
                                        These are for cbc; the same applies for ecb, ctr and cfb128.)
                             DES        yf_des_expand, yf_des_instructions_present, yf_des_encrypt
libmd_psr.so                 MD5        yf_md5_multiblock, yf_md5_instruction_present
                             SHA1       yf_sha1_instruction_present, yf_sha1_multiblock

13.3 SAMPLE OUTPUT FOR CIPHER TLS_RSA_WITH_AES_256_CBC_SHA (0x0035) ON T4 SPARC SOLARIS 10 WITHOUT PKCS#11 BYPASS

# ./t4crypto.d -p 18225
pkcs11_softtoken_extra.so.1   soft_decrypt_rsa_pkcs_decode        1
pkcs11_softtoken_extra.so.1   soft_rsa_crypt_init_common          1
pkcs11_softtoken_extra.so.1   soft_rsa_decrypt                    1
pkcs11_softtoken_extra.so.1   big_mp_mul_yf                       2
pkcs11_softtoken_extra.so.1   mpm_yf_mpmul                        2
pkcs11_softtoken_extra.so.1   mpmul_arr_yf                        2
pkcs11_softtoken_extra.so.1   rijndael_key_setup_enc_yf           2
pkcs11_softtoken_extra.so.1   soft_rsa_decrypt_common             2
pkcs11_softtoken_extra.so.1   yf_aes_expand256                    2
pkcs11_softtoken_extra.so.1   yf_aes256_cbc_decrypt               3
pkcs11_softtoken_extra.so.1   yf_aes256_load_keys_for_decrypt     3
pkcs11_softtoken_extra.so.1   big_mont_mul_yf                     6
pkcs11_softtoken_extra.so.1   mm_yf_montmul                       6
pkcs11_softtoken_extra.so.1   yf_des_instructions_present         6
pkcs11_softtoken_extra.so.1   yf_aes256_cbc_encrypt               8
pkcs11_softtoken_extra.so.1   yf_aes256_load_keys_for_encrypt     8
pkcs11_softtoken_extra.so.1   yf_mpmul_present                    8
pkcs11_softtoken_extra.so.1   yf_aes_instructions_present        13
pkcs11_softtoken_extra.so.1   yf_des_encrypt                     18
libmd_psr.so.1                yf_md5_multiblock                  41
libmd_psr.so.1                yf_md5_instruction_present         72
libmd_psr.so.1                yf_sha1_instruction_present        82
libmd_psr.so.1                yf_sha1_multiblock                 82

This indicates that both the RSA and AES ops are done in the Solaris Crypto Framework.

13.4 SAMPLE OUTPUT FOR CIPHER TLS_RSA_WITH_AES_256_CBC_SHA (0x0035) ON T4 SPARC SOLARIS 10 WITH PKCS#11 BYPASS

# ./t4crypto.d -p 18225
pkcs11_softtoken_extra.so.1   soft_decrypt_rsa_pkcs_decode   1
pkcs11_softtoken_extra.so.1   soft_rsa_crypt_init_common     1
pkcs11_softtoken_extra.so.1   soft_rsa_decrypt               1
pkcs11_softtoken_extra.so.1   soft_rsa_decrypt_common        1
pkcs11_softtoken_extra.so.1   big_mp_mul_yf                  2
pkcs11_softtoken_extra.so.1   mpm_yf_mpmul                   2
pkcs11_softtoken_extra.so.1   mpmul_arr_yf                   2
pkcs11_softtoken_extra.so.1   big_mont_mul_yf                6
pkcs11_softtoken_extra.so.1   mm_yf_montmul                  6
pkcs11_softtoken_extra.so.1   yf_mpmul_present               8

For this cipher, when I enable PKCS#11 bypass, only the RSA probes are being hit; the AES probes are not being hit.

13.5 ustack() for RSA operations / probefunc == "soft_rsa_decrypt" /

This shows that libnss3.so is calling C_* functions of libpkcs11.so, which is calling functions of pkcs11_softtoken_extra.so, for both cases, with and without bypass. 
When PKCS#11 bypass is disabled (allow-bypass is 0) pkcs11_softtoken_extra.so.1`soft_rsa_decrypt pkcs11_softtoken_extra.so.1`soft_rsa_decrypt_common+0x94 pkcs11_softtoken_extra.so.1`soft_unwrapkey+0x258 pkcs11_softtoken_extra.so.1`C_UnwrapKey+0x1ec libpkcs11.so.1`meta_unwrap_key+0x17c libpkcs11.so.1`meta_UnwrapKey+0xc4 libpkcs11.so.1`C_UnwrapKey+0xfc libnss3.so`pk11_AnyUnwrapKey+0x6b8 libnss3.so`PK11_PubUnwrapSymKey+0x8c libssl3.so`ssl3_HandleRSAClientKeyExchange+0x1a0 libssl3.so`ssl3_HandleClientKeyExchange+0x154 libssl3.so`ssl3_HandleHandshakeMessage+0x440 libssl3.so`ssl3_HandleHandshake+0x11c libssl3.so`ssl3_HandleRecord+0x5e8 libssl3.so`ssl3_GatherCompleteHandshake+0x5c libssl3.so`ssl_GatherRecord1stHandshake+0x30 libssl3.so`ssl_Do1stHandshake+0xec libssl3.so`ssl_SecureRecv+0x1c8 libssl3.so`ssl_Recv+0x9c libns-httpd40.so`__1cNDaemonSessionDrun6M_v_+0x2dc When PKCS#11 bypass is enabled (allow-bypass is 1) pkcs11_softtoken_extra.so.1`soft_rsa_decrypt pkcs11_softtoken_extra.so.1`soft_rsa_decrypt_common+0x94 pkcs11_softtoken_extra.so.1`C_Decrypt+0x164 libpkcs11.so.1`meta_do_operation+0x27c libpkcs11.so.1`meta_Decrypt+0x4c libpkcs11.so.1`C_Decrypt+0xcc libnss3.so`PK11_PrivDecryptPKCS1+0x1ac libssl3.so`ssl3_HandleRSAClientKeyExchange+0xe4 libssl3.so`ssl3_HandleClientKeyExchange+0x154 libssl3.so`ssl3_HandleHandshakeMessage+0x440 libssl3.so`ssl3_HandleHandshake+0x11c libssl3.so`ssl3_HandleRecord+0x5e8 libssl3.so`ssl3_GatherCompleteHandshake+0x5c libssl3.so`ssl_GatherRecord1stHandshake+0x30 libssl3.so`ssl_Do1stHandshake+0xec libssl3.so`ssl_SecureRecv+0x1c8 libssl3.so`ssl_Recv+0x9c libns-httpd40.so`__1cNDaemonSessionDrun6M_v_+0x2dc libnsprwrap.so`ThreadMain+0x1c libnspr4.so`_pt_root+0xe8 13.6 ustack() FOR AES operations / probefunc == "yf_aes256_cbc_encrypt" / When PKCS#11 bypass is disabled (allow-bypass is 0) pkcs11_softtoken_extra.so.1`yf_aes256_cbc_encrypt pkcs11_softtoken_extra.so.1`aes_block_process_contiguous_whole_blocks+0xb4 pkcs11_softtoken_extra.so.1`aes_crypt_contiguous_blocks+0x1cc pkcs11_softtoken_extra.so.1`soft_aes_encrypt_common+0x22c pkcs11_softtoken_extra.so.1`C_EncryptUpdate+0x10c libpkcs11.so.1`meta_do_operation+0x1fc libpkcs11.so.1`meta_EncryptUpdate+0x4c libpkcs11.so.1`C_EncryptUpdate+0xcc libnss3.so`PK11_CipherOp+0x1a0 libssl3.so`ssl3_CompressMACEncryptRecord+0x264 libssl3.so`ssl3_SendRecord+0x300 libssl3.so`ssl3_FlushHandshake+0x54 libssl3.so`ssl3_SendFinished+0x1fc libssl3.so`ssl3_HandleFinished+0x314 libssl3.so`ssl3_HandleHandshakeMessage+0x4ac libssl3.so`ssl3_HandleHandshake+0x11c libssl3.so`ssl3_HandleRecord+0x5e8 libssl3.so`ssl3_GatherCompleteHandshake+0x5c libssl3.so`ssl_GatherRecord1stHandshake+0x30 libssl3.so`ssl_Do1stHandshake+0xec Shows that libnss3.so is calling C_* functions of libpkcs11.so which is calling functions of pkcs11_softtoken_extra.so However when PKCS#11 bypass is disabled (allow-bypass is 1) this stack isn't getting called. 14. LIST OF ALL THE PROBES MATCHED BY D SCRIPT FOR REFERENCE # ./t4crypto.d -p 18225 -l ID PROVIDER MODULE FUNCTION NAME ... 
55720 pid18225 libmd_psr.so.1 yf_md5_instruction_present entry 55721 pid18225 libmd_psr.so.1 yf_sha256_instruction_present entry 55722 pid18225 libmd_psr.so.1 yf_sha512_instruction_present entry 55723 pid18225 libmd_psr.so.1 yf_sha1_instruction_present entry 55724 pid18225 libmd_psr.so.1 yf_sha256 entry 55725 pid18225 libmd_psr.so.1 yf_sha256_multiblock entry 55726 pid18225 libmd_psr.so.1 yf_sha512 entry 55727 pid18225 libmd_psr.so.1 yf_sha512_multiblock entry 55728 pid18225 libmd_psr.so.1 yf_sha1 entry 55729 pid18225 libmd_psr.so.1 yf_sha1_multiblock entry 55730 pid18225 libmd_psr.so.1 yf_md5 entry 55731 pid18225 libmd_psr.so.1 yf_md5_multiblock entry 55732 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_instructions_present entry 55733 pid18225 pkcs11_softtoken_extra.so.1 rijndael_key_setup_enc_yf entry 55734 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_expand128 entry 55735 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_encrypt128 entry 55736 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_decrypt128 entry 55737 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_expand192 entry 55738 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_encrypt192 entry 55739 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_decrypt192 entry 55740 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_expand256 entry 55741 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_encrypt256 entry 55742 pid18225 pkcs11_softtoken_extra.so.1 yf_aes_decrypt256 entry 55743 pid18225 pkcs11_softtoken_extra.so.1 yf_aes128_load_keys_for_encrypt entry 55744 pid18225 pkcs11_softtoken_extra.so.1 yf_aes192_load_keys_for_encrypt entry 55745 pid18225 pkcs11_softtoken_extra.so.1 yf_aes256_load_keys_for_encrypt entry 55746 pid18225 pkcs11_softtoken_extra.so.1 yf_aes128_ecb_encrypt entry 55747 pid18225 pkcs11_softtoken_extra.so.1 yf_aes192_ecb_encrypt entry 55748 pid18225 pkcs11_softtoken_extra.so.1 yf_aes256_ecb_encrypt entry 55749 pid18225 pkcs11_softtoken_extra.so.1 yf_aes128_cbc_encrypt entry 55750 pid18225 pkcs11_softtoken_extra.so.1 yf_aes192_cbc_encrypt entry 55751 pid18225 pkcs11_softtoken_extra.so.1 yf_aes256_cbc_encrypt entry 55752 pid18225 pkcs11_softtoken_extra.so.1 yf_aes128_ctr_crypt entry 55753 pid18225 pkcs11_softtoken_extra.so.1 yf_aes192_ctr_crypt entry 55754 pid18225 pkcs11_softtoken_extra.so.1 yf_aes256_ctr_crypt entry 55755 pid18225 pkcs11_softtoken_extra.so.1 yf_aes128_cfb128_encrypt entry 55756 pid18225 pkcs11_softtoken_extra.so.1 yf_aes192_cfb128_encrypt entry 55757 pid18225 pkcs11_softtoken_extra.so.1 yf_aes256_cfb128_encrypt entry 55758 pid18225 pkcs11_softtoken_extra.so.1 yf_aes128_load_keys_for_decrypt entry 55759 pid18225 pkcs11_softtoken_extra.so.1 yf_aes192_load_keys_for_decrypt entry 55760 pid18225 pkcs11_softtoken_extra.so.1 yf_aes256_load_keys_for_decrypt entry 55761 pid18225 pkcs11_softtoken_extra.so.1 yf_aes128_ecb_decrypt entry 55762 pid18225 pkcs11_softtoken_extra.so.1 yf_aes192_ecb_decrypt entry 55763 pid18225 pkcs11_softtoken_extra.so.1 yf_aes256_ecb_decrypt entry 55764 pid18225 pkcs11_softtoken_extra.so.1 yf_aes128_cbc_decrypt entry 55765 pid18225 pkcs11_softtoken_extra.so.1 yf_aes192_cbc_decrypt entry 55766 pid18225 pkcs11_softtoken_extra.so.1 yf_aes256_cbc_decrypt entry 55767 pid18225 pkcs11_softtoken_extra.so.1 yf_aes128_cfb128_decrypt entry 55768 pid18225 pkcs11_softtoken_extra.so.1 yf_aes192_cfb128_decrypt entry 55769 pid18225 pkcs11_softtoken_extra.so.1 yf_aes256_cfb128_decrypt entry 55771 pid18225 pkcs11_softtoken_extra.so.1 yf_des_instructions_present entry 55772 pid18225 pkcs11_softtoken_extra.so.1 yf_des_expand entry 55773 
pid18225 pkcs11_softtoken_extra.so.1 yf_des_encrypt entry 55774 pid18225 pkcs11_softtoken_extra.so.1 yf_mpmul_present entry 55775 pid18225 pkcs11_softtoken_extra.so.1 yf_montmul_present entry 55776 pid18225 pkcs11_softtoken_extra.so.1 mm_yf_montmul entry 55777 pid18225 pkcs11_softtoken_extra.so.1 mm_yf_montsqr entry 55778 pid18225 pkcs11_softtoken_extra.so.1 mm_yf_restore_func entry 55779 pid18225 pkcs11_softtoken_extra.so.1 mm_yf_ret_from_mont_func entry 55780 pid18225 pkcs11_softtoken_extra.so.1 mm_yf_execute_slp entry 55781 pid18225 pkcs11_softtoken_extra.so.1 big_modexp_ncp_yf entry 55782 pid18225 pkcs11_softtoken_extra.so.1 big_mont_mul_yf entry 55783 pid18225 pkcs11_softtoken_extra.so.1 mpmul_arr_yf entry 55784 pid18225 pkcs11_softtoken_extra.so.1 big_mp_mul_yf entry 55785 pid18225 pkcs11_softtoken_extra.so.1 mpm_yf_mpmul entry 55786 pid18225 libns-httpd40.so nsapi_rsa_set_priv_fn entry ... 55795 pid18225 libnss3.so prepare_rsa_priv_key_export_for_asn1 entry 55796 pid18225 libresolv.so.2 sunw_dst_rsaref_init entry 55797 pid18225 libnssutil3.so NSS_Get_SEC_UniversalStringTemplate entry ... 55813 pid18225 libsoftokn3.so prepare_low_rsa_priv_key_for_asn1 entry 55814 pid18225 libsoftokn3.so rsa_FormatOneBlock entry 55815 pid18225 libsoftokn3.so rsa_FormatBlock entry 55816 pid18225 libnssdbm3.so lg_prepare_low_rsa_priv_key_for_asn1 entry 55817 pid18225 libfreebl_32fpu_3.so rsa_build_from_primes entry 55818 pid18225 libfreebl_32fpu_3.so rsa_is_prime entry 55819 pid18225 libfreebl_32fpu_3.so rsa_get_primes_from_exponents entry 55820 pid18225 libfreebl_32fpu_3.so rsa_PrivateKeyOpNoCRT entry 55821 pid18225 libfreebl_32fpu_3.so rsa_PrivateKeyOpCRTNoCheck entry 55822 pid18225 libfreebl_32fpu_3.so rsa_PrivateKeyOpCRTCheckedPubKey entry 55823 pid18225 pkcs11_kernel.so.1 key_gen_rsa_by_value entry 55824 pid18225 pkcs11_kernel.so.1 get_rsa_private_key entry 55825 pid18225 pkcs11_kernel.so.1 get_rsa_public_key entry 55826 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_encrypt entry 55827 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_decrypt entry 55828 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_crypt_init_common entry 55829 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_encrypt_common entry 55830 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_decrypt_common entry 55831 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_sign_verify_init_common entry 55832 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_sign_common entry 55833 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_verify_common entry 55834 pid18225 pkcs11_softtoken_extra.so.1 generate_rsa_key entry 55835 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_genkey_pair entry 55836 pid18225 pkcs11_softtoken_extra.so.1 get_rsa_sha1_prefix entry 55837 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_digest_sign_common entry 55838 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_digest_verify_common entry 55839 pid18225 pkcs11_softtoken_extra.so.1 soft_rsa_verify_recover entry 55840 pid18225 pkcs11_softtoken_extra.so.1 rsa_pri_to_asn1 entry 55841 pid18225 pkcs11_softtoken_extra.so.1 asn1_to_rsa_pri entry 55842 pid18225 pkcs11_softtoken_extra.so.1 soft_encrypt_rsa_pkcs_encode entry 55843 pid18225 pkcs11_softtoken_extra.so.1 soft_decrypt_rsa_pkcs_decode entry 55844 pid18225 pkcs11_softtoken_extra.so.1 soft_sign_rsa_pkcs_encode entry 55845 pid18225 pkcs11_softtoken_extra.so.1 soft_verify_rsa_pkcs_decode entry 55770 profile tick-1sec
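Step 12 above uses the NSS tstclnt tool to drive the handshake with cipher “:0035”. Any TLS client that can be pinned to that suite works just as well for exercising the probes; the following Java sketch is one such alternative, with the host name, port, and trust setup left as assumptions (the article's self-signed Server-Cert would additionally have to be trusted by the client for the handshake to succeed).

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

// Hypothetical test client: force TLS_RSA_WITH_AES_256_CBC_SHA (the 0x0035 suite used
// with tstclnt above), send one HTTP/1.0 request, and report the negotiated suite.
// Host and port are placeholders; trust of the server certificate is assumed to be
// configured separately (for example via a truststore) and is omitted here.
public class CipherTestClient {
    public static void main(String[] args) throws Exception {
        SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();
        try (SSLSocket socket = (SSLSocket) factory.createSocket("hostname.fqdn", 443)) {
            socket.setEnabledCipherSuites(new String[] { "TLS_RSA_WITH_AES_256_CBC_SHA" });
            socket.startHandshake();
            System.out.println("Negotiated: " + socket.getSession().getCipherSuite());

            PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
            out.print("GET /index.html HTTP/1.0\r\n\r\n");   // same request as req.txt above
            out.flush();

            BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
            for (String line; (line = in.readLine()) != null; ) {
                System.out.println(line);
            }
        }
    }
}

While such a client runs, the D script from step 13.1 attached to the webservd child process should show the same RSA and yf_aes256 probes firing as in the sample output of section 13.3.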

    Read the article

  • CodePlex Daily Summary for Wednesday, May 26, 2010

    CodePlex Daily Summary for Wednesday, May 26, 2010New Projects3D File Manager: 3D File manager is an application that aims to show how could look file manager in 3D. It´s developed in C# and XNA frameworkAcies: Acies is a dungeon crawler game done with C# and XNA.ActiveWinery: The open source winery and vineyard application.CC.Yacht: CC.Yacht is a client/server yacht dice game written in C# .NET. It utilizes a net.tcp WCF duplex service for client/server communication.Community Forums NNTP bridge: Community project for accessing the MS Web-Forums via an open source NNTP newsserver (bridge).Dojo Timer: WPF timer for Coding Dojo meetings. Timer feito em WPF para Coding Dojo.GameFX - The Game Development Framework: The Game Development Framework (GameFX) is simply a set of libraries to be used as the foundation for any simple 2D tile-based game. It can be used...Greg Roberts MVC Extensions: Asp.Net MVC Extensions including JSONP ActionResult. Targeted for MVC 2 and .NET 4.0.IIS Deploy: Project to develop a tool that automates the deploy Web sites and WCF services in single server environments and clustered.MarkLogic Sample Authoring App for Word: The MarkLogic Authoring Sample App for Word lets authors enrich Word documents using Content Controls, associate and manage metadata with those Con...Mono.Addins: Mono.Addins is a framework for creating extensible applications, and for creating add-ins which extend applications.MPCLI: MPCLI is a library that brings the power of the GNU MP big numbers library to those who use CLS-compliant languages such as C#, F#, and Visual Basi...NTFS parser classes: This is a C++ library to help parsing an NTFS volume, as well as file records and attributes. It will facilitate much when handling NTFS filesystem...Oddworld Level Gen: A 2D platform game, with Oddworld : Abe's Oddysee asset. The game introduce a dynamic system to generate the next level according to the previous l...Page Action Web Part for SharePoint 2007: This Web Part for SharePoint 2007 allows you to perform actions (such as causing an "Access is denied", redirect to another web page, view content ...Piggy Bank: Piggy Bank is a web-based financial application targetted towards kids.Productivity Hub Solutions: The Productivity Hub 2010 is a customizable, on-premise training solution for technology products. Developed by RedTech for Microsoft, the Producti...PyQt port of TortoiseHg: PyQt port of TortoiseHg (aka TortoiseHg 2.0)Releaser™: This is my private project. Currently, I'm not going to support it publicly.SLManagers: SLManagers 用于动态加载组件 实现对程序不同的的管理Smith Async .NET Memcached Client: Async .NET Memcached Client is a fully asynchronous implementation of a memcached client. The advantage of a fully asynchronous client is that you...Tauck Public API: Tauck's public API allows for travel agencies and other parterners to use Tauck's product information in their websites and other systems. Virtualizing WrapPanel: Virtualizing WrapPanel improves performance when binding a ListBox/ListView to a large amount of data. It is written in C#New Releases3D File Manager: 3D File manager: 3D File managerAragon Online Client: Aragon Online Client: The executable version of the Aragon Online Client can be installed from the Aragon Online page: http://aragon-online.net/aoclient/publish.phpASP.NET MVC CMS ( Using CommonLibrary.NET ): CommonLibrary CMS Alpha 2: CommonLibrary CMSA simple yet powerful CMS system in ASP.NET MVC 2 using C# 3.5. 
ActiveRecord based support for Blogs, Widgets, Pages, Parts, Ev...BFBC2 PRoCon: PRoCon 0.5.1.8: It's not even funny anymore =\Code for Rapid C# Windows Development eBook: LLBLGen LINQPad Data Context Driver Ver 1.0.0.3: Second release of a Static LLBLGen Pro Data Context Driver for LINQPad For LLBLGen Pro versions 2.6 and 3.0 beta. Fixed 'connection string not ini...Community Forums NNTP bridge: Community Forums NNTP Bridge V01: This is the first release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open sourcen NNTP Bridge.Community Forums NNTP bridge: Community Forums NNTP Bridge V02: This is the second release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open sourcen NNTP Bridge. ...DDDSample.Net: 0.9: Release 0.9 contains two major improvements: Vanilla version (both Synch and Asynch) has been updated so its model more closely resembles Java orig...DEWD: DEWD for Umbraco: Alpha release of the package. Usable for simple SQL editing, but lacking some core features such as validation, user friendly error handling, confi...Dojo Timer: Dojo Timer v1: Primeira versão Dojo Timer.eXpress Persistent Objects (XPO) Toolkit: Samples: Video Channel Channel.zip sample shows how to build a video site using XPO and WCF Data Services. DevExpress Channel DevExpress Channel Browse ...F# Project Extender: V0.9.2.1 (VS2008,VS2010): F# project extender for Visual Studio 2008 and Visual Studio 2010. Fixed bugs: -Project extender 0.9.2.0 can't be loaded in VS2008 without SDKFeedback Form: Feedback Application: Installer of the projectFeedback Form: Feedback Form: .sln for Feedback Form ApplicationGameFX - The Game Development Framework: Version 1.0 (Beta): Project is Visual Studio 2008 solution. GameFX Source code and sample program. The sample program allows you to create maps of any size, and drop ...MarkLogic Sample Authoring App for Word: MarkLogic Sample Authoring App for Word 1.0-1: Initial release of the MarkLogic Sample Authoring App for Word. See the home page for an overview on functionality. Within the release you'll ...MarkLogic Toolkit for Word: MarkLogic Toolkit for Word 1.2-1: Release built in support of the MarkLogic Sample Authoring App for Word. Updates include: update to XQuery API to expose functions for working w...Microsoft SQL Server Community & Samples: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release contains sample code for Microsoft SQL Server 2008R2. For many of these samples you will also need...Microsoft SQL Server Product Samples: Analysis Services: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Data Programming: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Database: AdventureWorks 2008R2 RTM: Sample Databases for Microsoft SQL Server 2008R2 (RTM)This release is dedicated to the sample databases that ship for Microsoft SQL Server 2008R2. ...Microsoft SQL Server Product Samples: End to End: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. 
For many of these samples y...Microsoft SQL Server Product Samples: Engine: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Integration Services: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Replication: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Reporting Services: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Scripts: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Service Broker: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: XML: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...NLog - Advanced .NET Logging: Nightly Build 2010.05.25.001: Changes since the last build:2010-05-24 23:08:47 Jarek Kowalski Fixed base constructor invocation to ensure consistency. Added tests for common wra...NTFS parser classes: NTFS parser lib 0.55: 0.55openrs: Revision 3: Things that have been added since last release: Vector expanding Dynamic vectors Vector put method chaining Basic ISAAC implementation Wor...Page Action Web Part for SharePoint 2007: Page Action Web Part v1.0.0.0: First release of the Page Action Web Part v1.0.0.0.Productivity Hub Solutions: Silverlight Bookshelf: The Silverlight Bookshelf component of the 2010 Productivity Hub provides 4 accordion-style vertical tabs dispalying Featured Video, Featured Conte...Productivity Hub Solutions: Silverlight Product Carousel: The Product Carousel Silverlight component provides a rich navigation experience to the home page of the 2010 Productivity Hub - presenting the pro...Rawr: Rawr 2.3.18: >Rawr3 Public Beta has been released! Click here for details.< - Fix for bug in parsing characters with certain abnormal characters in their data. 
...Runtime Intelligence Data Visualizer: RI Data Visualizer Release 1: This release of the RI Data Visualizer contains both a WPF client that displays application usage data and a Silverlight client that displays featu...sGSHOPedit: sGSHOPedit v1.1a: Fixed: bug in parsing description from "itemextdesc.txt" Fixed: surface change event Fixed: range for numeric values Added: search featureSLManagers: SlManagers: 实现简单的组件动态下载 使用Mef技术Sudoku (Multiplayer in RnD): Sudoku (Multiplayer in RnD) 1.0.1.0 program: Sudoku project was to practice on C# by making a desktop application using some algorithm Idea: The basic idea of algorithm is from http://www.ac...Sudoku (Multiplayer in RnD): Sudoku (Multiplayer in RnD) 1.0.1.0 source: user-interface, multi-threading, formatting Sudoku project was to practice on C# by making a desktop application using some algorithm Idea: The...Tauck Public API: XML Package 1.0: Current Release of XML dataTeach.Net: Teach.Net 1.0 Alpha: First alpha version. It should work, but there's gonna be bugs. Also, no intellisense documentation (or any other sort of documentation) yet. I'm w...VCC: Latest build, v2.1.30525.0: Automatic drop of latest buildVista Media Center TCP/IP Controller: Win7 64 and 32 bit Alpha - button command fix: button command fix , button-play, button-pause, button-skip back, button-skip fwd. Confirmed works on x64. Has not been tested on x32XsltDb - DotNetNuke Module Universal Building Block: 01.01.21: ASP.NET controls TreeView and TextEditor usage Live demo site Attention This release requires DNN 5.2 or higher as it using Telerik classes.in...Most Popular ProjectsRawrWBFS ManagerAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight ToolkitWindows Presentation Foundation (WPF)patterns & practices – Enterprise LibraryMicrosoft SQL Server Community & SamplesPHPExcelASP.NETMost Active ProjectsAStar.netpatterns & practices – Enterprise Librarypatterns & practices: Windows Azure Security GuidanceRawrSqlServerExtensionsMono.AddinsBlogEngine.NETGMap.NET - Great Maps for Windows Forms & PresentationCodeReviewCaliburn: An Application Framework for WPF and Silverlight

    Read the article

  • Generating EF Code First model classes from an existing database

    - by Jon Galloway
    Entity Framework Code First is a lightweight way to "turn on" data access for a simple CLR class. As the name implies, the intended use is that you're writing the code first and thinking about the database later. However, I really like the way Entity Framework Code First works, and I want to use it in existing projects and projects with pre-existing databases. For example, MVC Music Store comes with a SQL Express database that's pre-loaded with a catalog of music (including genres, artists, and songs), and while it may eventually make sense to load that seed data from a different source, for the MVC 3 release we wanted to keep using the existing database. While I'm not getting the full benefit of Code First - writing code which drives the database schema - I can still benefit from the simplicity of the lightweight code approach. Scott Guthrie blogged about how to use Entity Framework with an existing database, looking at how you can override the Entity Framework Code First conventions so that it can work with a database which was created following other conventions. That gives you the information you need to create the model classes manually. However, it turns out that with Entity Framework 4 CTP 5, there's a way to generate the model classes from the database schema. Once the grunt work is done, of course, you can go in and modify the model classes as you'd like, but you can save the time and frustration of figuring out things like mapping SQL database types to .NET types. Note that this template requires Entity Framework 4 CTP 5 or later. You can install EF 4 CTP 5 here. Step One: Generate an EF Model from your existing database The code generation system in Entity Framework works from a model. You can add a model to your existing project and delete it when you're done, but I think it's simpler to just spin up a separate project to generate the model classes. When you're done, you can delete the project without affecting your application, or you may choose to keep it around in case you have other database schema updates which require model changes. I chose to add the Model classes to the Models folder of a new MVC 3 application. Right-click the folder and select "Add / New Item..."   Next, select ADO.NET Entity Data Model from the Data Templates list, and name it whatever you want (the name is unimportant).   Next, select "Generate from database." This is important - it's what kicks off the next few steps, which read your database's schema.   Now it's time to point the Entity Data Model Wizard at your existing database. I'll assume you know how to find your database - if not, I covered that a bit in the MVC Music Store tutorial section on Models and Data. Select your database, uncheck the "Save entity connection settings in Web.config" (since we won't be using them within the application), and click Next.   Now you can select the database objects you'd like modeled. I just selected all tables and clicked Finish.   And there's your model. If you want, you can make additional changes here before going on to generate the code.   Step Two: Add the DbContext Generator Like most code generation systems in Visual Studio lately, Entity Framework uses T4 templates which allow for some control over how the code is generated. K Scott Allen wrote a detailed article on T4 Templates and the Entity Framework on MSDN recently, if you'd like to know more. Fortunately for us, there's already a template that does just what we need without any customization. 
Right-click a blank space in the Entity Framework model surface and select "Add Code Generation Item..." Select the Code group in the Installed Templates section and pick the ADO.NET DbContext Generator. If you don't see this listed, make sure you've got EF 4 CTP 5 installed and that you're looking at the Code templates group. Note that the DbContext Generator template is similar to the EF POCO template which came out last year, but with "fix up" code (unnecessary in EF Code First) removed.   As soon as you do this, you'll see two terrifying Security Warnings - unless you click the "Do not show this message again" checkbox the first time. It will also be displayed (twice) every time you rebuild the project, so I checked the box and no immediate harm befell my computer (fingers crossed!).   Here's the payoff: two templates (filenames ending with .tt) have been added to the project, and they've generated the code I needed.   The "MusicStoreEntities.Context.tt" template built a DbContext class which holds the entity collections, and the "MusicStoreEntities.tt" template builds a separate class for each table I selected earlier. We'll customize them in the next step. I recommend copying all the generated .cs files into your application at this point, since accidentally rebuilding the generation project will overwrite your changes if you leave them there. Step Three: Modify and use your POCO entity classes Note: I made a bunch of tweaks to my POCO classes after they were generated. You don't have to do any of this, but I think it's important that you can - they're your classes, and EF Code First respects that. Modify them as you need for your application, or don't. The Context class derives from DbContext, which is what turns on the EF Code First features. It holds a DbSet for each entity. Think of DbSet as a simple List, but with Entity Framework features turned on.   //------------------------------------------------------------------------------ // <auto-generated> // This code was generated from a template. // // Changes to this file may cause incorrect behavior and will be lost if // the code is regenerated. // </auto-generated> //------------------------------------------------------------------------------ namespace EF_CodeFirst_From_Existing_Database.Models { using System; using System.Data.Entity; public partial class Entities : DbContext { public Entities() : base("name=Entities") { } public DbSet<Album> Albums { get; set; } public DbSet<Artist> Artists { get; set; } public DbSet<Cart> Carts { get; set; } public DbSet<Genre> Genres { get; set; } public DbSet<OrderDetail> OrderDetails { get; set; } public DbSet<Order> Orders { get; set; } } } It's a pretty lightweight class as generated, so I just took out the comments, set the namespace, removed the constructor, and formatted it a bit. Done. If I wanted, though, I could have added or removed DbSets, overridden conventions, etc. using System.Data.Entity; namespace MvcMusicStore.Models { public class MusicStoreEntities : DbContext { public DbSet<Album> Albums { get; set; } public DbSet<Genre> Genres { get; set; } public DbSet<Artist> Artists { get; set; } public DbSet<Cart> Carts { get; set; } public DbSet<Order> Orders { get; set; } public DbSet<OrderDetail> OrderDetails { get; set; } } } Next, it's time to look at the individual classes. Some of mine were pretty simple - for the Cart class, I just needed to remove the header and clean up the namespace. //------------------------------------------------------------------------------ // // This code was generated from a template. 
// // Changes to this file may cause incorrect behavior and will be lost if // the code is regenerated. // //------------------------------------------------------------------------------ namespace EF_CodeFirst_From_Existing_Database.Models { using System; using System.Collections.Generic; public partial class Cart { // Primitive properties public int RecordId { get; set; } public string CartId { get; set; } public int AlbumId { get; set; } public int Count { get; set; } public System.DateTime DateCreated { get; set; } // Navigation properties public virtual Album Album { get; set; } } } I did a bit more customization on the Album class. Here's what was generated: //------------------------------------------------------------------------------ // // This code was generated from a template. // // Changes to this file may cause incorrect behavior and will be lost if // the code is regenerated. // //------------------------------------------------------------------------------ namespace EF_CodeFirst_From_Existing_Database.Models { using System; using System.Collections.Generic; public partial class Album { public Album() { this.Carts = new HashSet<Cart>(); this.OrderDetails = new HashSet<OrderDetail>(); } // Primitive properties public int AlbumId { get; set; } public int GenreId { get; set; } public int ArtistId { get; set; } public string Title { get; set; } public decimal Price { get; set; } public string AlbumArtUrl { get; set; } // Navigation properties public virtual Artist Artist { get; set; } public virtual Genre Genre { get; set; } public virtual ICollection<Cart> Carts { get; set; } public virtual ICollection<OrderDetail> OrderDetails { get; set; } } } I removed the header, changed the namespace, and removed some of the navigation properties. One nice thing about EF Code First is that you don't have to have a property for each database column or foreign key. In the Music Store sample, for instance, we build the app up using code first and start with just a few columns, adding in fields and navigation properties as the application needs them. EF Code First handles the columns we've told it about and doesn't complain about the others. Here's the basic class: using System.ComponentModel; using System.ComponentModel.DataAnnotations; using System.Web.Mvc; using System.Collections.Generic; namespace MvcMusicStore.Models { public class Album { public int AlbumId { get; set; } public int GenreId { get; set; } public int ArtistId { get; set; } public string Title { get; set; } public decimal Price { get; set; } public string AlbumArtUrl { get; set; } public virtual Genre Genre { get; set; } public virtual Artist Artist { get; set; } public virtual List<OrderDetail> OrderDetails { get; set; } } } It's my class, not Entity Framework's, so I'm free to do what I want with it. 
I added a bunch of MVC 3 annotations for scaffolding and validation support, as shown below: using System.ComponentModel; using System.ComponentModel.DataAnnotations; using System.Web.Mvc; using System.Collections.Generic; namespace MvcMusicStore.Models { [Bind(Exclude = "AlbumId")] public class Album { [ScaffoldColumn(false)] public int AlbumId { get; set; } [DisplayName("Genre")] public int GenreId { get; set; } [DisplayName("Artist")] public int ArtistId { get; set; } [Required(ErrorMessage = "An Album Title is required")] [StringLength(160)] public string Title { get; set; } [Required(ErrorMessage = "Price is required")] [Range(0.01, 100.00, ErrorMessage = "Price must be between 0.01 and 100.00")] public decimal Price { get; set; } [DisplayName("Album Art URL")] [StringLength(1024)] public string AlbumArtUrl { get; set; } public virtual Genre Genre { get; set; } public virtual Artist Artist { get; set; } public virtual List<OrderDetail> OrderDetails { get; set; } } } The end result was that I had working EF Code First model code for the finished application. You can follow along through the tutorial to see how I built up to the finished model classes, starting with simple 2-3 property classes and building up to the full working schema. Thanks to Diego Vega (on the Entity Framework team) for pointing me to the DbContext template.
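    One thing the walkthrough mentions but doesn't show is what overriding the Code First conventions looks like when the existing schema doesn't match the defaults. The following is only a rough sketch of that idea: it assumes the EF 4.1-style DbModelBuilder fluent API (which may differ slightly from the exact CTP 5 bits used above), and the singular "Album" table name and key mapping are hypothetical examples, not taken from the Music Store database.

        using System.Data.Entity;

        namespace MvcMusicStore.Models
        {
            public class MusicStoreEntities : DbContext
            {
                public DbSet<Album> Albums { get; set; }
                public DbSet<Genre> Genres { get; set; }

                // Override Code First conventions where the existing schema differs
                // from the defaults (table names, keys, column lengths, etc.).
                protected override void OnModelCreating(DbModelBuilder modelBuilder)
                {
                    // Hypothetical: the legacy table is singular "Album", not "Albums".
                    modelBuilder.Entity<Album>().ToTable("Album");

                    // Hypothetical: point Code First at a non-conventional key column.
                    modelBuilder.Entity<Album>().HasKey(a => a.AlbumId);

                    // Keep the model's string length in line with the existing column.
                    modelBuilder.Entity<Album>().Property(a => a.Title).HasMaxLength(160);
                }
            }
        }

    If the existing tables already line up with the default conventions, as they mostly do in the Music Store sample, none of these overrides are necessary.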

    Read the article

  • CodePlex Daily Summary for Saturday, March 17, 2012

    CodePlex Daily Summary for Saturday, March 17, 2012Popular ReleasesAutoSPEditor: AutoSPEditor Installer: Installs the current version of the AutoSPEditor using ClickOnce. Your application will be updated automatically.Javascript .NET: Javascript .NET v0.6: Upgraded to the latest stable branch of v8 (/tags/3.9.18), and switched to using their scons build system. We no longer include v8 source code as part of this project's source code. Simultaneous multithreaded use of v8 now supported (v8 Isolates), although different contexts may not share objects or call each other. 64-bit .Net 4.0 DLL now included. (Download now includes x86 and x64 for both .Net 3.5 and .Net 4.0.)MyRouter (Virtual WiFi Router): MyRouter 1.0.6: This release should be more stable there were a few bug fixes including the x64 issue as well as an error popping up when MyRouter started this was caused by a NULL valuePulse: Pulse Beta 4: This version is still in development but should include: Logging and error handling have been greatly improved. If you run into an error or Pulse crashes make sure to check the Log folder for a recently modified log file so you can report the details of the issue A bunch of new features for the Wallbase.cc provider. Cleaner separation between inputs, downloading and output. Input and downloading are fairly clean now but outputs are still mixed up in the mix which I'm trying to resolve ...Google Books Downloader for Windows: Google Books Downloader-2.0.0.0.: Google Books DownloaderFinestra Virtual Desktops: 2.5.4501: This is a very minor update release. Please see the information about the 2.5 and 2.5.4500 releases for more information on recent changes. This update did not even have an automatic update triggered for it. Adds error checking and reporting to all threads, not only those with message loopsAcDown????? - Anime&Comic Downloader: AcDown????? v3.9.2: ?? ●AcDown??????????、??、??????,????1M,????,????,?????????????????????????。???????????Acfun、????(Bilibili)、??、??、YouTube、??、???、??????、SF????、????????????。??????AcPlay?????,??????、????????????????。 ● AcDown???????????????????????????,???,???????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ????32??64? Windows XP/Vista/7/8 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86),?????"?????????"??? ??????????????,??????????: ??"AcDo...ArcGIS Editor for OpenStreetMap: ArcGIS Editor for OSM 2.0 Release Candidate: Your feedback is welcome - and this is your last chance to get your fixes in for this version! Includes installer for both Feature Server extension and Desktop extension, enhanced functionality for the Desktop tools, and enhanced built-in Javascript Editor for the Feature Server component. This release candidate includes fixes to beta 4 that accommodate domain users for setting up the Server Component, and fixes for reporting/uploading references tracked in the revision table. See Code In-P...C.B.R. : Comic Book Reader: CBR 0.6: 20 Issue trackers are closed and a lot of bugs too Localize view is now MVVM and delete is working. Added the unused flag (take care that it goes to true only when displaying screen elements) Backstage - new input/output format choice control for the conversion Backstage - Add display, behaviour and register file type options in the extended options dialog Explorer list view has been transformed to a custom control. 
New group header, colunms order and size are saved Single insta...Windows Azure Toolkit for Windows 8: Windows Azure Toolkit for Windows 8 Consumer Prv: Windows Azure Toolkit for Windows 8 Consumer Preview - Preview Release v1.2.1Minor updates to setup experience: Check for WebPI before install Dependency Check updated to support the following VS 11 and VS 2010 SKUs Ultimate, Premium, Professional and Express Certs Windows Azure Toolkit for Windows 8 Consumer Preview - Preview Release v1.2.0 Please download this for Windows Azure Toolkit for Windows 8 functionality on Windows 8 Consumer Preview. The core features of the toolkit include:...Facebook Graph Toolkit: Facebook Graph Toolkit 3.0: ships with JSON Toolkit v3.0, offering parse speed up to 10 times of last version supports Facebook's new auth dialog supports new extend access token endpoint new example Page Tab app filter Graph Api connections using dates fixed bugs in Page Tab appsCODE Framework: 4.0.20312.0: This version includes significant improvements in the WPF system (and the WPF MVVM/MVC system). This includes new styles for Metro controls and layouts. Improved color handling. It also includes an improved theme/style swapping engine down to active (open) views. There also are various other enhancements and small fixes throughout the entire framework.ScintillaNET: ScintillaNET 2.4: 3/12/2012 Jacob Slusser Added support for annotations. Issues Fixed with this Release Issue # Title 25012 25012 25018 25018 25023 25023 25014 25014 Visual Studio ALM Quick Reference Guidance: v3 - For Visual Studio 11: RELEASE README Welcome to the BETA release of the Quick Reference Guide preview As this is a BETA release and the quality bar for the final Release has not been achieved, we value your candid feedback and recommend that you do not use or deploy these BETA artifacts in a production environment. Quality-Bar Details Documentation has been reviewed by Visual Studio ALM Rangers Documentation has not been through an independent technical review Documentation ...AvalonDock: AvalonDock 2.0.0345: Welcome to early alpha release of AvalonDock 2.0 I've completely rewritten AvalonDock in order to take full advantage of the MVVM pattern. New version also boost a lot of new features: 1) Deep separation between model and layout. 2) Full WPF binding support thanks to unified logical tree between main docking manager, auto-hide windows and floating windows. 3) Support for Aero semi-maximized windows feature. 4) Support for multiple panes in the same floating windows. For a short list of new f...Windows Azure PowerShell Cmdlets: Windows Azure PowerShell Cmdlets 2.2.2: Changes Added Start Menu Item for Easy Startup Added Link to Getting Started Document Added Ability to Persist Subscription Data to Disk Fixed Get-Deployment to not throw on empty slot Simplified numerous default values for cmdlets Breaking Changes: -SubscriptionName is now mandatory in Set-Subscription. -DefaultStorageAccountName and -DefaultStorageAccountKey parameters were removed from Set-Subscription. Instead, when adding multiple accounts to a subscription, each one needs to be added ...IronPython: 2.7.2.1: On behalf of the IronPython team, I'm happy to announce the final release IronPython 2.7.2. This release includes everything from IronPython 54498 and 62475 as well. Like all IronPython 2.7-series releases, .NET 4 is required to install it. Installing this release will replace any existing IronPython 2.7-series installation. 
Unlike previous releases, the assemblies for all supported platforms are included in the installer as well as the zip package, in the "Platforms" directory. IronPython 2...Kooboo CMS: Kooboo CMS 3.2.0.0: Breaking changes: When upgrade from previous versions, MUST reset the all the content type templates, otherwise the content manager might get a compile error. New features Integrate with Windows azure. See: http://wiki.kooboo.com/?wiki=Kooboo CMS on Azure Complete solution to deploy on load balance servers. See: http://wiki.kooboo.com/?wiki=Kooboo CMS load balance Update Jquery and Jquery ui to the lastest version(Jquery 1.71, Jquery UI 1.8.16). Tree style text content editing. See:h...Extensions for Reactive Extensions (Rxx): Rxx 1.3: Please read the latest release notes for details about what's new. Related Work Items Content SummaryRxx provides the following features. See the Documentation for details. Many IObservable<T> extension methods and IEnumerable<T> extension methods. Many wrappers that convert asynchronous Framework Class Library APIs into observables. Many useful types such as ListSubject<T>, DictionarySubject<T>, CommandSubject, ViewModel, ObservableDynamicObject, Either<TLeft, TRight>, Maybe<T>, Scala...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.47: Properly output escaped characters in CSS identifiers throw an EOF error when parsing a CSS selector that doesn't end in a declaration block chased down a stack-overflow issue with really large JS sources. Needed to flatten out the AST tree for adjacent expression statements that the application merges into a single expression statement, or that already contain large, comma-separated expressions in the original source. fix issue #17569: tie together the -debug switch with the DEBUG defi...New ProjectsCOBOL Bubble Sort: The famous Bubble Sort algorithm using COBOLCross Site Media Dashboard (webpart) for SharePoint video content: View video content in easy to configure dashboard structure. Supports a vast range of video formatsDataObjects.Net Samples: DataObjects.Net SamplesDev Tracking Tool: Dev Tracking Tool is thought to help software developers keeping track of their activities in a fast way, and use these informations for later update of tasks in other informatic systems (i.e.: tracking tasks in TFS and/or other tracking tools).Governance Checklist - Create/manage your Governance Checklist: Governance Checklist - Create/manage your Governance Checklist. SharePoint Governance Checklist generated as a starting point however you can update to your own criteria. Visual Guages to show the status of your governance progresslibspotify.NET - a managed interop library for libspotify: libspotify.NET is a simple interop wrapper library for libspotify written in C#. It enables .NET developers to write applications that can browse, search, and stream digital music from the Spotify platform. This project is compatible with libspotify API version 10.1.16. Requires libspotify.dll, a Spotify premium account, and an application API key. More info on the libspotify API at http://developer.spotify.com/en/libspotify/overview/ Media.Net: All-In-One Media Player for Windows XP, Vista, Windows 7 and Windows 8.MisTutoriales: MisTutoriales Ejemplos de programación.mobile video: mobile videoMugen MVVM Toolkit: Mugen MVVM Toolkit makes it easier to develop Silverlight, WPF and WP7 applications using the Model-View-ViewModel design pattern. It is very easy to use. 
It combines the pattern of DI (dependency injection) and MVVM (Model-View-ViewModel).Project SIMPLE Orca: SIMPLE makes it easier for non-programmers to create simple applications. You'll no longer have to learn an entire language in order to create that simple application you've been stressing over. S imple And I deal M acroed P rogramming L anguage For E aseRelational data Transfer Application: Relational data Transfer Application helps to move data scenarios from One relational database to other Relational database without moving entire tables. Basically this applciation moves a given data row along with its relational data to destination database. rGUI: rGUI is an open source .NET front-end for Robocopy written in C#. It is designed for users already familiar with the Robocopy command-line switches and what they do, but who want the command parameters presented in a visual way.S#: Spontaneous Sharp: A mix of very easy to use c# libraries, born out of spontaneous needs to simplify a large number of programming tasks in .NET C#.SDF - Smart Document Framework: Smart Document Framework offers an alternative to the FlowDocument which provides a simpler rendering engine and allows for better pagination.Shatong: Shatong SystemSPD 2007 Custom Activity Pause Until CheckOut = None: This activity can be added to the beginning of a SharePoint Designer 2007 workflow to pause the workflow until the item is no longer checked out (soft or hard checkout)Spontaneous XNA Library (SXL): Yet another XNA Library that has arisen from various spontaneous needs. Hopefully it can be as helpful to others as it has been to me.TeaFiles.Net - time series storage: TeaFiles.Net is an API for TeaFiles that makes it easy to store and exchange time series data between .Net programs and C++, Python, R and other applications. Written in C#. Find more details about the file specification, other APIs and tools at discretelogics.comTypePipe: TypePipe allows you to modify existing CLR types using a simple, expression-based API. Modifications from several tools and libraries (AOP, IoC etc.) can be combined. Types are generated via Reflection.Emit or user-defined back-ends.UCB_12_1: This is a research projectUCB_12_2: UCB_12_2UCB_12_3: UCB_12_3UCB_12_4: UCB_12_4UCB_12_Common: UCB_12_CommonVG Task Manager: task manager demo project created by Vivek Gupta.WP7 Map Control: Lightweight universal map control for Windows Phone 7xRM#: xRM# is focused on boosting productivity while developing JavaScript customizations for Microsoft Dynamics CRM 2011. xRM# provided a platform to scribble code in C# and translating them to JavaScript snippet for xRM customizations. Developers can benefit from structural and coherent flow of using existing C# coding guidelines. Pseudo data types, IntelliSense and Code Documentation reduce code look-ups and increase productivity. xRM# also supports deploying generated scripts as minified ver...

    Read the article

  • CodePlex Daily Summary for Saturday, October 22, 2011

    CodePlex Daily Summary for Saturday, October 22, 2011Popular ReleasesWatchersNET CKEditor™ Provider for DotNetNuke®: CKEditor Provider 1.12.17: Changes Added FilePath Length Check when Uploading Files. Fixed Issue #6550 Fixed Issue #6536 Fixed Issue #6525 Fixed Issue #6500 Fixed Issue #6401 Fixed Issue #6490DotNet.Framework.Common: DotNet.Framework.Common 4.0: ??????????,????????????XML Explorer: XML Explorer 4.0.5: Changes in 4.0.5: Added 'Copy Attribute XPath to Address Bar' feature. Added methods for decoding node text and value from Base64 encoded strings, and copying them to the clipboard. Added 'ChildNodeDefinitions' to the options, which allows for easier navigation of parent-child and ID-IDREF relationships. Discovery happens on-demand, as nodes are expanded and child nodes are added. Nodes can now have 'virtual' child nodes, defined by an xpath to select an identifier (usually relative to ...Media Companion: MC 3.419b Weekly: A couple of minor bug fixes, but the important fix in this release is to tackle the extremely long load times for users with large TV collections (issue #130). A note has been provided by developer Playos: "One final note, you will have to suffer one final long load and then it should be fixed... alternatively you can delete the TvCache.xml and rebuild your library... The fix was to include the file extension so it doesn't have to look for the video file (checking to see if a file exists is a...CODE Framework: 4.0.11021.0: This build adds a lot of our WPF components, including our MVVC and MVC components as well as a "Metro" and "Battleship" style.Manejo de tags - PHP sobre apache: tagqrphp: Primera version: Para que funcione el programa se debe primero obtener un id para desarrollo del tag eso lo entrega Microsoft registrandose en el sitio. http://tag.microsoft.com - En tagm.php que llama a la libreria Microsoft Tag PHP Library (Codigo que sirve para trabajar con PHP y Tag) - Llenamos los datos del formulario y ejecutamos para obtener el codigo tag de microsoft el cual apunte a la url que le indicamos en el formulario - Libreria MStag.php (tiene mucha explicación del funciona...GridLibre para Visual FoxPro: GridLibre para Visual FoxPro v3.5: GridLibre Para Visual FoxPro: esta herramienta ayudara a los usuarios y programadores en los manejos de los datos, como Filtrar, multiseleccion y el autoformato a las columnas como la asignacion del controlsource.Self-Tracking Entity Generator for WPF and Silverlight: Self-Tracking Entity Generator v 0.9.9: Self-Tracking Entity Generator v 0.9.9 for Entity Framework 4.0Umbraco CMS: Umbraco 5.0 CMS Alpha 3: Umbraco 5 Alpha 3Umbraco 5 (aka Jupiter) will be the next version of everyone's favourite, friendly ASP.NET CMS that already powers over 100,000 websites worldwide. Try out the Alpha of v5 today! If you're new to Umbraco and would like to get a low-down on our popular and easy-to-learn approach to content management, check out our intro video. What's Alpha 3?This is our third Alpha release. It's intended for developers looking to become familiar with the codebase & architecture, or for thos...Vkontakte WP: Vkontakte: source codeWay2Sms Applications for Android, Desktop/Laptop & Java enabled phones: Way2SMS Desktop App v2.0: 1. Fixed issue with sending messages due to changes to Way2Sms site 2. Updated the character limit to 160 from 140GART - Geo Augmented Reality Toolkit: 1.0.1: About Release 1.0.1 Release 1.0.1 is a service release that addresses several issues and improves performance. 
As always, check the Documentation tab for instructions on how to get started. If you don't have the Windows Phone SDK yet, grab it here. Breaking Change Please note: There is a breaking change in this release. As noted below, the WorldCalculationMode property of ARItem has been replaced by a user-definable function. ARItem is now automatically wired up with a function that perform...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.32: Fix for issue #16710 - string literals in "constant literal operations" which contain ASP.NET substitutions should not be considered "constant." Move the JS1284 error (Misplaced Function Declaration) so it only fires when in strict mode. I got a couple complaints that people didn't like that error popping up in their existing code when they could verify that the location of that function, although not strict JS, still functions as expected cross-browser.Naked Objects: Naked Objects Release 4.0.110.0: Corresponds to the packaged version 4.0.110.0 available via NuGet. Please note that the easiest way to install and run the Naked Objects Framework is via the NuGet package manager: just search the Official NuGet Package Source for 'nakedobjects'. It is only necessary to download the source code (from here) if you wish to modify or re-build the framework yourself. If you do wish to re-build the framework, consul the file HowToBuild.txt in the release. Documentation Please note that after ...myCollections: Version 1.5: New in this version : Added edit type for selected elements Added clean for selected elements Added Amazon Italia Added Amazon China Added TVDB Italia Added TVDB China Added Turkish language You can now manually add artist Added Order by Rating Improved Add by Media Improved Artist Detail Upgrade Sqlite engine View, Zoom, Grouping, Filter are now saved by category Added group by Artist Added CubeCover View BugFixingFacebook C# SDK: 5.3: This is a BETA release which adds new features and bug fixes to v5.2.1. removed dependency from Code Contracts enabled Task Parallel Support in .NET 4.0+ added support for early preview for .NET 4.5 added additional method overloads for .NET 4.5 to support IProgress<T> for upload progress added new CS-WinForms-AsyncAwait.sln sample demonstrating the use of async/await, upload progress report using IProgress<T> and cancellation support Query/QueryAsync methods uses graph api instead...IronPython: 2.7.1 RC: This is the first release candidate of IronPython 2.7.1. Like IronPython 54498, this release requires .NET 4 or Silverlight 4. This release will replace any existing IronPython installation. If there are no showstopping issues, this will be the only release candidate for 2.7.1, so please speak up if you run into any roadblocks. The highlights of 2.7.1 are: Updated the standard library to match CPython 2.7.2. Add the ast, csv, and unicodedata modules. Fixed several bugs. IronPython To...Rawr: Rawr 4.2.6: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr AddonWe now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. 
The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including bag and bank items) like Char...Home Access Plus+: v7.5: Change Log: New Booking System (out of Beta) New Help Desk (out of Beta) New My Files (Developer Preview) Token now saved into Cookie so the system doesn't TIMEOUT as much File Changes: ~/bin/hap.ad.dll ~/bin/hap.web.dll ~/bin/hap.data.dll ~/bin/hap.web.configuration.dll ~/bookingsystem/admin/default.aspx ~/bookingsystem/default.aspx REMOVED ~/bookingsystem/bookingpopup.ascx REMOVED ~/bookingsystem/daylist.ascx REMOVED ~/bookingsystem/new.aspx ~/helpdesk/default.aspx ...Visual Micro - Arduino for Visual Studio: Arduino for Visual Studio 2008 and 2010: Arduino for Visual Studio 2010 has been extended to support Visual Studio 2008. The same functionality and configuration exists between the two versions. The 2010 addin runs .NET4 and the 2008 addin runs .NET3.5, both are installed using a single msi and both share the same configuration settings. The only known issue in 2008 is that the button and menu icons are missing. Please logon to the visual micro forum and let us know if things are working or not. Read more about this Visual Studio ...New Projects#foo Core: Core functionality extensions of .NET used by all HashFoo projects.#foo Nhib: #foo NHibernate extensions.Aagust G: Hello all ! Its a free JQuery Image Slider....ACP Log Analyzer: ACP Log Analyzer provides a quick and easy mechanism for generating a report on your ACP-based astronomical observing activities. Developed in Microsoft Visual Studio 2010 using C#, the .NET Framework version 4 and WPF.BlobShare Sample: TBDCompletedWorkflowCleanUp: This tool once executed on a list delete all completed workflow instancesCRM 2011 Visual Ribbon Editor: Visual Ribbon Editor is a tool for Microsoft Dynamics CRM 2011 that lets you edit CRM ribbons. This ribbon editor shows a preview of the CRM ribbon as you are editing it and allows you to add ribbon buttons and groups without needing to fully understand the ribbon XML schema.GearMerge: Organizes Movies and TV Series files from one Hard Drive to another. I created it for myself to update external drives with movies and TV shows from my collection.Generic Object Storage Helper for WinRT: ObjectStorageHelper<T> is a Generic class that simplifies storage of data in WinRT applications.Government Sanctioned Espionage RPG: Government Sanctioned is a modern SRD-like espionage game server. Visit http://wiki.government-sanctioned.us:8040 for game design and play information or homepage http://www.government-sanctioned.us Government Sanctioned is an online, text-based espionage RPG (similar to a MUD/MOO) that takes place against the backdrop of a highly-secretive U.S. Government agency whose stated goals don't always match the dirty work the agents tend to find themselves in. - over 15 starting profession...GridLibre para Visual FoxPro: GridLibre Para Visual FoxPro: esta herramienta ayudara a los usuarios y programadores en los manejos de los datos, como Filtrar, multiseleccion y el autoformato a las columnas como la asignacion del controlsource.HTML5 Video Web Part for SharePoint 2010: A web part for SharePoint 2010 that enable users playing video into the page using the Ribbon bar.Jogo do Galo: JOGO DO GALO REGRAS •O tabuleiro é a matriz de três linhas em três colunas. •Dois jogadores escolhem três peças cada um. 
•Os jogadores jogam alternadamente, uma peça de cada vez, num espaço que esteja vazio. •O objectivo é conseguir três peças iguais em linha, quer horizontal, vKarol sie uczy silverlajta: on sie naprawde tego uczy!Manejo de tags - PHP sobre apache: Hago uso de la libreria Microsoft Tag PHP Library para que pueda funcionar la aplicación sobre Apache finalmente puede crear tag de micrsosoft desde el formulario creado. Modem based SMS Gateway: It is an easy to use modem based SMS server that provide easier solutions for SMS marketing or SMS based services. It is highly programmable and the easy to use API interface makes SMS integration very easy. Embedded SMS processor provides customized solution to many of your needs even without building any custom software.Mund Publishing Feture: Mund Publishing FeatureMyTFSTest: TestNHS HL7 CDA Document Developer: A project to demonstrate how templated HL7 CDA documents can be created by using a common API. The API is designed to be used in .NET applications. A number of examples are provided using C#OpenShell: OpenShell is an open source implementation of the Windows PowerShell engine. It is to make integrating PowerShell into standalone console programs simple.Powershell Script to Copy Lists in a Site Collection in MOSS 2007 and SPS 2010: Hi, This is a powershell script file that copies a list within the same site collection. This works in Sharepoint 2007 and Sharepoint 2010 as well. THis will flash the messages before taking the input values. This will in this way provide the clear ideas about the values to beSharePoint Desktop: SharePoint Desktop is a explorer-like webpart that makes it possible to drag and drop items (documents and folders), copy and paste items and explore all SharePoint document libraries in 1 place.SQL floating point compare function: Comparison of floating point values in SQL Server not always gives the expected result. With this function, comparison is only done on the first 15 significant digits. Since SQL Server only garantees a precision of 15 digits for float datatypes, this is expected to be secure.Stock Analyzer: It is a stock management software. It's main job is to store market realtime data on a database to be able to analyse it latter and create automatic systems that operate directly on the stock exchange market. It will have different modules to do this task: - Realtime data capture. - Realtime data analysis - Historic analysis. - Order execution. - Strategy test. - Strategy execution. It's developed in C# and with WPF.VB_Skype: VB_Skype utilizza la libreria Skype4COM per integrare i servizi Skype con un'applicazione Visual Basic (Windows Forms). L'applicazione comprende un progetto di controllo personalizzato che costituisce il "wrapper" verso la libreria Skype4COM e un progetto con una demo di utilizzo. Progetto che dovrebbe essere utilizzato nella mia sessione, come uno degli speaker della conferenza "WPC 2011" che si terrà ad Assago (MI) nei giorni 22-23-24 Novembre 2011. La mia sessione è in agenda per il 24...Word Template Generator: Custom Template Generator for Microsoft Word, developed in C#?????OA??: ?????OA??

    Read the article

  • How to create a new WCF/MVC/jQuery application from scratch

    - by pjohnson
    As a corporate developer by trade, I don't get much opportunity to create from-the-ground-up web sites; usually it's tweaks, fixes, and new functionality to existing sites. And with hobby sites, I often don't find the challenges I run into with enterprise systems; usually it's starting from Visual Studio's boilerplate project and adding whatever functionality I want to play around with, rarely deploying outside my own machine. So my experience creating a new enterprise-level site was a bit dated, and the technologies to do so have come a long way, and are much more ready to go out of the box. My intention with this post isn't so much to provide any groundbreaking insights, but to just tie together a lot of information in one place to make it easy to create a new site from scratch. Architecture One site I created earlier this year had an MVC 3 front end and a WCF 4-driven service layer. Using Visual Studio 2010, these project types are easy enough to add to a new solution. I created a third Class Library project to store common functionality the front end and services layers both needed to access, for example, the DataContract classes that the front end uses to call services in the service layer. By keeping DataContract classes in a separate project, I avoided the need for the front end to have an assembly/project reference directly to the services code, a bit cleaner and more flexible of an SOA implementation. Consuming the service Even by this point, VS has given you a lot. You have a working web site and a working service, neither of which do much but are great starting points. To wire up the front end and the services, I needed to create proxy classes and WCF client configuration information. I decided to use the SvcUtil.exe utility provided as part of the Windows SDK, which you should have installed if you installed VS. VS also provides an Add Service Reference command since the .NET 1.x ASMX days, which I've never really liked; it creates several .cs/.disco/etc. files, some of which contained hardcoded URL's, adding duplicate files (*1.cs, *2.cs, etc.) without doing a good job of cleaning up after itself. I've found SvcUtil much cleaner, as it outputs one C# file (containing several proxy classes) and a config file with settings, and it's easier to use to regenerate the proxy classes when the service changes, and to then maintain all your configuration in one place (your Web.config, instead of the Service Reference files). I provided it a reference to a copy of my common assembly so it doesn't try to recreate the data contract classes, had it use the type List<T> for collections, and modified the output files' names and .NET namespace, ending up with a command like: svcutil.exe /l:cs /o:MyService.cs /config:MyService.config /r:MySite.Common.dll /ct:System.Collections.Generic.List`1 /n:*,MySite.Web.ServiceProxies http://localhost:59999/MyService.svc I took the generated MyService.cs file and drop it in the web project, under a ServiceProxies folder, matching the namespace and keeping it separate from classes I coded manually. Integrating the config file took a little more work, but only needed to be done once as these settings didn't often change. A great thing Microsoft improved with WCF 4 is configuration; namely, you can use all the default settings and not have to specify them explicitly in your config file. Unfortunately, SvcUtil doesn't generate its config file this way. 
If you just copy & paste MyService.config's contents into your front end's Web.config, you'll copy a lot of settings you don't need, plus this will get unwieldy if you add more services in the future, each with its own custom binding. Really, as the only mandatory settings are the endpoint's ABC's (address, binding, and contract) you can get away with just this: <system.serviceModel>  <client>    <endpoint address="http://localhost:59999/MyService.svc" binding="wsHttpBinding" contract="MySite.Web.ServiceProxies.IMyService" />  </client></system.serviceModel> By default, the services project uses basicHttpBinding. As you can see, I switched it to wsHttpBinding, a more modern standard. Using something like netTcpBinding would probably be faster and more efficient since the client & service are both written in .NET, but it requires additional server setup and open ports, whereas switching to wsHttpBinding is much simpler. From an MVC controller action method, I instantiated the client, and invoked the method for my operation. As with any object that implements IDisposable, I wrapped it in C#'s using() statement, a tidy construct that ensures Dispose gets called no matter what, even if an exception occurs. Unfortunately there are problems with that, as WCF's ClientBase<TChannel> class doesn't implement Dispose according to Microsoft's own usage guidelines. I took an approach similar to Technology Toolbox's fix, except using partial classes instead of a wrapper class to extend the SvcUtil-generated proxy, making the fix more seamless from the controller's perspective, and theoretically, less code I have to change if and when Microsoft fixes this behavior. User interface The MVC 3 project template includes jQuery and some other common JavaScript libraries by default. I updated the ones I used to the latest versions using NuGet, available in VS via the Tools > Library Package Manager > Manage NuGet Packages for Solution... > Updates. I also used this dialog to remove packages I wasn't using. Given that it's smart enough to know the difference between the .js and .min.js files, I was hoping it would be smart enough to know which to include during build and publish operations, but this doesn't seem to be the case. I ended up using Cassette to perform the minification and bundling of my JavaScript and CSS files; ASP.NET 4.5 includes this functionality out of the box. The web client to web server link via jQuery was easy enough. In my JavaScript function, unobtrusively wired up to a button's click event, I called $.ajax, corresponding to an action method that returns a JsonResult, accomplished by passing my model class to the Controller.Json() method, which jQuery helpfully translates from JSON to a JavaScript object.$.ajax calls weren't perfectly straightforward. I tried using the simpler $.post method instead, but ran into trouble without specifying the contentType parameter, which $.post doesn't have. The url parameter is simple enough, though for flexibility in how the site is deployed, I used MVC's Url.Action method to get the URL, then sent this to JavaScript in a JavaScript string variable. If the request needed input data, I used the JSON.stringify function to convert a JavaScript object with the parameters into a JSON string, which MVC then parses into strongly-typed C# parameters. I also specified "json" for dataType, and "application/json; charset=utf-8" for contentType. For success and error, I provided my success and error handling functions, though success is a bit hairier. 
"Success" in this context indicates whether the HTTP request succeeds, not whether what you wanted the AJAX call to do on the web server was successful. For example, if you make an AJAX call to retrieve a piece of data, the success handler will be invoked for any 200 OK response, and the error handler will be invoked for failed requests, e.g. a 404 Not Found (if the server rejected the URL you provided in the url parameter) or 500 Internal Server Error (e.g. if your C# code threw an exception that wasn't caught). If an exception was caught and handled, or if the data requested wasn't found, this would likely go through the success handler, which would need to do further examination to verify it did in fact get back the data for which it asked. I discuss this more in the next section. Logging and exception handling At this point, I had a working application. If I ran into any errors or unexpected behavior, debugging was easy enough, but of course that's not an option on public web servers. Microsoft Enterprise Library 5.0 filled this gap nicely, with its Logging and Exception Handling functionality. First I installed Enterprise Library; NuGet as outlined above is probably the best way to do so. I needed a total of three assembly references--Microsoft.Practices.EnterpriseLibrary.ExceptionHandling, Microsoft.Practices.EnterpriseLibrary.ExceptionHandling.Logging, and Microsoft.Practices.EnterpriseLibrary.Logging. VS links with the handy Enterprise Library 5.0 Configuration Console, accessible by right-clicking your Web.config and choosing Edit Enterprise Library V5 Configuration. In this console, under Logging Settings, I set up a Rolling Flat File Trace Listener to write to log files but not let them get too large, using a Text Formatter with a simpler template than that provided by default. Logging to a different (or additional) destination is easy enough, but a flat file suited my needs. At this point, I verified it wrote as expected by calling the Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write method from my C# code. With those settings verified, I went on to wire up Exception Handling with Logging. Back in the EntLib Configuration Console, under Exception Handling, I used a LoggingExceptionHandler, setting its Logging Category to the category I already had configured in the Logging Settings. Then, from code (e.g. a controller's OnException method, or any action method's catch block), I called the Microsoft.Practices.EnterpriseLibrary.ExceptionHandling.ExceptionPolicy.HandleException method, providing the exception and the exception policy name I had configured in the Exception Handling Settings. Before I got this configured correctly, when I tried it out, nothing was logged. In working with .NET, I'm used to seeing an exception if something doesn't work or isn't set up correctly, but instead working with these EntLib modules reminds me more of JavaScript (before the "use strict" v5 days)--it just does nothing and leaves you to figure out why, I presume due in part to the listener pattern Microsoft followed with the Enterprise Library. First, I verified logging worked on its own. Then, verifying/correcting where each piece wires up to the next resolved my problem. 
Your C# code calls into the Exception Handling module, referencing the policy you pass the HandleException method; that policy's configuration contains a LoggingExceptionHandler that references a logCategory; that logCategory should be added in the loggingConfiguration's categorySources section; that category references a listener; that listener should be added in the loggingConfiguration's listeners section, which specifies the name of the log file. One final note on error handling, as the proper way to handle WCF and MVC errors is a whole other very lengthy discussion. For AJAX calls to MVC action methods, depending on your configuration, an exception thrown here will result in ASP.NET'S Yellow Screen Of Death being sent back as a response, which is at best unnecessarily and uselessly verbose, and at worst a security risk as the internals of your application are exposed to potential hackers. I mitigated this by overriding my controller's OnException method, passing the exception off to the Exception Handling module as above. I created an ErrorModel class with as few properties as possible (e.g. an Error string), sending as little information to the client as possible, to both maximize bandwidth and mitigate risk. I then return an ErrorModel in JSON format for AJAX requests: if (filterContext.HttpContext.Request.IsAjaxRequest()){    filterContext.Result = Json(new ErrorModel(...));    filterContext.ExceptionHandled = true;} My $.ajax calls from the browser get a valid 200 OK response and go into the success handler. Before assuming everything is OK, I check if it's an ErrorModel or a model containing what I requested. If it's an ErrorModel, or null, I pass it to my error handler. If the client needs to handle different errors differently, ErrorModel can contain a flag, error code, string, etc. to differentiate, but again, sending as little information back as possible is ideal. Summary As any experienced ASP.NET developer knows, this is a far cry from where ASP.NET started when I began working with it 11 years ago. WCF services are far more powerful than ASMX ones, MVC is in many ways cleaner and certainly more unit test-friendly than Web Forms (if you don't consider the code/markup commingling you're doing again), the Enterprise Library makes error handling and logging almost entirely configuration-driven, AJAX makes a responsive UI more feasible, and jQuery makes JavaScript coding much less painful. It doesn't take much work to get a functional, maintainable, flexible application, though having it actually do something useful is a whole other matter.
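    For reference, the ClientBase<TChannel> cleanup problem described above is commonly worked around by closing or aborting the channel explicitly rather than relying on using()/Dispose. The sketch below is an approximation of that idea as a partial class extending a SvcUtil-generated proxy; the MyServiceClient name and namespace are illustrative, and this is not necessarily the exact fix the author applied.

        using System;
        using System.ServiceModel;

        namespace MySite.Web.ServiceProxies
        {
            // Partial class extending the SvcUtil-generated proxy (assumed to be
            // named MyServiceClient and derived from ClientBase<IMyService>).
            public partial class MyServiceClient
            {
                // Close the channel cleanly when possible; Abort when it has faulted
                // or the close itself fails, so cleanup never throws.
                public void CloseOrAbort()
                {
                    if (State == CommunicationState.Faulted)
                    {
                        Abort();
                        return;
                    }
                    try
                    {
                        Close();
                    }
                    catch (CommunicationException)
                    {
                        Abort();
                    }
                    catch (TimeoutException)
                    {
                        Abort();
                    }
                }
            }
        }

    With a helper like this, the controller action calls the proxy inside try/finally and invokes client.CloseOrAbort() in the finally block instead of wrapping the proxy in a using() statement.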

    Read the article

  • Wireless card power management

    - by penner
    I have noticed that when my computer is plugged in, the wireless strength increases. I'm assuming this is to do with power management. Is there a way to disable Wireless Power Management? I have found a few blog posts that show hacks to disable this but what is best practice here? Should there not be an option via the power menu that lets you toggle this? EDIT -- FILES AND LOGS AS REQUESTED /var/log/kern.log Jul 11 11:45:27 CoolBreeze kernel: [ 6.528052] postgres (1308): /proc/1308/oom_adj is deprecated, please use /proc/1308/oom_score_adj instead. Jul 11 11:45:27 CoolBreeze kernel: [ 6.532080] [fglrx] Gart USWC size:1280 M. Jul 11 11:45:27 CoolBreeze kernel: [ 6.532084] [fglrx] Gart cacheable size:508 M. Jul 11 11:45:27 CoolBreeze kernel: [ 6.532091] [fglrx] Reserved FB block: Shared offset:0, size:1000000 Jul 11 11:45:27 CoolBreeze kernel: [ 6.532094] [fglrx] Reserved FB block: Unshared offset:f8fd000, size:403000 Jul 11 11:45:27 CoolBreeze kernel: [ 6.532098] [fglrx] Reserved FB block: Unshared offset:3fff4000, size:c000 Jul 11 11:45:38 CoolBreeze kernel: [ 17.423743] eth1: no IPv6 routers present Jul 11 11:46:37 CoolBreeze kernel: [ 75.836426] warning: `proftpd' uses 32-bit capabilities (legacy support in use) Jul 11 11:46:37 CoolBreeze kernel: [ 75.884215] init: plymouth-stop pre-start process (2922) terminated with status 1 Jul 11 11:54:25 CoolBreeze kernel: [ 543.679614] eth1: no IPv6 routers present dmesg [ 1.411959] ACPI: Power Button [PWRB] [ 1.412046] input: Sleep Button as /devices/LNXSYSTM:00/device:00/PNP0C0E:00/input/input1 [ 1.412054] ACPI: Sleep Button [SLPB] [ 1.412150] input: Lid Switch as /devices/LNXSYSTM:00/device:00/PNP0C0D:00/input/input2 [ 1.412765] ACPI: Lid Switch [LID0] [ 1.412866] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 [ 1.412874] ACPI: Power Button [PWRF] [ 1.412996] ACPI: Fan [FAN0] (off) [ 1.413068] ACPI: Fan [FAN1] (off) [ 1.419493] thermal LNXTHERM:00: registered as thermal_zone0 [ 1.419498] ACPI: Thermal Zone [TZ00] (27 C) [ 1.421913] thermal LNXTHERM:01: registered as thermal_zone1 [ 1.421918] ACPI: Thermal Zone [TZ01] (61 C) [ 1.421971] ACPI: Deprecated procfs I/F for battery is loaded, please retry with CONFIG_ACPI_PROCFS_POWER cleared [ 1.421986] ACPI: Battery Slot [BAT0] (battery present) [ 1.422062] ERST: Table is not found! [ 1.422067] GHES: HEST is not enabled! [ 1.422158] isapnp: Scanning for PnP cards... 
[ 1.422242] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 1.434620] ACPI: Battery Slot [BAT0] (battery present) [ 1.736355] Freeing initrd memory: 14352k freed [ 1.777846] isapnp: No Plug & Play device found [ 1.963650] Linux agpgart interface v0.103 [ 1.967148] brd: module loaded [ 1.968866] loop: module loaded [ 1.969134] ahci 0000:00:1f.2: version 3.0 [ 1.969154] ahci 0000:00:1f.2: PCI INT B -> GSI 19 (level, low) -> IRQ 19 [ 1.969226] ahci 0000:00:1f.2: irq 45 for MSI/MSI-X [ 1.969277] ahci: SSS flag set, parallel bus scan disabled [ 1.969320] ahci 0000:00:1f.2: AHCI 0001.0300 32 slots 6 ports 3 Gbps 0x23 impl SATA mode [ 1.969329] ahci 0000:00:1f.2: flags: 64bit ncq sntf stag pm led clo pio slum part ems sxs apst [ 1.969338] ahci 0000:00:1f.2: setting latency timer to 64 [ 1.983340] scsi0 : ahci [ 1.983515] scsi1 : ahci [ 1.983670] scsi2 : ahci [ 1.983829] scsi3 : ahci [ 1.983985] scsi4 : ahci [ 1.984145] scsi5 : ahci [ 1.984270] ata1: SATA max UDMA/133 abar m2048@0xf1005000 port 0xf1005100 irq 45 [ 1.984277] ata2: SATA max UDMA/133 abar m2048@0xf1005000 port 0xf1005180 irq 45 [ 1.984282] ata3: DUMMY [ 1.984285] ata4: DUMMY [ 1.984288] ata5: DUMMY [ 1.984292] ata6: SATA max UDMA/133 abar m2048@0xf1005000 port 0xf1005380 irq 45 [ 1.985150] Fixed MDIO Bus: probed [ 1.985192] tun: Universal TUN/TAP device driver, 1.6 [ 1.985196] tun: (C) 1999-2004 Max Krasnyansky <[email protected]> [ 1.985285] PPP generic driver version 2.4.2 [ 1.985472] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 1.985507] ehci_hcd 0000:00:1a.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16 [ 1.985534] ehci_hcd 0000:00:1a.0: setting latency timer to 64 [ 1.985541] ehci_hcd 0000:00:1a.0: EHCI Host Controller [ 1.985626] ehci_hcd 0000:00:1a.0: new USB bus registered, assigned bus number 1 [ 1.985666] ehci_hcd 0000:00:1a.0: debug port 2 [ 1.989663] ehci_hcd 0000:00:1a.0: cache line size of 64 is not supported [ 1.989690] ehci_hcd 0000:00:1a.0: irq 16, io mem 0xf1005800 [ 2.002183] ehci_hcd 0000:00:1a.0: USB 2.0 started, EHCI 1.00 [ 2.002447] hub 1-0:1.0: USB hub found [ 2.002455] hub 1-0:1.0: 3 ports detected [ 2.002607] ehci_hcd 0000:00:1d.0: PCI INT A -> GSI 23 (level, low) -> IRQ 23 [ 2.002633] ehci_hcd 0000:00:1d.0: setting latency timer to 64 [ 2.002639] ehci_hcd 0000:00:1d.0: EHCI Host Controller [ 2.002737] ehci_hcd 0000:00:1d.0: new USB bus registered, assigned bus number 2 [ 2.002775] ehci_hcd 0000:00:1d.0: debug port 2 [ 2.006780] ehci_hcd 0000:00:1d.0: cache line size of 64 is not supported [ 2.006806] ehci_hcd 0000:00:1d.0: irq 23, io mem 0xf1005c00 [ 2.022161] ehci_hcd 0000:00:1d.0: USB 2.0 started, EHCI 1.00 [ 2.022401] hub 2-0:1.0: USB hub found [ 2.022409] hub 2-0:1.0: 3 ports detected [ 2.022567] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 2.022599] uhci_hcd: USB Universal Host Controller Interface driver [ 2.022720] usbcore: registered new interface driver libusual [ 2.022813] i8042: PNP: PS/2 Controller [PNP0303:PS2K,PNP0f13:PS2M] at 0x60,0x64 irq 1,12 [ 2.035831] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 2.035844] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 2.036096] mousedev: PS/2 mouse device common for all mice [ 2.036710] rtc_cmos 00:07: RTC can wake from S4 [ 2.036881] rtc_cmos 00:07: rtc core: registered rtc_cmos as rtc0 [ 2.037143] rtc0: alarms up to one month, y3k, 242 bytes nvram, hpet irqs [ 2.037503] device-mapper: uevent: version 1.0.3 [ 2.037656] device-mapper: ioctl: 4.22.0-ioctl (2011-10-19) initialised: [email protected] [ 2.037725] EISA: Probing 
bus 0 at eisa.0 [ 2.037729] EISA: Cannot allocate resource for mainboard [ 2.037734] Cannot allocate resource for EISA slot 1 [ 2.037738] Cannot allocate resource for EISA slot 2 [ 2.037741] Cannot allocate resource for EISA slot 3 [ 2.037745] Cannot allocate resource for EISA slot 4 [ 2.037749] Cannot allocate resource for EISA slot 5 [ 2.037753] Cannot allocate resource for EISA slot 6 [ 2.037756] Cannot allocate resource for EISA slot 7 [ 2.037760] Cannot allocate resource for EISA slot 8 [ 2.037764] EISA: Detected 0 cards. [ 2.037782] cpufreq-nforce2: No nForce2 chipset. [ 2.038264] cpuidle: using governor ladder [ 2.039015] cpuidle: using governor menu [ 2.039019] EFI Variables Facility v0.08 2004-May-17 [ 2.040061] TCP cubic registered [ 2.041438] NET: Registered protocol family 10 [ 2.043814] NET: Registered protocol family 17 [ 2.043823] Registering the dns_resolver key type [ 2.044290] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input4 [ 2.044336] Using IPI No-Shortcut mode [ 2.045620] PM: Hibernation image not present or could not be loaded. [ 2.045644] registered taskstats version 1 [ 2.073070] Magic number: 4:976:796 [ 2.073415] rtc_cmos 00:07: setting system clock to 2012-07-11 18:45:23 UTC (1342032323) [ 2.076654] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 2.076658] EDD information not available. [ 2.302111] ata1: SATA link up 3.0 Gbps (SStatus 123 SControl 300) [ 2.302587] ata1.00: ATA-9: M4-CT128M4SSD2, 000F, max UDMA/100 [ 2.302595] ata1.00: 250069680 sectors, multi 16: LBA48 NCQ (depth 31/32), AA [ 2.303143] ata1.00: configured for UDMA/100 [ 2.303453] scsi 0:0:0:0: Direct-Access ATA M4-CT128M4SSD2 000F PQ: 0 ANSI: 5 [ 2.303746] sd 0:0:0:0: Attached scsi generic sg0 type 0 [ 2.303920] sd 0:0:0:0: [sda] 250069680 512-byte logical blocks: (128 GB/119 GiB) [ 2.304213] sd 0:0:0:0: [sda] Write Protect is off [ 2.304225] sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 [ 2.304471] sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA [ 2.306818] sda: sda1 sda2 < sda5 > [ 2.308780] sd 0:0:0:0: [sda] Attached SCSI disk [ 2.318162] Refined TSC clocksource calibration: 1595.999 MHz. 
[ 2.318169] usb 1-1: new high-speed USB device number 2 using ehci_hcd [ 2.318178] Switching to clocksource tsc [ 2.450939] hub 1-1:1.0: USB hub found [ 2.451121] hub 1-1:1.0: 6 ports detected [ 2.561786] usb 2-1: new high-speed USB device number 2 using ehci_hcd [ 2.621757] ata2: SATA link up 1.5 Gbps (SStatus 113 SControl 300) [ 2.636143] ata2.00: ATAPI: TSSTcorp DVD+/-RW TS-T633C, D800, max UDMA/100 [ 2.636152] ata2.00: applying bridge limits [ 2.649711] ata2.00: configured for UDMA/100 [ 2.653762] scsi 1:0:0:0: CD-ROM TSSTcorp DVD+-RW TS-T633C D800 PQ: 0 ANSI: 5 [ 2.661486] sr0: scsi3-mmc drive: 24x/24x writer dvd-ram cd/rw xa/form2 cdda tray [ 2.661494] cdrom: Uniform CD-ROM driver Revision: 3.20 [ 2.661890] sr 1:0:0:0: Attached scsi CD-ROM sr0 [ 2.662156] sr 1:0:0:0: Attached scsi generic sg1 type 5 [ 2.694649] hub 2-1:1.0: USB hub found [ 2.694840] hub 2-1:1.0: 8 ports detected [ 2.765823] usb 1-1.4: new high-speed USB device number 3 using ehci_hcd [ 2.981454] ata6: SATA link down (SStatus 0 SControl 300) [ 2.982597] Freeing unused kernel memory: 740k freed [ 2.983523] Write protecting the kernel text: 5816k [ 2.983808] Write protecting the kernel read-only data: 2376k [ 2.983811] NX-protecting the kernel data: 4424k [ 3.014594] udevd[127]: starting version 175 [ 3.068925] sdhci: Secure Digital Host Controller Interface driver [ 3.068932] sdhci: Copyright(c) Pierre Ossman [ 3.069714] sdhci-pci 0000:09:00.0: SDHCI controller found [1180:e822] (rev 1) [ 3.069742] sdhci-pci 0000:09:00.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16 [ 3.069786] sdhci-pci 0000:09:00.0: Will use DMA mode even though HW doesn't fully claim to support it. [ 3.069798] sdhci-pci 0000:09:00.0: setting latency timer to 64 [ 3.069816] mmc0: no vmmc regulator found [ 3.069877] Registered led device: mmc0:: [ 3.070946] mmc0: SDHCI controller on PCI [0000:09:00.0] using DMA [ 3.071078] tg3.c:v3.121 (November 2, 2011) [ 3.071252] tg3 0000:0b:00.0: PCI INT A -> GSI 17 (level, low) -> IRQ 17 [ 3.071269] tg3 0000:0b:00.0: setting latency timer to 64 [ 3.071403] firewire_ohci 0000:09:00.3: PCI INT D -> GSI 19 (level, low) -> IRQ 19 [ 3.071417] firewire_ohci 0000:09:00.3: setting latency timer to 64 [ 3.078509] EXT4-fs (sda1): INFO: recovery required on readonly filesystem [ 3.078517] EXT4-fs (sda1): write access will be enabled during recovery [ 3.110417] tg3 0000:0b:00.0: eth0: Tigon3 [partno(BCM95784M) rev 5784100] (PCI Express) MAC address b8:ac:6f:71:02:a6 [ 3.110425] tg3 0000:0b:00.0: eth0: attached PHY is 5784 (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[0]) [ 3.110431] tg3 0000:0b:00.0: eth0: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[0] TSOcap[1] [ 3.110436] tg3 0000:0b:00.0: eth0: dma_rwctrl[76180000] dma_mask[64-bit] [ 3.125492] firewire_ohci: Added fw-ohci device 0000:09:00.3, OHCI v1.10, 4 IR + 4 IT contexts, quirks 0x11 [ 3.390124] EXT4-fs (sda1): orphan cleanup on readonly fs [ 3.390135] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7078710 [ 3.390232] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 2363071 [ 3.390327] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7078711 [ 3.390350] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7078709 [ 3.390367] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7078708 [ 3.390384] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7078707 [ 3.390401] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7078706 [ 3.390417] EXT4-fs (sda1): ext4_orphan_cleanup: 
deleting unreferenced inode 7078705 [ 3.390435] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7078551 [ 3.390452] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7078523 [ 3.390470] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7078520 [ 3.390487] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 7077901 [ 3.390551] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 4063272 [ 3.390562] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 4063266 [ 3.390572] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 4063261 [ 3.390582] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 4063256 [ 3.390592] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 4063255 [ 3.390602] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 2363072 [ 3.390620] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 2360050 [ 3.390698] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 5250064 [ 3.390710] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 2365394 [ 3.390728] EXT4-fs (sda1): ext4_orphan_cleanup: deleting unreferenced inode 2365390 [ 3.390745] EXT4-fs (sda1): 22 orphan inodes deleted [ 3.390748] EXT4-fs (sda1): recovery complete [ 3.397636] EXT4-fs (sda1): mounted filesystem with ordered data mode. Opts: (null) [ 3.624910] firewire_core: created device fw0: GUID 464fc000110e2661, S400 [ 3.927467] ADDRCONF(NETDEV_UP): eth0: link is not ready [ 3.929965] udevd[400]: starting version 175 [ 3.933581] Adding 6278140k swap on /dev/sda5. Priority:-1 extents:1 across:6278140k SS [ 3.945183] lp: driver loaded but no devices found [ 3.999389] wmi: Mapper loaded [ 4.016696] ite_cir: Auto-detected model: ITE8708 CIR transceiver [ 4.016702] ite_cir: Using model: ITE8708 CIR transceiver [ 4.016706] ite_cir: TX-capable: 1 [ 4.016710] ite_cir: Sample period (ns): 8680 [ 4.016713] ite_cir: TX carrier frequency (Hz): 38000 [ 4.016716] ite_cir: TX duty cycle (%): 33 [ 4.016719] ite_cir: RX low carrier frequency (Hz): 0 [ 4.016722] ite_cir: RX high carrier frequency (Hz): 0 [ 4.025684] fglrx: module license 'Proprietary. (C) 2002 - ATI Technologies, Starnberg, GERMANY' taints kernel. [ 4.025691] Disabling lock debugging due to kernel taint [ 4.027410] IR NEC protocol handler initialized [ 4.030250] lib80211: common routines for IEEE802.11 drivers [ 4.030257] lib80211_crypt: registered algorithm 'NULL' [ 4.036024] IR RC5(x) protocol handler initialized [ 4.036092] snd_hda_intel 0000:00:1b.0: PCI INT A -> GSI 22 (level, low) -> IRQ 22 [ 4.036188] snd_hda_intel 0000:00:1b.0: irq 46 for MSI/MSI-X [ 4.036307] snd_hda_intel 0000:00:1b.0: setting latency timer to 64 [ 4.036361] [Firmware Bug]: ACPI: No _BQC method, cannot determine initial brightness [ 4.039006] acpi device:03: registered as cooling_device10 [ 4.039164] input: Video Bus as /devices/LNXSYSTM:00/device:00/PNP0A08:00/device:01/LNXVIDEO:00/input/input5 [ 4.039261] ACPI: Video Device [M86] (multi-head: yes rom: no post: no) [ 4.049753] EXT4-fs (sda1): re-mounted. 
Opts: errors=remount-ro [ 4.050201] wl 0000:05:00.0: PCI INT A -> GSI 17 (level, low) -> IRQ 17 [ 4.050215] wl 0000:05:00.0: setting latency timer to 64 [ 4.052252] Registered IR keymap rc-rc6-mce [ 4.052432] input: ITE8708 CIR transceiver as /devices/virtual/rc/rc0/input6 [ 4.054614] IR RC6 protocol handler initialized [ 4.054787] rc0: ITE8708 CIR transceiver as /devices/virtual/rc/rc0 [ 4.054839] ite_cir: driver has been successfully loaded [ 4.057338] IR JVC protocol handler initialized [ 4.061553] IR Sony protocol handler initialized [ 4.066578] input: MCE IR Keyboard/Mouse (ite-cir) as /devices/virtual/input/input7 [ 4.066724] IR MCE Keyboard/mouse protocol handler initialized [ 4.072580] lirc_dev: IR Remote Control driver registered, major 250 [ 4.073280] rc rc0: lirc_dev: driver ir-lirc-codec (ite-cir) registered at minor = 0 [ 4.073286] IR LIRC bridge handler initialized [ 4.077849] Linux video capture interface: v2.00 [ 4.079402] uvcvideo: Found UVC 1.00 device Laptop_Integrated_Webcam_2M (0c45:640f) [ 4.085492] EDAC MC: Ver: 2.1.0 [ 4.087138] lib80211_crypt: registered algorithm 'TKIP' [ 4.091027] input: HDA Intel Mic as /devices/pci0000:00/0000:00:1b.0/sound/card0/input8 [ 4.091733] snd_hda_intel 0000:02:00.1: PCI INT B -> GSI 17 (level, low) -> IRQ 17 [ 4.091826] snd_hda_intel 0000:02:00.1: irq 47 for MSI/MSI-X [ 4.091861] snd_hda_intel 0000:02:00.1: setting latency timer to 64 [ 4.093115] EDAC i7core: Device not found: dev 00.0 PCI ID 8086:2c50 [ 4.112448] HDMI status: Codec=0 Pin=3 Presence_Detect=0 ELD_Valid=0 [ 4.112612] input: HD-Audio Generic HDMI/DP,pcm=3 as /devices/pci0000:00/0000:00:03.0/0000:02:00.1/sound/card1/input9 [ 4.113311] type=1400 audit(1342032325.540:2): apparmor="STATUS" operation="profile_load" name="/sbin/dhclient" pid=658 comm="apparmor_parser" [ 4.114501] type=1400 audit(1342032325.540:3): apparmor="STATUS" operation="profile_load" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=658 comm="apparmor_parser" [ 4.115253] type=1400 audit(1342032325.540:4): apparmor="STATUS" operation="profile_load" name="/usr/lib/connman/scripts/dhclient-script" pid=658 comm="apparmor_parser" [ 4.121870] input: Laptop_Integrated_Webcam_2M as /devices/pci0000:00/0000:00:1a.0/usb1/1-1/1-1.4/1-1.4:1.0/input/input10 [ 4.122096] usbcore: registered new interface driver uvcvideo [ 4.122100] USB Video Class driver (1.1.1) [ 4.128729] [fglrx] Maximum main memory to use for locked dma buffers: 5840 MBytes. 
[ 4.129678] [fglrx] vendor: 1002 device: 68c0 count: 1 [ 4.131991] [fglrx] ioport: bar 4, base 0x2000, size: 0x100 [ 4.132015] pci 0000:02:00.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16 [ 4.132024] pci 0000:02:00.0: setting latency timer to 64 [ 4.133712] [fglrx] Kernel PAT support is enabled [ 4.133747] [fglrx] module loaded - fglrx 8.96.4 [Mar 12 2012] with 1 minors [ 4.162666] eth1: Broadcom BCM4727 802.11 Hybrid Wireless Controller 5.100.82.38 [ 4.184133] device-mapper: multipath: version 1.3.0 loaded [ 4.196660] dcdbas dcdbas: Dell Systems Management Base Driver (version 5.6.0-3.2) [ 4.279897] input: Dell WMI hotkeys as /devices/virtual/input/input11 [ 4.292402] Bluetooth: Core ver 2.16 [ 4.292449] NET: Registered protocol family 31 [ 4.292454] Bluetooth: HCI device and connection manager initialized [ 4.292459] Bluetooth: HCI socket layer initialized [ 4.292463] Bluetooth: L2CAP socket layer initialized [ 4.292473] Bluetooth: SCO socket layer initialized [ 4.296333] Bluetooth: RFCOMM TTY layer initialized [ 4.296342] Bluetooth: RFCOMM socket layer initialized [ 4.296345] Bluetooth: RFCOMM ver 1.11 [ 4.313586] ppdev: user-space parallel port driver [ 4.316619] Bluetooth: BNEP (Ethernet Emulation) ver 1.3 [ 4.316625] Bluetooth: BNEP filters: protocol multicast [ 4.383980] type=1400 audit(1342032325.812:5): apparmor="STATUS" operation="profile_load" name="/usr/lib/cups/backend/cups-pdf" pid=938 comm="apparmor_parser" [ 4.385173] type=1400 audit(1342032325.812:6): apparmor="STATUS" operation="profile_load" name="/usr/sbin/cupsd" pid=938 comm="apparmor_parser" [ 4.425757] init: failsafe main process (898) killed by TERM signal [ 4.477052] type=1400 audit(1342032325.904:7): apparmor="STATUS" operation="profile_replace" name="/sbin/dhclient" pid=1011 comm="apparmor_parser" [ 4.477592] type=1400 audit(1342032325.904:8): apparmor="STATUS" operation="profile_load" name="/usr/lib/lightdm/lightdm/lightdm-guest-session-wrapper" pid=1010 comm="apparmor_parser" [ 4.478099] type=1400 audit(1342032325.904:9): apparmor="STATUS" operation="profile_load" name="/usr/sbin/tcpdump" pid=1017 comm="apparmor_parser" [ 4.479233] type=1400 audit(1342032325.904:10): apparmor="STATUS" operation="profile_load" name="/usr/lib/telepathy/mission-control-5" pid=1014 comm="apparmor_parser" [ 4.510060] vesafb: mode is 1152x864x32, linelength=4608, pages=0 [ 4.510065] vesafb: scrolling: redraw [ 4.510071] vesafb: Truecolor: size=0:8:8:8, shift=0:16:8:0 [ 4.510084] mtrr: no more MTRRs available [ 4.513081] vesafb: framebuffer at 0xd0000000, mapped to 0xf9400000, using 3904k, total 3904k [ 4.515203] Console: switching to colour frame buffer device 144x54 [ 4.515278] fb0: VESA VGA frame buffer device [ 4.590743] tg3 0000:0b:00.0: irq 48 for MSI/MSI-X [ 4.702009] ADDRCONF(NETDEV_UP): eth0: link is not ready [ 4.704409] ADDRCONF(NETDEV_UP): eth0: link is not ready [ 4.978379] psmouse serio1: synaptics: Touchpad model: 1, fw: 7.2, id: 0x1c0b1, caps: 0xd04733/0xa40000/0xa0000 [ 5.030104] input: SynPS/2 Synaptics TouchPad as /devices/platform/i8042/serio1/input/input12 [ 5.045782] kvm: VM_EXIT_LOAD_IA32_PERF_GLOBAL_CTRL does not work properly. 
Using workaround [ 5.519573] [fglrx] ATIF platform detected with notification ID: 0x81 [ 6.391466] fglrx_pci 0000:02:00.0: irq 49 for MSI/MSI-X [ 6.393137] [fglrx] Firegl kernel thread PID: 1305 [ 6.393306] [fglrx] Firegl kernel thread PID: 1306 [ 6.393472] [fglrx] Firegl kernel thread PID: 1307 [ 6.393726] [fglrx] IRQ 49 Enabled [ 6.528052] postgres (1308): /proc/1308/oom_adj is deprecated, please use /proc/1308/oom_score_adj instead. [ 6.532080] [fglrx] Gart USWC size:1280 M. [ 6.532084] [fglrx] Gart cacheable size:508 M. [ 6.532091] [fglrx] Reserved FB block: Shared offset:0, size:1000000 [ 6.532094] [fglrx] Reserved FB block: Unshared offset:f8fd000, size:403000 [ 6.532098] [fglrx] Reserved FB block: Unshared offset:3fff4000, size:c000 [ 17.423743] eth1: no IPv6 routers present [ 75.836426] warning: `proftpd' uses 32-bit capabilities (legacy support in use) [ 75.884215] init: plymouth-stop pre-start process (2922) terminated with status 1 [ 543.679614] eth1: no IPv6 routers present lsmod Module Size Used by kvm_intel 127560 0 kvm 359456 1 kvm_intel joydev 17393 0 vesafb 13516 1 parport_pc 32114 0 bnep 17830 2 ppdev 12849 0 rfcomm 38139 0 bluetooth 158438 10 bnep,rfcomm dell_wmi 12601 0 sparse_keymap 13658 1 dell_wmi binfmt_misc 17292 1 dell_laptop 17767 0 dcdbas 14098 1 dell_laptop dm_multipath 22710 0 fglrx 2909855 143 snd_hda_codec_hdmi 31775 1 psmouse 72919 0 serio_raw 13027 0 i7core_edac 23382 0 lib80211_crypt_tkip 17275 0 edac_core 46858 1 i7core_edac uvcvideo 67203 0 snd_hda_codec_idt 60251 1 videodev 86588 1 uvcvideo ir_lirc_codec 12739 0 lirc_dev 18700 1 ir_lirc_codec ir_mce_kbd_decoder 12681 0 snd_seq_midi 13132 0 ir_sony_decoder 12462 0 ir_jvc_decoder 12459 0 snd_rawmidi 25424 1 snd_seq_midi ir_rc6_decoder 12459 0 wl 2646601 0 snd_seq_midi_event 14475 1 snd_seq_midi snd_seq 51567 2 snd_seq_midi,snd_seq_midi_event ir_rc5_decoder 12459 0 video 19068 0 snd_hda_intel 32765 5 snd_seq_device 14172 3 snd_seq_midi,snd_rawmidi,snd_seq snd_hda_codec 109562 3 snd_hda_codec_hdmi,snd_hda_codec_idt,snd_hda_intel rc_rc6_mce 12454 0 lib80211 14040 2 lib80211_crypt_tkip,wl snd_hwdep 13276 1 snd_hda_codec ir_nec_decoder 12459 0 snd_pcm 80845 3 snd_hda_codec_hdmi,snd_hda_intel,snd_hda_codec ite_cir 24743 0 rc_core 21263 10 ir_lirc_codec,ir_mce_kbd_decoder,ir_sony_decoder,ir_jvc_decoder,ir_rc6_decoder,ir_rc5_decoder,rc_rc6_mce,ir_nec_decoder,ite_cir snd_timer 28931 2 snd_seq,snd_pcm wmi 18744 1 dell_wmi snd 62064 20 snd_hda_codec_hdmi,snd_hda_codec_idt,snd_rawmidi,snd_seq,snd_seq_device,snd_hda_intel,snd_hda_codec,snd_hwdep,snd_pcm,snd_timer mac_hid 13077 0 soundcore 14635 1 snd snd_page_alloc 14108 2 snd_hda_intel,snd_pcm coretemp 13269 0 lp 17455 0 parport 40930 3 parport_pc,ppdev,lp tg3 141369 0 firewire_ohci 40172 0 sdhci_pci 18324 0 firewire_core 56906 1 firewire_ohci sdhci 28241 1 sdhci_pci crc_itu_t 12627 1 firewire_core lshw *-network description: Wireless interface product: BCM4313 802.11b/g/n Wireless LAN Controller vendor: Broadcom Corporation physical id: 0 bus info: pci@0000:05:00.0 logical name: eth1 version: 01 serial: 70:f1:a1:a9:54:31 width: 64 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=wl0 driverversion=5.100.82.38 ip=192.168.0.117 latency=0 multicast=yes wireless=IEEE 802.11 resources: irq:17 memory:f0900000-f0903fff *-network description: Ethernet interface product: NetLink BCM5784M Gigabit Ethernet PCIe vendor: Broadcom Corporation physical id: 0 bus info: pci@0000:0b:00.0 logical name: 
eth0 version: 10 serial: b8:ac:6f:71:02:a6 capacity: 1Gbit/s width: 64 bits clock: 33MHz capabilities: pm vpd msi pciexpress bus_master cap_list ethernet physical tp 10bt 10bt-fd 100bt 100bt-fd 1000bt 1000bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=tg3 driverversion=3.121 firmware=sb v2.19 latency=0 link=no multicast=yes port=twisted pair resources: irq:48 memory:f0d00000-f0d0ffff
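A commonly suggested workaround (not an official power-menu option): switch power management off with iwconfig and keep it off via a pm-utils hook. This is only a sketch; the interface name eth1 is taken from the logs above, the driver has to support power management control (the Broadcom wl driver may ignore it), and the hook path and file name are a convention used in various guides rather than a required setting.

    # check whether power management is currently enabled for the wireless interface
    iwconfig eth1

    # turn power management off for the current session (requires driver support)
    sudo iwconfig eth1 power off

    # one way to keep it off across power-state changes: a pm-utils hook script
    printf '#!/bin/sh\n/sbin/iwconfig eth1 power off\n' | sudo tee /etc/pm/power.d/wireless_power_off > /dev/null
    sudo chmod +x /etc/pm/power.d/wireless_power_off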

    Read the article

  • Virtual host is not working in Ubuntu 14 VPS using XAMPP 1.8.3

    - by viral4ever
    I am using XAMPP as server in ubuntu 14.04 VPS of digitalocean. I tried to setup virtual hosts. But it is not working and I am getting 403 error of access denied. I changed files too. My files with changes are /opt/lampp/etc/httpd.conf # # This is the main Apache HTTP server configuration file. It contains the # configuration directives that give the server its instructions. # See <URL:http://httpd.apache.org/docs/trunk/> for detailed information. # In particular, see # <URL:http://httpd.apache.org/docs/trunk/mod/directives.html> # for a discussion of each configuration directive. # # Do NOT simply read the instructions in here without understanding # what they do. They're here only as hints or reminders. If you are unsure # consult the online docs. You have been warned. # # Configuration and logfile names: If the filenames you specify for many # of the server's control files begin with "/" (or "drive:/" for Win32), the # server will use that explicit path. If the filenames do *not* begin # with "/", the value of ServerRoot is prepended -- so 'log/access_log' # with ServerRoot set to '/www' will be interpreted by the # server as '/www/log/access_log', where as '/log/access_log' will be # interpreted as '/log/access_log'. # # ServerRoot: The top of the directory tree under which the server's # configuration, error, and log files are kept. # # Do not add a slash at the end of the directory path. If you point # ServerRoot at a non-local disk, be sure to specify a local disk on the # Mutex directive, if file-based mutexes are used. If you wish to share the # same ServerRoot for multiple httpd daemons, you will need to change at # least PidFile. # ServerRoot "/opt/lampp" # # Mutex: Allows you to set the mutex mechanism and mutex file directory # for individual mutexes, or change the global defaults # # Uncomment and change the directory if mutexes are file-based and the default # mutex file directory is not on a local disk or is not appropriate for some # other reason. # # Mutex default:logs # # Listen: Allows you to bind Apache to specific IP addresses and/or # ports, instead of the default. See also the <VirtualHost> # directive. # # Change this to Listen on specific IP addresses as shown below to # prevent Apache from glomming onto all bound IP addresses. # #Listen 12.34.56.78:80 Listen 80 # # Dynamic Shared Object (DSO) Support # # To be able to use the functionality of a module which was built as a DSO you # have to place corresponding `LoadModule' lines at this location so the # directives contained in it are actually available _before_ they are used. # Statically compiled modules (those listed by `httpd -l') do not need # to be loaded here. 
# # Example: # LoadModule foo_module modules/mod_foo.so # LoadModule authn_file_module modules/mod_authn_file.so LoadModule authn_dbm_module modules/mod_authn_dbm.so LoadModule authn_anon_module modules/mod_authn_anon.so LoadModule authn_dbd_module modules/mod_authn_dbd.so LoadModule authn_socache_module modules/mod_authn_socache.so LoadModule authn_core_module modules/mod_authn_core.so LoadModule authz_host_module modules/mod_authz_host.so LoadModule authz_groupfile_module modules/mod_authz_groupfile.so LoadModule authz_user_module modules/mod_authz_user.so LoadModule authz_dbm_module modules/mod_authz_dbm.so LoadModule authz_owner_module modules/mod_authz_owner.so LoadModule authz_dbd_module modules/mod_authz_dbd.so LoadModule authz_core_module modules/mod_authz_core.so LoadModule authnz_ldap_module modules/mod_authnz_ldap.so LoadModule access_compat_module modules/mod_access_compat.so LoadModule auth_basic_module modules/mod_auth_basic.so LoadModule auth_form_module modules/mod_auth_form.so LoadModule auth_digest_module modules/mod_auth_digest.so LoadModule allowmethods_module modules/mod_allowmethods.so LoadModule file_cache_module modules/mod_file_cache.so LoadModule cache_module modules/mod_cache.so LoadModule cache_disk_module modules/mod_cache_disk.so LoadModule socache_shmcb_module modules/mod_socache_shmcb.so LoadModule socache_dbm_module modules/mod_socache_dbm.so LoadModule socache_memcache_module modules/mod_socache_memcache.so LoadModule dbd_module modules/mod_dbd.so LoadModule bucketeer_module modules/mod_bucketeer.so LoadModule dumpio_module modules/mod_dumpio.so LoadModule echo_module modules/mod_echo.so LoadModule case_filter_module modules/mod_case_filter.so LoadModule case_filter_in_module modules/mod_case_filter_in.so LoadModule buffer_module modules/mod_buffer.so LoadModule ratelimit_module modules/mod_ratelimit.so LoadModule reqtimeout_module modules/mod_reqtimeout.so LoadModule ext_filter_module modules/mod_ext_filter.so LoadModule request_module modules/mod_request.so LoadModule include_module modules/mod_include.so LoadModule filter_module modules/mod_filter.so LoadModule substitute_module modules/mod_substitute.so LoadModule sed_module modules/mod_sed.so LoadModule charset_lite_module modules/mod_charset_lite.so LoadModule deflate_module modules/mod_deflate.so LoadModule mime_module modules/mod_mime.so LoadModule ldap_module modules/mod_ldap.so LoadModule log_config_module modules/mod_log_config.so LoadModule log_debug_module modules/mod_log_debug.so LoadModule logio_module modules/mod_logio.so LoadModule env_module modules/mod_env.so LoadModule mime_magic_module modules/mod_mime_magic.so LoadModule cern_meta_module modules/mod_cern_meta.so LoadModule expires_module modules/mod_expires.so LoadModule headers_module modules/mod_headers.so LoadModule usertrack_module modules/mod_usertrack.so LoadModule unique_id_module modules/mod_unique_id.so LoadModule setenvif_module modules/mod_setenvif.so LoadModule version_module modules/mod_version.so LoadModule remoteip_module modules/mod_remoteip.so LoadModule proxy_module modules/mod_proxy.so LoadModule proxy_connect_module modules/mod_proxy_connect.so LoadModule proxy_ftp_module modules/mod_proxy_ftp.so LoadModule proxy_http_module modules/mod_proxy_http.so LoadModule proxy_fcgi_module modules/mod_proxy_fcgi.so LoadModule proxy_scgi_module modules/mod_proxy_scgi.so LoadModule proxy_ajp_module modules/mod_proxy_ajp.so LoadModule proxy_balancer_module modules/mod_proxy_balancer.so LoadModule proxy_express_module 
modules/mod_proxy_express.so LoadModule session_module modules/mod_session.so LoadModule session_cookie_module modules/mod_session_cookie.so LoadModule session_dbd_module modules/mod_session_dbd.so LoadModule slotmem_shm_module modules/mod_slotmem_shm.so LoadModule ssl_module modules/mod_ssl.so LoadModule lbmethod_byrequests_module modules/mod_lbmethod_byrequests.so LoadModule lbmethod_bytraffic_module modules/mod_lbmethod_bytraffic.so LoadModule lbmethod_bybusyness_module modules/mod_lbmethod_bybusyness.so LoadModule lbmethod_heartbeat_module modules/mod_lbmethod_heartbeat.so LoadModule unixd_module modules/mod_unixd.so LoadModule dav_module modules/mod_dav.so LoadModule status_module modules/mod_status.so LoadModule autoindex_module modules/mod_autoindex.so LoadModule info_module modules/mod_info.so LoadModule suexec_module modules/mod_suexec.so LoadModule cgi_module modules/mod_cgi.so LoadModule cgid_module modules/mod_cgid.so LoadModule dav_fs_module modules/mod_dav_fs.so LoadModule vhost_alias_module modules/mod_vhost_alias.so LoadModule negotiation_module modules/mod_negotiation.so LoadModule dir_module modules/mod_dir.so LoadModule actions_module modules/mod_actions.so LoadModule speling_module modules/mod_speling.so LoadModule userdir_module modules/mod_userdir.so LoadModule alias_module modules/mod_alias.so LoadModule rewrite_module modules/mod_rewrite.so <IfDefine JUSTTOMAKEAPXSHAPPY> LoadModule php4_module modules/libphp4.so LoadModule php5_module modules/libphp5.so </IfDefine> <IfModule unixd_module> # # If you wish httpd to run as a different user or group, you must run # httpd as root initially and it will switch. # # User/Group: The name (or #number) of the user/group to run httpd as. # It is usually good practice to create a dedicated user and group for # running httpd, as with most system services. # User root Group www </IfModule> # 'Main' server configuration # # The directives in this section set up the values used by the 'main' # server, which responds to any requests that aren't handled by a # <VirtualHost> definition. These values also provide defaults for # any <VirtualHost> containers you may define later in the file. # # All of these directives may appear inside <VirtualHost> containers, # in which case these default settings will be overridden for the # virtual host being defined. # # # ServerAdmin: Your address, where problems with the server should be # e-mailed. This address appears on some server-generated pages, such # as error documents. e.g. [email protected] # ServerAdmin [email protected] # # ServerName gives the name and port that the server uses to identify itself. # This can often be determined automatically, but we recommend you specify # it explicitly to prevent problems during startup. # # If your host doesn't have a registered DNS name, enter its IP address here. # #ServerName www.example.com:@@Port@@ # XAMPP ServerName localhost # # Deny access to the entirety of your server's filesystem. You must # explicitly permit access to web content directories in other # <Directory> blocks below. # <Directory /> AllowOverride none Require all denied </Directory> # # Note that from this point forward you must specifically allow # particular features to be enabled - so if something's not working as # you might expect, make sure that you have specifically enabled it # below. # # # DocumentRoot: The directory out of which you will serve your # documents. 
By default, all requests are taken from this directory, but # symbolic links and aliases may be used to point to other locations. # DocumentRoot "/opt/lampp/htdocs" <Directory "/opt/lampp/htdocs"> # # Possible values for the Options directive are "None", "All", # or any combination of: # Indexes Includes FollowSymLinks SymLinksifOwnerMatch ExecCGI MultiViews # # Note that "MultiViews" must be named *explicitly* --- "Options All" # doesn't give it to you. # # The Options directive is both complicated and important. Please see # http://httpd.apache.org/docs/trunk/mod/core.html#options # for more information. # #Options Indexes FollowSymLinks # XAMPP Options Indexes FollowSymLinks ExecCGI Includes # # AllowOverride controls what directives may be placed in .htaccess files. # It can be "All", "None", or any combination of the keywords: # Options FileInfo AuthConfig Limit # #AllowOverride None # since XAMPP 1.4: AllowOverride All # # Controls who can get stuff from this server. # Require all granted </Directory> # # DirectoryIndex: sets the file that Apache will serve if a directory # is requested. # <IfModule dir_module> #DirectoryIndex index.html # XAMPP DirectoryIndex index.html index.html.var index.php index.php3 index.php4 </IfModule> # # The following lines prevent .htaccess and .htpasswd files from being # viewed by Web clients. # <Files ".ht*"> Require all denied </Files> # # ErrorLog: The location of the error log file. # If you do not specify an ErrorLog directive within a <VirtualHost> # container, error messages relating to that virtual host will be # logged here. If you *do* define an error logfile for a <VirtualHost> # container, that host's errors will be logged there and not here. # ErrorLog "logs/error_log" # # LogLevel: Control the number of messages logged to the error_log. # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. # LogLevel warn <IfModule log_config_module> # # The following directives define some format nicknames for use with # a CustomLog directive (see below). # LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined LogFormat "%h %l %u %t \"%r\" %>s %b" common <IfModule logio_module> # You need to enable mod_logio.c to use %I and %O LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %I %O" combinedio </IfModule> # # The location and format of the access logfile (Common Logfile Format). # If you do not define any access logfiles within a <VirtualHost> # container, they will be logged here. Contrariwise, if you *do* # define per-<VirtualHost> access logfiles, transactions will be # logged therein and *not* in this file. # CustomLog "logs/access_log" common # # If you prefer a logfile with access, agent, and referer information # (Combined Logfile Format) you can use the following directive. # #CustomLog "logs/access_log" combined </IfModule> <IfModule alias_module> # # Redirect: Allows you to tell clients about documents that used to # exist in your server's namespace, but do not anymore. The client # will make a new request for the document at its new location. # Example: # Redirect permanent /foo http://www.example.com/bar # # Alias: Maps web paths into filesystem paths and is used to # access content that does not live under the DocumentRoot. # Example: # Alias /webpath /full/filesystem/path # # If you include a trailing / on /webpath then the server will # require it to be present in the URL. 
You will also likely # need to provide a <Directory> section to allow access to # the filesystem path. # # ScriptAlias: This controls which directories contain server scripts. # ScriptAliases are essentially the same as Aliases, except that # documents in the target directory are treated as applications and # run by the server when requested rather than as documents sent to the # client. The same rules about trailing "/" apply to ScriptAlias # directives as to Alias. # ScriptAlias /cgi-bin/ "/opt/lampp/cgi-bin/" </IfModule> <IfModule cgid_module> # # ScriptSock: On threaded servers, designate the path to the UNIX # socket used to communicate with the CGI daemon of mod_cgid. # #Scriptsock logs/cgisock </IfModule> # # "/opt/lampp/cgi-bin" should be changed to whatever your ScriptAliased # CGI directory exists, if you have that configured. # <Directory "/opt/lampp/cgi-bin"> AllowOverride None Options None Require all granted </Directory> <IfModule mime_module> # # TypesConfig points to the file containing the list of mappings from # filename extension to MIME-type. # TypesConfig etc/mime.types # # AddType allows you to add to or override the MIME configuration # file specified in TypesConfig for specific file types. # #AddType application/x-gzip .tgz # # AddEncoding allows you to have certain browsers uncompress # information on the fly. Note: Not all browsers support this. # #AddEncoding x-compress .Z #AddEncoding x-gzip .gz .tgz # # If the AddEncoding directives above are commented-out, then you # probably should define those extensions to indicate media types: # AddType application/x-compress .Z AddType application/x-gzip .gz .tgz # # AddHandler allows you to map certain file extensions to "handlers": # actions unrelated to filetype. These can be either built into the server # or added with the Action directive (see below) # # To use CGI scripts outside of ScriptAliased directories: # (You will also need to add "ExecCGI" to the "Options" directive.) # #AddHandler cgi-script .cgi # XAMPP, since LAMPP 0.9.8: AddHandler cgi-script .cgi .pl # For type maps (negotiated resources): #AddHandler type-map var # # Filters allow you to process content before it is sent to the client. # # To parse .shtml files for server-side includes (SSI): # (You will also need to add "Includes" to the "Options" directive.) # # XAMPP AddType text/html .shtml AddOutputFilter INCLUDES .shtml </IfModule> # # The mod_mime_magic module allows the server to use various hints from the # contents of the file itself to determine its type. The MIMEMagicFile # directive tells the module where the hint definitions are located. # #MIMEMagicFile etc/magic # # Customizable error responses come in three flavors: # 1) plain text 2) local redirects 3) external redirects # # Some examples: #ErrorDocument 500 "The server made a boo boo." #ErrorDocument 404 /missing.html #ErrorDocument 404 "/cgi-bin/missing_handler.pl" #ErrorDocument 402 http://www.example.com/subscription_info.html # # # MaxRanges: Maximum number of Ranges in a request before # returning the entire resource, or one of the special # values 'default', 'none' or 'unlimited'. # Default setting is to accept 200 Ranges. #MaxRanges unlimited # # EnableMMAP and EnableSendfile: On systems that support it, # memory-mapping or the sendfile syscall may be used to deliver # files. This usually improves server performance, but must # be turned off when serving from networked-mounted # filesystems or if support for these functions is otherwise # broken on your system. 
# Defaults: EnableMMAP On, EnableSendfile Off # EnableMMAP off EnableSendfile off # Supplemental configuration # # The configuration files in the etc/extra/ directory can be # included to add extra features or to modify the default configuration of # the server, or you may simply copy their contents here and change as # necessary. # Server-pool management (MPM specific) #Include etc/extra/httpd-mpm.conf # Multi-language error messages Include etc/extra/httpd-multilang-errordoc.conf # Fancy directory listings Include etc/extra/httpd-autoindex.conf # Language settings #Include etc/extra/httpd-languages.conf # User home directories #Include etc/extra/httpd-userdir.conf # Real-time info on requests and configuration #Include etc/extra/httpd-info.conf # Virtual hosts Include etc/extra/httpd-vhosts.conf # Local access to the Apache HTTP Server Manual #Include etc/extra/httpd-manual.conf # Distributed authoring and versioning (WebDAV) #Include etc/extra/httpd-dav.conf # Various default settings Include etc/extra/httpd-default.conf # Configure mod_proxy_html to understand HTML4/XHTML1 <IfModule proxy_html_module> Include etc/extra/proxy-html.conf </IfModule> # Secure (SSL/TLS) connections <IfModule ssl_module> # XAMPP <IfDefine SSL> Include etc/extra/httpd-ssl.conf </IfDefine> </IfModule> # # Note: The following must must be present to support # starting without SSL on platforms with no /dev/random equivalent # but a statically compiled-in mod_ssl. # <IfModule ssl_module> SSLRandomSeed startup builtin SSLRandomSeed connect builtin </IfModule> # XAMPP Include etc/extra/httpd-xampp.conf Include "/opt/lampp/apache2/conf/httpd.conf" I used command shown in this example. I used below lines to change and add group Add group "groupadd www" Add user to group "usermod -aG www root" Change htdocs group "chgrp -R www /opt/lampp/htdocs" Change sitedir group "chgrp -R www /opt/lampp/htdocs/mysite" Change htdocs chmod "chmod 2775 /opt/lampp/htdocs" Change sitedir chmod "chmod 2775 /opt/lampp/htdocs/mysite" And then I changed my vhosts.conf file # Virtual Hosts # # Required modules: mod_log_config # If you want to maintain multiple domains/hostnames on your # machine you can setup VirtualHost containers for them. Most configurations # use only name-based virtual hosts so the server doesn't need to worry about # IP addresses. This is indicated by the asterisks in the directives below. # # Please see the documentation at # <URL:http://httpd.apache.org/docs/2.4/vhosts/> # for further details before you try to setup virtual hosts. # # You may use the command line option '-S' to verify your virtual host # configuration. # # VirtualHost example: # Almost any Apache directive may go into a VirtualHost container. # The first VirtualHost section is used for all requests that do not # match a ServerName or ServerAlias in any <VirtualHost> block. 
# <VirtualHost *:80> ServerAdmin [email protected] DocumentRoot "/opt/lampp/docs/dummy-host.example.com" ServerName dummy-host.example.com ServerAlias www.dummy-host.example.com ErrorLog "logs/dummy-host.example.com-error_log" CustomLog "logs/dummy-host.example.com-access_log" common </VirtualHost> <VirtualHost *:80> ServerAdmin [email protected] DocumentRoot "/opt/lampp/docs/dummy-host2.example.com" ServerName dummy-host2.example.com ErrorLog "logs/dummy-host2.example.com-error_log" CustomLog "logs/dummy-host2.example.com-access_log" common </VirtualHost> NameVirtualHost * <VirtualHost *> ServerAdmin [email protected] DocumentRoot "/opt/lampp/htdocs/mysite" ServerName mysite.com ServerAlias mysite.com ErrorLog "/opt/lampp/htdocs/mysite/errorlogs" CustomLog "/opt/lampp/htdocs/mysite/customlog" common <Directory "/opt/lampp/htdocs/mysite"> Options Indexes FollowSymLinks Includes ExecCGI AllowOverride All Order Allow,Deny Allow from all Require all granted </Directory> </VirtualHost> but still its not working and I am getting 403 error on my ip and domain however I can access phpmyadmin. If anyone can help me, please help me.
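For comparison, a minimal name-based virtual host in Apache 2.4 syntax for this layout might look like the sketch below. This is only an illustration: in Apache 2.4 the NameVirtualHost directive and the old Order/Allow directives are deprecated, and mixing them with Require directives in the same context can be a source of confusing 403s; the log paths shown are placeholders.

    <VirtualHost *:80>
        ServerName mysite.com
        ServerAlias www.mysite.com
        DocumentRoot "/opt/lampp/htdocs/mysite"
        ErrorLog "logs/mysite-error_log"
        CustomLog "logs/mysite-access_log" common
        <Directory "/opt/lampp/htdocs/mysite">
            Options Indexes FollowSymLinks Includes ExecCGI
            AllowOverride All
            Require all granted
        </Directory>
    </VirtualHost>

If the 403 persists, it is also worth verifying that the user Apache runs as can traverse every parent directory of /opt/lampp/htdocs/mysite (execute permission on each directory) and read the files themselves.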

    Read the article

  • JMS Step 2 - Using the QueueSend.java Sample Program to Send a Message to a JMS Queue

    - by John-Brown.Evans
    This post is the second in a series of JMS articles which demonstrate how to use JMS queues in a SOA context. In the previous post JMS Step 1 - How to Create a Simple JMS Queue in Weblogic Server 11g I showed you how to create a JMS queue and its dependent objects in WebLogic Server. In this article, we will use a sample program to write a message to that queue. Please review the previous post if you have not created those objects yet, as they will be required later in this example. The previous post also includes useful background information and links to the Oracle documentation for additional research. The following post in this series will show how to read the message from the queue again.
1. Source code
The following Java code will be used to write a message to the JMS queue. It is based on a sample program provided with the WebLogic Server installation. The sample is not installed by default, but needs to be installed manually using the WebLogic Server Custom Installation option, together with many other useful samples. You can either copy-paste the following code into your editor, or install all the samples.
The knowledge base article in My Oracle Support: How To Install WebLogic Server and JMS Samples in WLS 10.3.x (Doc ID 1499719.1) describes how to install the samples.
QueueSend.java

package examples.jms.queue;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.Hashtable;
import javax.jms.*;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

/** This example shows how to establish a connection
 * and send messages to the JMS queue. The classes in this
 * package operate on the same JMS queue. Run the classes together to
 * witness messages being sent and received, and to browse the queue
 * for messages. The class is used to send messages to the queue.
 *
 * @author Copyright (c) 1999-2005 by BEA Systems, Inc. All Rights Reserved.
 */
public class QueueSend {

  // Defines the JNDI context factory.
  public final static String JNDI_FACTORY="weblogic.jndi.WLInitialContextFactory";

  // Defines the JMS context factory.
  public final static String JMS_FACTORY="jms/TestConnectionFactory";

  // Defines the queue.
  public final static String QUEUE="jms/TestJMSQueue";

  private QueueConnectionFactory qconFactory;
  private QueueConnection qcon;
  private QueueSession qsession;
  private QueueSender qsender;
  private Queue queue;
  private TextMessage msg;

  /**
   * Creates all the necessary objects for sending
   * messages to a JMS queue.
   *
   * @param ctx JNDI initial context
   * @param queueName name of queue
   * @exception NamingException if operation cannot be performed
   * @exception JMSException if JMS fails to initialize due to internal error
   */
  public void init(Context ctx, String queueName) throws NamingException, JMSException {
    qconFactory = (QueueConnectionFactory) ctx.lookup(JMS_FACTORY);
    qcon = qconFactory.createQueueConnection();
    qsession = qcon.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
    queue = (Queue) ctx.lookup(queueName);
    qsender = qsession.createSender(queue);
    msg = qsession.createTextMessage();
    qcon.start();
  }

  /**
   * Sends a message to a JMS queue.
   *
   * @param message message to be sent
   * @exception JMSException if JMS fails to send message due to internal error
   */
  public void send(String message) throws JMSException {
    msg.setText(message);
    qsender.send(msg);
  }

  /**
   * Closes JMS objects.
   * @exception JMSException if JMS fails to close objects due to internal error
   */
  public void close() throws JMSException {
    qsender.close();
    qsession.close();
    qcon.close();
  }

  /** main() method.
   *
   * @param args WebLogic Server URL
   * @exception Exception if operation fails
   */
  public static void main(String[] args) throws Exception {
    if (args.length != 1) {
      System.out.println("Usage: java examples.jms.queue.QueueSend WebLogicURL");
      return;
    }
    InitialContext ic = getInitialContext(args[0]);
    QueueSend qs = new QueueSend();
    qs.init(ic, QUEUE);
    readAndSend(qs);
    qs.close();
  }

  private static void readAndSend(QueueSend qs) throws IOException, JMSException {
    BufferedReader msgStream = new BufferedReader(new InputStreamReader(System.in));
    String line = null;
    boolean quitNow = false;
    do {
      System.out.print("Enter message (\"quit\" to quit): \n");
      line = msgStream.readLine();
      if (line != null && line.trim().length() != 0) {
        qs.send(line);
        System.out.println("JMS Message Sent: "+line+"\n");
        quitNow = line.equalsIgnoreCase("quit");
      }
    } while (!quitNow);
  }

  private static InitialContext getInitialContext(String url) throws NamingException {
    Hashtable env = new Hashtable();
    env.put(Context.INITIAL_CONTEXT_FACTORY, JNDI_FACTORY);
    env.put(Context.PROVIDER_URL, url);
    return new InitialContext(env);
  }
}

2. How to Use This Class
2.1 From the file system on UNIX/Linux
Log in to a machine with a WebLogic installation and create a directory to contain the source and code matching the package name, e.g. $HOME/examples/jms/queue. Copy the above QueueSend.java file to this directory. Set the CLASSPATH and environment to match the WebLogic server environment: go to $MIDDLEWARE_HOME/user_projects/domains/base_domain/bin and execute . ./setDomainEnv.sh
Collect the following information required to run the program:
The JNDI name of the JMS queue to use: in the WebLogic server console go to Services > Messaging > JMS Modules > (module name, e.g. TestJMSModule) > (JMS queue name, e.g. TestJMSQueue), select the queue and note its JNDI name, e.g. jms/TestJMSQueue
The JNDI name of a connection factory to connect to the queue: follow the same path as above to get the connection factory for the above queue, e.g. TestConnectionFactory, and its JNDI name, e.g. jms/TestConnectionFactory
The URL and port of the WebLogic server running the above queue: check the JMS server for the above queue and the managed server it is targeted to, for example soa_server1. Now find the port this managed server is listening on by looking at its entry under Environment > Servers in the WLS console, e.g. 8001. The URL for the server to be given to the QueueSend program in this example will therefore be t3://host.domain:8001, e.g. t3://jbevans-lx.de.oracle.com:8001
Edit QueueSend.java and enter the above connection factory and queue name respectively under
...
public final static String JMS_FACTORY="jms/TestConnectionFactory";
...
public final static String QUEUE="jms/TestJMSQueue";
...
Compile QueueSend.java using javac QueueSend.java. Go to the source's top-level directory and execute it using java examples.jms.queue.QueueSend t3://jbevans-lx.de.oracle.com:8001 (a condensed command sketch appears at the end of this post). This will prompt for a text input or "quit" to end. In the WLS console, go to the queue and select Monitoring to confirm that a new message was written to the queue.
2.2 From JDeveloper
Create a new application in JDeveloper, called, for example, JMSTests. When prompted for a project name, enter QueueSend and select Java as the technology. Default Package = examples.jms.queue (but you can enter anything here as you will overwrite it in the code later). Leave the other values at their defaults and press Finish. Create a new Java class called QueueSend and use the default values. This will create a file called QueueSend.java. Open QueueSend.java, if it is not already open, and replace all its contents with the QueueSend Java code listed above. Some lines might have warnings due to unfound objects. These are due to missing libraries in the JDeveloper project. Add the following libraries to the JDeveloper project: right-click the QueueSend project in the navigation menu and select Libraries and Classpath, then Add JAR/Directory. Go to the folder containing the JDeveloper installation and find/choose the file javax.jms_1.1.1.jar, e.g.
at D:\oracle\jdev11116\modules\javax.jms_1.1.1.jar. Do the same for the weblogic.jar file located, for example, in D:\oracle\jdev11116\wlserver_10.3\server\lib\weblogic.jar. Now you should be able to compile the project, for example by selecting the Make or Rebuild icons. If you try to execute the project, you will get a usage message, as it requires a parameter pointing to the WLS installation containing the JMS queue, for example t3://jbevans-lx.de.oracle.com:8001. You can automatically pass this parameter to the program from JDeveloper by editing the project's Run/Debug/Profile: select the project properties, select Run/Debug/Profile, edit the Default run configuration and add the connection parameter to the Program Arguments field. If you execute it again, you will see that it has passed the parameter to the start command. If you get a ClassNotFoundException for the class weblogic.jndi.WLInitialContextFactory, then check that the weblogic.jar file was correctly added to the project in one of the earlier steps above. Set the values of JMS_FACTORY and QUEUE the same way as described above in the description of how to use this from a Linux file system, i.e.
...
public final static String JMS_FACTORY="jms/TestConnectionFactory";
...
public final static String QUEUE="jms/TestJMSQueue";
...
You need to make one more change to the project. If you execute it now, it will prompt for the payload for the JMS message, but you won't be able to enter it by default in JDeveloper. You need to enable program input for the project first: select the project's properties, then Tool Settings, then check the Allow Program Input checkbox at the bottom and Save. Now when you execute the project, you will get a text entry field at the bottom into which you can enter the payload. You can enter multiple messages until you enter "quit", which will cause the program to stop. The article's screenshot at this point shows the TestJMSQueue's Monitoring page after a message was sent to the queue. This concludes the sample. In the following post I will show you how to read the message from the queue again.
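As referenced above, here is a condensed sketch of the command-line steps from section 2.1. It assumes the directory layout and server URL used in this post and that the current directory is picked up on the classpath; adjust $MIDDLEWARE_HOME and the t3 URL for your environment.

    # create a directory matching the package name and copy the source into it
    mkdir -p $HOME/examples/jms/queue
    cp QueueSend.java $HOME/examples/jms/queue/

    # set up the WebLogic environment (CLASSPATH etc.)
    cd $MIDDLEWARE_HOME/user_projects/domains/base_domain/bin
    . ./setDomainEnv.sh

    # compile, then run from the source's top-level directory
    cd $HOME/examples/jms/queue
    javac QueueSend.java
    cd $HOME
    java examples.jms.queue.QueueSend t3://jbevans-lx.de.oracle.com:8001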

    Read the article

  • 7u10: JavaFX packaging tools update

    - by igor
    The last few weeks were very busy here at Oracle. JavaOne 2012 is next week. Come to see us there! Meanwhile I'd like to quickly update you on recent developments in the area of packaging tools. This is an area of ongoing development for the team, and we are continuing to refine and improve both the tools and the process. Thanks to everyone who shared experiences and suggestions with us. We are listening and have fixed many of the reported issues. Please keep them coming as comments on the blog or (even better) file issues directly in the JIRA. In this post I'll focus on several new packaging features added in JDK 7 update 10:
Self-Contained Applications: Select Java Runtime to bundle
Self-Contained Applications: Create Package without Java Runtime
Self-Contained Applications: Package non-JavaFX application
Option to disable proxy setup in the JavaFX launcher
Ability to specify codebase for WebStart application
Option to update existing jar file
Self-Contained Applications: Specify application icon
Self-Contained Applications: Pass parameters on the command line
All these features and a number of other important bug fixes are available in the developer preview builds of JDK 7 update 10 (build 8 or later). Please give them a try and share your feedback!
Self-Contained Applications: Select Java Runtime to bundle
The packager tools in 7u6 assume the current JDK (based on the java.home property) is the source for the embedded runtime. This is a useful simplification for many scenarios, but there are cases where the ability to specify what to embed explicitly is handy. For example, an IDE may be using a fixed JDK to build the project, and this is not the version you want to bundle into your application. To make this more flexible, we now allow you to specify the location of the base JDK explicitly. It is optional, and if you do not specify it then the current JDK will be used (i.e. this change is fully backward compatible). A new 'basedir' attribute was added to the <fx:platform> tag. Its value is the location of the JDK to be used. It is ok to point to either the JRE inside the JDK or the JDK top-level folder. However, it must be a JDK and not a JRE, as we need other JDK tools for proper packaging, and it must be a recent version of a JDK that is bundled with JavaFX (i.e. Java 7 update 6 or later). Here are examples (<fx:platform> is part of the <fx:deploy> task):
<fx:platform basedir="${java.home}"/>
<fx:platform basedir="c:\tools\jdk7"/>
Hint: this feature enables you to use the packaging tools from JDK 7 update 10 (and benefit from bug fixes and other features described below) to create an application package with the bundled FCS version of JRE 7 update 6.
Self-Contained Applications: Create Package without Java Runtime
This may sound a bit puzzling at first glance. A package without an embedded Java Runtime is not really self-contained and obviously will not help with:
Deployment on fresh systems. The JRE needs to be installed separately (and this step will require admin permissions).
Possible compatibility issues due to updates of the system runtime.
However, these packages are much, much smaller in size. If download size matters and you are confident that users have the recommended system JRE installed, then this may be a good option to consider if you want to improve the user experience for install and launch. Technically, this is implemented as an extension of the previous feature. Pass an empty string as the value for the 'basedir' attribute and this will be treated as a request to not bundle the Java runtime, e.g.
<fx:platform basedir=""/> Self-Contained Applications: Package non-JavaFX application One of popular questions people ask about self-contained applications - can i package my Java application as self-contained application? Absolutely. This is true even for tools shipped with JDK 7 update 6. Simply follow steps for creating package for Swing application with integrated JavaFX content and they will work even if your application does not use JavaFX. What's wrong with it? Well, there are few caveats: bundle size is larger because JavaFX is bundled whilst it is not really needed main application jar needs to be packaged to comply to JavaFX packaging requirements(and this may be not trivial to achieve in your existing build scripts) javafx application launcher may not work well with startup logic of your application (for example launcher will initialize networking stack and this may void custom networking settings in your application code) In JDK 7 update 6 <fx:deploy> was updated to accept arbitrary executable jar as an input. Self-contained application package will be created preserving input jar as-is, i.e. no JavaFX launcher will be embedded. This does not help with first point above but resolves other two. More formally following assertions must be true for packaging to succeed: application can be launched as "java -jar YourApp.jar" from the command line  mainClass attribute of <fx:application> refers to application main class <fx:resources> lists all resources needed for the application To give you an example lets assume we need to create a bundle for application consisting of 3 jars:     dist/javamain.jar     dist/lib/somelib.jar    dist/morelibs/anotherlib.jar where javamain.jar has manifest with      Main-Class: app.Main     Class-Path: lib/somelib.jar morelibs/anotherlib.jar Here is sample ant code to package it: <target name="package-bundle"> <taskdef resource="com/sun/javafx/tools/ant/antlib.xml" uri="javafx:com.sun.javafx.tools.ant" classpath="${javafx.tools.ant.jar}"/> <fx:deploy nativeBundles="all" width="100" height="100" outdir="native-packages/" outfile="MyJavaApp"> <info title="Sample project" vendor="Me" description="Test built from Java executable jar"/> <fx:application id="myapp" version="1.0" mainClass="app.Main" name="MyJavaApp"/> <fx:resources> <fx:fileset dir="dist"> <include name="javamain.jar"/> <include name="lib/somelib.jar"/> <include name="morelibs/anotherlib.jar"/> </fx:fileset> </fx:resources> </fx:deploy> </target> Option to disable proxy setup in the JavaFX launcher Since JavaFX 2.2 (part of JDK 7u6) properly packaged JavaFX applications  have proxy settings initialized according to Java Runtime configuration settings. This is handy for most of the application accessing network with one exception. If your application explicitly sets networking properties (e.g. socksProxyHost) then they must be set before networking stack is initialized. Proxy detection will initialize networking stack and therefore your custom settings will be ignored. One way to disable proxy setup by the embedded JavaFX launcher is to pass "-Djavafx.autoproxy.disable=true" on the command line. This is good for troubleshooting (proxy detection may cause significant startup time increases if network is misconfigured) but not really user friendly. Now proxy setup will be disabled if manifest of main application jar has "JavaFX-Feature-Proxy" entry with value "None". 
Here is simple example of adding this entry using <fx:jar> task: <fx:jar destfile="dist/sampleapp.jar"> <fx:application refid="myapp"/> <fx:resources refid="myresources"/> <fileset dir="build/classes"/> <manifest> <attribute name="JavaFX-Feature-Proxy" value="None"/> </manifest> </fx:jar> Ability to specify codebase for WebStart application JavaFX applications do not need to specify codebase (i.e. absolute location where application code will be deployed) for most of real world deployment scenarios. This is convenient as application does not need to be modified when it is moved from development to deployment environment. However, some developers want to ensure copies of their application JNLP file will redirect to master location. This is where codebase is needed. To avoid need to edit JNLP file manually <fx:deploy> task now accepts optional codebase attribute. If attribute is not specified packager will generate same no-codebase files as before. If codebase value is explicitly specified then generated JNLP files (including JNLP content embedded into web page) will use it.  Here is an example: <fx:deploy width="600" height="400" outdir="Samples" codebase="http://localhost/codebaseTest" outfile="TestApp"> .... </fx:deploy> Option to update existing jar file JavaFX packaging procedures are optimized for new application that can use ant or command line javafxpackager utility. This may lead to some redundant steps when you add it to your existing build process. One typical situation is that you might already have a build procedure that produces executable jar file with custom manifest. To properly package it as JavaFX executable jar you would need to unpack it and then use javafxpackager or <fx:jar> to create jar again (and you need to make sure you pass all important details from your custom manifest). We added option to pass jar file as an input to javafxpackager and <fx:jar>. This simplifies integration of JavaFX packaging tools into existing build  process as postprocessing step. By the way, we are looking for ways to simplify this further. Please share your suggestions! On the technical side this works as follows. Both <fx:jar> and javafxpackager will attempt to update existing jar file if this is the only input file. Update process will add JavaFX launcher classes and update the jar manifest with JavaFX attributes. Your custom attributes will be preserved. Update could be performed in place or result may be saved to a different file. Main-Class and Class-Path elements (if present) of manifest of input jar file will be used for JavaFX application  unless they are explicitly overriden in the packaging command you use. E.g. attribute mainClass of <fx:application> (or -appclass in the javafxpackager case) overrides existing Main-Class in the jar manifest. Note that class specified in the Main-Class attribute could either extend JavaFX Application or provide static main() method. 
Here are examples of updating a jar file using javafxpackager. Create a new JavaFX executable jar as a copy of a given jar file: javafxpackager -createjar -srcdir dist -srcfiles fish_proto.jar -outdir dist -outfile fish.jar  Update an existing jar file to be a JavaFX executable jar and use test.Fish as the main application class: javafxpackager -createjar -srcdir dist -appclass test.Fish -srcfiles fish.jar -outdir dist -outfile fish.jar  And here is an example of using <fx:jar> to create a new JavaFX executable jar from the existing fish_proto.jar: <fx:jar destfile="dist/fish.jar"> <fileset dir="dist"> <include name="fish_proto.jar"/> </fileset> </fx:jar>
Self-Contained Applications: Specify application icon
The only way to specify an application icon for a self-contained application using the tools in JDK 7 update 6 is to use drop-in resources. Now this bug is resolved and you can also specify the icon using the <fx:icon> tag. Here is an example: <fx:deploy ...> <fx:info> <fx:icon href="default.png"/> </fx:info> ... </fx:deploy> A few things to keep in mind: only the default kind of icon is applicable to self-contained applications (as of now), and the icon should follow platform-specific rules for sizes and image formats (e.g. .ico on Windows and .icns on Mac).
Self-Contained Applications: Pass parameters on the command line
JavaFX applications support two types of application parameters: named and unnamed (see the API for Application.Parameters). Static named parameters can be added to the application package using <fx:param> and unnamed parameters can be added using <fx:argument>. They are applicable to all execution modes, including standalone applications. It is also possible to pass parameters to a JavaFX application from a Web page that hosts it, using <fx:htmlParam>. Prior to JavaFX 2.2, this was only supported for embedded applications; starting from JavaFX 2.2, <fx:htmlParam> is applicable to Web Start applications as well. See the JavaFX deployment guide for more details. However, there was no way to pass dynamic parameters to a self-contained application. This has been improved, and the native launchers will now delegate parameters from the command line to the application code, i.e. to pass a parameter to the application you simply run it as "myapp.exe somevalue" and then use getParameters().getUnnamed().get(0) to get "somevalue".
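As a rough illustration of the parameter APIs mentioned in the last item, here is a minimal sketch of an application that echoes both kinds of parameters. It is not taken from the post; the class name and the trivial UI are made up, but the calls are the standard javafx.application.Application ones.

import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.control.Label;
import javafx.scene.layout.StackPane;
import javafx.stage.Stage;

public class ParamEcho extends Application {
    @Override
    public void start(Stage stage) {
        Parameters params = getParameters();
        // Unnamed parameters come from <fx:argument> or, with the 7u10 self-contained
        // packages, directly from the native launcher command line ("myapp.exe somevalue").
        String first = params.getUnnamed().isEmpty() ? "(none)" : params.getUnnamed().get(0);
        // Named parameters come from <fx:param> / <fx:htmlParam>.
        String named = params.getNamed().toString();

        StackPane root = new StackPane();
        root.getChildren().add(new Label("unnamed[0]=" + first + "  named=" + named));
        stage.setScene(new Scene(root, 420, 100));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}

Packaged as a self-contained application with the 7u10 tools (assuming the bundle is named ParamEcho), running "ParamEcho.exe somevalue" would then show "somevalue" as the first unnamed parameter.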

    Read the article

  • CodePlex Daily Summary for Friday, March 16, 2012

    CodePlex Daily Summary for Friday, March 16, 2012Popular ReleasesJavascript .NET: Javascript .NET v0.6: Upgraded to the latest stable branch of v8 (/tags/3.9.18), and switched to using their scons build system. We no longer include v8 source code as part of this project's source code. Simultaneous multithreaded use of v8 now supported (v8 Isolates), although different contexts may not share objects or call each other. 64-bit .Net 4.0 DLL now included. (Download now includes x86 and x64 for both .Net 3.5 and .Net 4.0.)MyRouter (Virtual WiFi Router): MyRouter 1.0.6: This release should be more stable there were a few bug fixes including the x64 issue as well as an error popping up when MyRouter started this was caused by a NULL valuePulse: Pulse Beta 4: This version is still in development but should include: Logging and error handling have been greatly improved. If you run into an error or Pulse crashes make sure to check the Log folder for a recently modified log file so you can report the details of the issue A bunch of new features for the Wallbase.cc provider. Cleaner separation between inputs, downloading and output. Input and downloading are fairly clean now but outputs are still mixed up in the mix which I'm trying to resolve ...Google Books Downloader for Windows: Google Books Downloader-2.0.0.0.: Google Books DownloaderFinestra Virtual Desktops: 2.5.4501: This is a very minor update release. Please see the information about the 2.5 and 2.5.4500 releases for more information on recent changes. This update did not even have an automatic update triggered for it. Adds error checking and reporting to all threads, not only those with message loopsAcDown????? - Anime&Comic Downloader: AcDown????? v3.9.2: ?? ●AcDown??????????、??、??????,????1M,????,????,?????????????????????????。???????????Acfun、????(Bilibili)、??、??、YouTube、??、???、??????、SF????、????????????。??????AcPlay?????,??????、????????????????。 ● AcDown???????????????????????????,???,???????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ????32??64? Windows XP/Vista/7/8 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86),?????"?????????"??? ??????????????,??????????: ??"AcDo...ArcGIS Editor for OpenStreetMap: ArcGIS Editor for OSM 2.0 Release Candidate: Your feedback is welcome - and this is your last chance to get your fixes in for this version! Includes installer for both Feature Server extension and Desktop extension, enhanced functionality for the Desktop tools, and enhanced built-in Javascript Editor for the Feature Server component. This release candidate includes fixes to beta 4 that accommodate domain users for setting up the Server Component, and fixes for reporting/uploading references tracked in the revision table. See Code In-P...C.B.R. : Comic Book Reader: CBR 0.6: 20 Issue trackers are closed and a lot of bugs too Localize view is now MVVM and delete is working. Added the unused flag (take care that it goes to true only when displaying screen elements) Backstage - new input/output format choice control for the conversion Backstage - Add display, behaviour and register file type options in the extended options dialog Explorer list view has been transformed to a custom control. 
New group header, colunms order and size are saved Single insta...Windows Azure Toolkit for Windows 8: Windows Azure Toolkit for Windows 8 Consumer Prv: Windows Azure Toolkit for Windows 8 Consumer Preview - Preview Release v1.2.1Minor updates to setup experience: Check for WebPI before install Dependency Check updated to support the following VS 11 and VS 2010 SKUs Ultimate, Premium, Professional and Express Certs Windows Azure Toolkit for Windows 8 Consumer Preview - Preview Release v1.2.0 Please download this for Windows Azure Toolkit for Windows 8 functionality on Windows 8 Consumer Preview. The core features of the toolkit include:...Facebook Graph Toolkit: Facebook Graph Toolkit 3.0: ships with JSON Toolkit v3.0, offering parse speed up to 10 times of last version supports Facebook's new auth dialog supports new extend access token endpoint new example Page Tab app filter Graph Api connections using dates fixed bugs in Page Tab appsCODE Framework: 4.0.20312.0: This version includes significant improvements in the WPF system (and the WPF MVVM/MVC system). This includes new styles for Metro controls and layouts. Improved color handling. It also includes an improved theme/style swapping engine down to active (open) views. There also are various other enhancements and small fixes throughout the entire framework.ScintillaNET: ScintillaNET 2.4: 3/12/2012 Jacob Slusser Added support for annotations. Issues Fixed with this Release Issue # Title 25012 25012 25018 25018 25023 25023 25014 25014 Visual Studio ALM Quick Reference Guidance: v3 - For Visual Studio 11: RELEASE README Welcome to the BETA release of the Quick Reference Guide preview As this is a BETA release and the quality bar for the final Release has not been achieved, we value your candid feedback and recommend that you do not use or deploy these BETA artifacts in a production environment. Quality-Bar Details Documentation has been reviewed by Visual Studio ALM Rangers Documentation has not been through an independent technical review Documentation ...AvalonDock: AvalonDock 2.0.0345: Welcome to early alpha release of AvalonDock 2.0 I've completely rewritten AvalonDock in order to take full advantage of the MVVM pattern. New version also boost a lot of new features: 1) Deep separation between model and layout. 2) Full WPF binding support thanks to unified logical tree between main docking manager, auto-hide windows and floating windows. 3) Support for Aero semi-maximized windows feature. 4) Support for multiple panes in the same floating windows. For a short list of new f...Windows Azure PowerShell Cmdlets: Windows Azure PowerShell Cmdlets 2.2.2: Changes Added Start Menu Item for Easy Startup Added Link to Getting Started Document Added Ability to Persist Subscription Data to Disk Fixed Get-Deployment to not throw on empty slot Simplified numerous default values for cmdlets Breaking Changes: -SubscriptionName is now mandatory in Set-Subscription. -DefaultStorageAccountName and -DefaultStorageAccountKey parameters were removed from Set-Subscription. Instead, when adding multiple accounts to a subscription, each one needs to be added ...IronPython: 2.7.2.1: On behalf of the IronPython team, I'm happy to announce the final release IronPython 2.7.2. This release includes everything from IronPython 54498 and 62475 as well. Like all IronPython 2.7-series releases, .NET 4 is required to install it. Installing this release will replace any existing IronPython 2.7-series installation. 
Unlike previous releases, the assemblies for all supported platforms are included in the installer as well as the zip package, in the "Platforms" directory. IronPython 2...Kooboo CMS: Kooboo CMS 3.2.0.0: Breaking changes: When upgrade from previous versions, MUST reset the all the content type templates, otherwise the content manager might get a compile error. New features Integrate with Windows azure. See: http://wiki.kooboo.com/?wiki=Kooboo CMS on Azure Complete solution to deploy on load balance servers. See: http://wiki.kooboo.com/?wiki=Kooboo CMS load balance Update Jquery and Jquery ui to the lastest version(Jquery 1.71, Jquery UI 1.8.16). Tree style text content editing. See:h...Home Access Plus+: v7.10: Don't forget to add your location to the list: http://www.nbdev.co.uk/projects/hap/locations.aspx Changes: Added: CompressJS controls to the Help Desk & Booking System (reduces page size) Fixed: Debug/Release mode detection in CompressJS control Added: Older Browsers will use an iframe and the old uploadh.aspx page (works better than the current implementation on older browsers) Added: Permalinks for my files, you can give out links that redirect to the correct location when you log i...Extensions for Reactive Extensions (Rxx): Rxx 1.3: Please read the latest release notes for details about what's new. Related Work Items Content SummaryRxx provides the following features. See the Documentation for details. Many IObservable<T> extension methods and IEnumerable<T> extension methods. Many wrappers that convert asynchronous Framework Class Library APIs into observables. Many useful types such as ListSubject<T>, DictionarySubject<T>, CommandSubject, ViewModel, ObservableDynamicObject, Either<TLeft, TRight>, Maybe<T>, Scala...Player Framework by Microsoft: Player Framework for Windows 8 Metro (Preview): Player Framework for HTML/JavaScript and XAML/C# Metro Style Applications. Additional DownloadsIIS Smooth Streaming Client SDK for Windows 8New Projects4B12: Esperimenti con la classe 4B - ITIS RiminiAmbroisie: Personal projectAssembly Comparer: This project is mean to develop for those who work on different library on daily basis. This application will compare two folder with different DLL version information. Suppose one folder contain DLL with version 1.5.1.10 and another with 1.5.1.11 then this application will find out such mismatch library version and let you know. Next step is to update your latest library. you can overwrite old library from source location to target location with single click. All latest library from s...AutoFakes: AutoFakes makes it easier for developers to automatically build classes when testing. You'll no longer have to manually call your class-under-test's constructor, passing it individual stubs - AutoFakes handles that for you... automatically. AutoFakes is developed in C#.AutoSPEditor: AutoSPEditor is a graphical editor for AutoSPInstaller Configuration Files. It allows to download the prerequisites for a SharePoint installation, to configure the AutoSPInstaller input files using a graphical user interface and to create a deployment package. 
AutoSubmitter: ????????autowb: auto-wb auto-wb auto-wbChina Sail Factory - Online System: China sail factory online system, written in VB.NetConnection Strings Class for .NET Application: This class helps you use connection string for .NET application (C# or VB) based on ConnectionStrings.comCustoms Atom: Customs AtomEmail Notification Service with Publish/Subscribe pattern: This project introduces a simple windows service that can be used to build email notification intrastructure to handle all the email or other notification need for your applications and systems. Even Worse Minton Manager: The Even Worse Minton Manager is a website where you can create Badminton events and invite people. FnSharp - A compliment to F#: The FnSharp framework provides BCL enhancements and frameworks aimed specifically at improving F# developers lives.HMC6343 WindowsForm and FezSpider: The project is a Windows Form that communicates (using XBee S1) to a FezSpider that has one button and a GHIElectronics.XBee module. The Honeywell HMC6343 compass and I2C pullup resistors are located on a GHIElectronics DuinoProto board. I am using Microsoft C# Express.Intégration en Continue (Continious Integration): Intégration en Continue (Continious Integration) is a french communauty project to provide a set of tools with TFS and this methodology approach. jandanFunnyPic: ?????WP7???ModularAI: Artificial simulation framework with an emphasis on modular expansion.NewStart: NewStart is a start menu for Windows 8. It's written in C# with WindowsForm.Office Integration Pack: The Office Integration Pack is a LightSwitch extension that makes it easy to manipulate the 2010 versions of Excel, Word and Outlook in a variety of ways. Create documents, PDFs, spreadsheets, email and appointments from your LightSwitch application.PYTHON OVERLAY: A python library QHome: QHome By DDDrekop: Rekop is designed to be designed as designed.TestProject#532: My first test TFS projecttesttom03152012hg01: testtom03152012hg01testtom03152012hg02: testtom03152012hg02testtom03152012tfs02: testtom03152012tfs02TrogsoftIRC: This is a .NET 3.5 IRC library, intended to provide access to internet relay chat from .NET languages. The library is written in C#Visual Studio Data Generators: Visual Studio Data Generators is a collection of custom data generators for the Data Generation Plan feature of Visual Studio 2010 Premium and Ultimate. It generates random, valid data in several formats: URLs, emails, telephones and so on.WebMatrixColorizer: WebMatrix 2.0 supports code color theming but uses a different .XML file than Visual Studio's. This simple app converts a .vssettings file into a color scheme importable by WebMatrix. Export from VS or download from StudioStyl.es, then convert and import. Want a dark theme? Easy!Wetenschap & Wiskunde Toets-programma: lol

    Read the article

  • CodePlex Daily Summary for Thursday, March 29, 2012

    CodePlex Daily Summary for Thursday, March 29, 2012Popular ReleasesWouter's SharePoint Demo Land: Custom Views and Forms: Samples showing how to provision custom lists. Custom view rendering with code, XSLT, XSLT + code. Custom forms with database lookups.YouTube API Class & Server Control for ASP.NET 4.0: Binary dll Files with Demo: . You can find binary dll files under :- Demo\bin\ folder Demo url :- http://demo.svinn.com/YouTube/Default.aspx 1 GB Hosting + Free Domain => http://ghosting.in/ .callisto: callisto 2.0.23: Patched Script static class and peak user count bug fix.Circuit Diagram: Circuit Diagram 2.0 Alpha 3: New in this release: Added components: Microcontroller Demultiplexer Flip & rotate components Open XML files from older versions of Circuit Diagram Text formatting for components New CDDX syntax Other fixesUmbraco CMS: Umbraco 5.1 CMS (Beta): Beta build for testing - please report issues at issues.umbraco.org (Latest uploaded: 5.1.0.123) What's new in 5.1? The full list of changes is on our http://progress.umbraco.org task tracking page. It shows items complete for 5.1, and 5.1 includes items for 5.0.1 and 5.0.2 listed there too. Here's two headline acts: Members5.1 adds support for backoffice editing of Members. We support the pairing up of our content type system in Hive with regular ASP.NET Membership providers (we ship a def...51Degrees.mobi - Mobile Device Detection and Redirection: 2.1.2.11: One Click Install from NuGet Changes to Version 2.1.2.11Code Changes 1. The project is now licenced under the Mozilla Public Licence 2. 2. User interface control and associated data access layer classes have been added to aid developers integrating 51Degrees.mobi into wider projects such as content management systems or web hosting management solutions. Use the following in a web form or user control to access these new UI components. <%@ Register Assembly="FiftyOne.Foundation" Namespace="...JSON Toolkit: JSON Toolkit 3.1: slight performance improvement (5% - 10%) new JsonException classPicturethrill: Version 2.3.28.0: Straightforward image selection. New clean UI look. Super stable. Simplified user experience.Indent Guides for Visual Studio: Indent Guides v12 (beta 2): Note This beta is likely to be less stable than the previous one. If you have severe troubles using this version, please report them with as much detail as possible (especially other extensions/addins that you may have) and downgrade to the last stable release. Version History Changed since v12 (beta 1): new options dialog with Quick Set selections for behavior restructured settings storage (should be more reliable) asynchronous background document analysis glow effect now appears in p...SQL Monitor - managing sql server performance: SQL Monitor 4.2 alpha 16: 1. finally fixed problem with logic fault checking for temporary table name... I really mean finally ...ScintillaNET: ScintillaNET 2.5: A slew of bug-fixes with a few new features sprinkled in. This release also upgrades the SciLexer and SciLexer64 DLLs to version 3.0.4. The official stuff: Issue # Title 32402 32402 27137 27137 31548 31548 30179 30179 24932 24932 29701 29701 31238 31238 26875 26875 30052 30052 Mugen MVVM Toolkit: Mugen MVVM Toolkit ver 1.1: Added Design mode support.Multiwfn: Multiwfn 2.3.2: Multiwfn 2.3.2Harness: Harness 2.0.2: change to .NET Framework Client Profile bug fix the download dialog auto answer. bug fix setFocus command. add "SendKeys" command. remove "closeAll" command. 
minor bugs fixed.BugNET Issue Tracker: BugNET 0.9.161: Below is a list of fixes in this release. Bug BGN-2092 - Link in Email "visit your profile" not functional BGN-2083 - Manager of bugnet can not edit project when it is not public BGN-2080 - clicking on a link in the project summary causes error (0.9.152.0) BGN-2070 - Missing Functionality On Feed.aspx BGN-2069 - Calendar View does not work BGN-2068 - Time tracking totals not ok BGN-2067 - Issues List Page Size Bug: Index was out of range. Must be non-negative and less than the si...YAF.NET (aka Yet Another Forum.NET): v1.9.6.1 RTW: v1.9.6.1 FINAL is .NET v4.0 ONLY v1.9.6.1 has: Performance Improvements .NET v4.0 improvements Improved FaceBook Integration KNOWN ISSUES WITH THIS RELEASE: ON INSTALL PLEASE DON'T CHECK "Upgrade BBCode Extensions...". More complete change list and discussion here: http://forum.yetanotherforum.net/yaf_postst14201_v1-9-6-1-RTW-Dated--3-26-2012.aspxmenu4web: menu4web 0.0.3: menu4web 0.0.3ArcGIS Editor for OpenStreetMap: ArcGIS Editor for OSM 2.0 Final: This release installs both the ArcGIS Editor for OSM Server Component and/or ArcGIS Editor for OSM Desktop components. The Desktop tools allow you to download data from the OpenStreetMap servers and store it locally in a geodatabase. You can then use the familiar editing environment of ArcGIS Desktop to create, modify, or delete data. Once you are done editing, you can post back the edit changes to OSM to make them available to all OSM users. The Server Component allows you to quickly create...Craig's Utility Library: Craig's Utility Library 3.1: This update adds about 60 new extension methods, a couple of new classes, and a number of fixes including: Additions Added DateSpan class Added GenericDelimited class Random additions Added static thread friendly version of Random.Next called ThreadSafeNext. AOP Manager additions Added Destroy function to AOPManager (clears out all data so system can be recreated. Really only useful for testing...) ORM additions Added PagedCommand and PageCount functions to ObjectBaseClass (same as M...DotSpatial: DotSpatial 1.1: This is a Minor Release. See the changes in the issue tracker. Minimal -- includes DotSpatial core and essential extensions Extended -- includes debugging symbols and additional extensions Just want to run the software? End user (non-programmer) version available branded as MapWindow Want to add your own feature? Develop a plugin, using the template and contribute to the extension feed (you can also write extensions that you distribute in other ways). Components are available as NuGet pa...New ProjectsAeroFichierAchats: Program that reworks excel files into PDFArepa: A lightweight non-invasive tool that helps you to implement Behavior Driven Development (BDD) on .NET projects. Arepa produces guidelines of using BDD on your current tests, and customisable and portable test reports integrating XML Documentation Comments. It's developed in C# using Scrum and BDD.Azure User Management Console - AUMC: Azure User Management Console - AUMC is a User Graphic Interface (GUI) that manages the users and logins of an Azure SQL database. The tool is simply converting your action into T-SQL commands and execute them on the Azure SQL Database. BaseProject: BaseProjectCraftBukkit Updater: I don't activly monitor the status of CraftBukkit's updates, but I want to stay on top of the updated versions. This will track the last version in cache then check for updates. 
Upon update there should be an option to download automaticly to location XYZ or to send an email/sms.Dan Dot Com: Dan's SandboxEjemplos Windows 8 Javascript: Este proyecto pretende recoger todos los ejemplos de aplicaciones metro para Windows 8 en JavaScript que vaya desarrollando para fines didácticos. Todas las colaboraciones son bienvenidas.EntityFramework Generic Patterns: EntityFramework Generic PatternsEpFamvir: EpFamvir is a Visual C++ software framework that supports data-intensive distributed applications under a GPL3.0 license. It enables applications to work with thousands of nodes and petabytes of data. Famvir was inspired by Apache Hadoop, Google's MapReduce, and Google FS.etdc/etp: ???????????????? ???????? ??????? etest.ru. IndieCiv: IndieCiv - An independent look at how a new game could be made using fan-made graphics from Civ3. You can visit the discussion at www.civfanatics.com http://forums.civfanatics.com/showthread.php?t=421582Libro SQL Server 2012: Databases and examples for my Italian book on "SQL Server 2012"Linq2Rest: Parses OData system query parameters to create a LINQ query that can be used to filter a model set. Also exposes a LINQ provider for web services supporting the OData query parameters. Use extension method Filter (in Linq2Rest namespace) on any IEnumerable source.LittleUmph: It's a little library to help you to alleviate some of the mundane stuff during your development. It has some nifty stuff like a neat database wrapper, conversion utilities, string functions and vast of other mini helpers to improve the efficiency and consistency of your code.LonghornRPG: LonghornRPGMemberManager: Keep track of membersMSU Student Teaching DataBase: A C# application to manage student placements for MSU.MvcContrib4: This is the MvcContrib project to support MVC 4MYFIRSTDEMOPROJECT: Introduction to CSharpOrchard Page Title Override: This Orchard CMS module allows a user to manage the Page Title. You can now have the Site Name show up last in the Page Title or hide the Site Name completely from the Page Title. The module also contains a Content Part allowing a Page Title to be separate from content title.phoenixtree: .net linux sql php python html javascript c exampleProject Explorer for Notepad++: This project mainly aims at providing a functional rich project explorer experience on notepad++. Features include Folder based file browsing, searching, filtering and more.Projeto Exemplo FPU: Projeto Exemplo FPUPureCalendar: simple web calendarrails study: rails studySafe IM: a safe IM software use AES,SHA,RSA.Contain a server and a clientScoreboard: Scoreboard is a simple sports and/or activities scoreboard which can be used for a variety of purposes It was developed for use with a 1024x768 projector and features home and away scores, countdown clock, even a buzzer. It's developed in VB.net.SharePoint System dates and editor fields updater: The SPChangeDate is a utility used to modify some of the system fields in a SharePoint 2010 document library using a datagridview (Excel Like). It allows the modification of the following SharePoint fields: 1. Created 2. Modified 3. Modified By 4. 
Created By Shi Ji Xiang CRM 2012: Shi Ji XiangSIGECAR: SIGECARStable Dependencies Principle Checker for nDepend: Small console app for parsing nDepend output files and find assemblies that breaks the Stable Dependencies PrincipleStudentsPointManager: StudentsPointManagertclh123test: 4testThai Sign Language Translator: Thai Sign Language Translator with Kinect Sensor.The House FM / My Praise FM Desktop App: An App to play Christian radio on the desktop of windows Vista and Windows 7 computersTradeSea: Uni ProjectVisual Dependency Tracer: This tool will help you understand how an excel formula is derived/calculated. It provides a visual representation (tree) of the formula, including the precendents and dependents.VK Video Player: Video player for russian social network vk.comXrmSvcToolkit: XrmSvcToolkit is a small JavaScript library that helps you access Microsoft Dynamics CRM 2011 web service interfaces (SOAP and REST).

    Read the article

  • DirectX works for 64-bit but not 32-bit

    - by dtbarne
    I'm trying to play a game (Civilization 5) which was previously working but no longer. I believe I've narrowed it down to a DirectX issue because I get an error running dxdiag.exe in 32 bit mode. My goal (at least I believe) is to get Direct3D Acceleration "Enabled" in dxdiag (as it is in 64 bit dxdiag). A very similar issue is here: http://answers.microsoft.com/en-us/windows/forum/windows_7-gaming/direct3d-acceleration-is-not-available-in-windows/4c345e6e-dc68-e011-8dfc-68b599b31bf5?page=1 The proposed answer, which looks very promising, doesn't seem to work for me. Like other users in that thread, HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Direct3D\Drivers does not have a SoftwareOnly key to change. I even tried manually adding it as a string and dword, to no avail. I have a NVIDIA GeForce GT 525M, and before you ask, yes I've tried updating (also uninstalling, reinstalling) my drivers. I've also tried doing the same with DirectX (and Civilization 5 for that matter). Been debugging for some 4+ hours now after a full day of work and I've run out of ideas. I'm hoping somebody knows the solution here! :) Here's what I see when I open dxdiag: DxDiag has detected that there mgiht have been a problem accessing Direct3D the last time this program was used. Would you like to bypass Direct3D this time? No - Crash Yes - Works, but in Display tab: DirectDraw Acceleration: Disabled Direct3D Acceleration: Not Available AGP Texture Acceleration: Not Available If I click "Run 64-bit DxDiag", all three are "Enabled". I should also note that I've tried the following steps as Microsoft suggests, but I'm not able to do so as the "Change Settings" button is disabled. Some programs run very slowly—or not at all—unless Microsoft DirectDraw or Direct3D hardware acceleration is turned on. To determine this, click the Display tab, and then under DirectX Features, check to see whether DirectDraw, Direct3D, and AGP Texture Acceleration appear as Enabled. If not, try turning on hardware acceleration. Click to open Screen Resolution. Click Advanced settings. Click the Troubleshoot tab, and then click Change settings. If you're prompted for an administrator password or confirmation, type the password or provide confirmation. Move the Hardware Acceleration slider to Full. Full dxdiag dump: ------------------ System Information ------------------ Time of this report: 11/8/2012, 23:13:24 Machine name: DTBARNE Operating System: Windows 7 Professional 64-bit (6.1, Build 7601) Service Pack 1 (7601.win7sp1_gdr.120830-0333) Language: English (Regional Setting: English) System Manufacturer: Dell Inc. System Model: Dell System XPS L502X BIOS: Default System BIOS Processor: Intel(R) Core(TM) i5-2450M CPU @ 2.50GHz (4 CPUs), ~2.5GHz Memory: 8192MB RAM Available OS Memory: 8086MB RAM Page File: 2466MB used, 13704MB available Windows Dir: C:\Windows DirectX Version: DirectX 11 DX Setup Parameters: Not found User DPI Setting: Using System DPI System DPI Setting: 96 DPI (100 percent) DWM DPI Scaling: Disabled DxDiag Version: 6.01.7601.17514 32bit Unicode DxDiag Previously: Crashed in Direct3D (stage 2). Re-running DxDiag with "dontskip" command line parameter or choosing not to bypass information gathering when prompted might result in DxDiag successfully obtaining this information ------------ DxDiag Notes ------------ Display Tab 1: No problems found. Sound Tab 1: No problems found. Sound Tab 2: No problems found. Input Tab: No problems found. 
-------------------- DirectX Debug Levels -------------------- Direct3D: 0/4 (retail) DirectDraw: 0/4 (retail) DirectInput: 0/5 (retail) DirectMusic: 0/5 (retail) DirectPlay: 0/9 (retail) DirectSound: 0/5 (retail) DirectShow: 0/6 (retail) --------------- Display Devices --------------- Card name: Intel(R) HD Graphics 3000 Manufacturer: Chip type: DAC type: Device Key: Enum\PCI\VEN_8086&DEV_0126&SUBSYS_04B61028&REV_09 Display Memory: Dedicated Memory: n/a Shared Memory: n/a Current Mode: 1920 x 1080 (32 bit) (60Hz) Monitor Name: Generic PnP Monitor Monitor Model: Monitor Id: Native Mode: Output Type: Driver Name: Driver File Version: () Driver Version: DDI Version: Driver Model: WDDM 1.1 Driver Attributes: Final Retail Driver Date/Size: , 0 bytes WHQL Logo'd: n/a WHQL Date Stamp: n/a Device Identifier: Vendor ID: Device ID: SubSys ID: Revision ID: Driver Strong Name: oem11.inf:IntelGfx.NTamd64.6.0:iSNBM0:8.15.10.2696:pci\ven_8086&dev_0126&subsys_04b61028 Rank Of Driver: 00E60001 Video Accel: Deinterlace Caps: n/a D3D9 Overlay: DXVA-HD: DDraw Status: Disabled D3D Status: Not Available AGP Status: Not Available ------------- Sound Devices ------------- Description: Speakers (High Definition Audio Device) Default Sound Playback: Yes Default Voice Playback: Yes Hardware ID: HDAUDIO\FUNC_01&VEN_10EC&DEV_0665&SUBSYS_102804B6&REV_1000 Manufacturer ID: 1 Product ID: 65535 Type: WDM Driver Name: HdAudio.sys Driver Version: 6.01.7601.17514 (English) Driver Attributes: Final Retail WHQL Logo'd: Yes Date and Size: 11/20/2010 22:23:47, 350208 bytes Other Files: Driver Provider: Microsoft HW Accel Level: Basic Cap Flags: 0xF1F Min/Max Sample Rate: 100, 200000 Static/Strm HW Mix Bufs: 1, 0 Static/Strm HW 3D Bufs: 0, 0 HW Memory: 0 Voice Management: No EAX(tm) 2.0 Listen/Src: No, No I3DL2(tm) Listen/Src: No, No Sensaura(tm) ZoomFX(tm): No Description: Digital Audio (S/PDIF) (High Definition Audio Device) Default Sound Playback: No Default Voice Playback: No Hardware ID: HDAUDIO\FUNC_01&VEN_10EC&DEV_0665&SUBSYS_102804B6&REV_1000 Manufacturer ID: 1 Product ID: 65535 Type: WDM Driver Name: HdAudio.sys Driver Version: 6.01.7601.17514 (English) Driver Attributes: Final Retail WHQL Logo'd: Yes Date and Size: 11/20/2010 22:23:47, 350208 bytes Other Files: Driver Provider: Microsoft HW Accel Level: Basic Cap Flags: 0xF1F Min/Max Sample Rate: 100, 200000 Static/Strm HW Mix Bufs: 1, 0 Static/Strm HW 3D Bufs: 0, 0 HW Memory: 0 Voice Management: No EAX(tm) 2.0 Listen/Src: No, No I3DL2(tm) Listen/Src: No, No Sensaura(tm) ZoomFX(tm): No --------------------- Sound Capture Devices --------------------- Description: Microphone (High Definition Audio Device) Default Sound Capture: Yes Default Voice Capture: Yes Driver Name: HdAudio.sys Driver Version: 6.01.7601.17514 (English) Driver Attributes: Final Retail Date and Size: 11/20/2010 22:23:47, 350208 bytes Cap Flags: 0x1 Format Flags: 0xFFFFF ------------------- DirectInput Devices ------------------- Device Name: Mouse Attached: 1 Controller ID: n/a Vendor/Product ID: n/a FF Driver: n/a Device Name: Keyboard Attached: 1 Controller ID: n/a Vendor/Product ID: n/a FF Driver: n/a Poll w/ Interrupt: No ----------- USB Devices ----------- + USB Root Hub | Vendor/Product ID: 0x8086, 0x1C26 | Matching Device ID: usb\root_hub20 | Service: usbhub | +-+ Generic USB Hub | | Vendor/Product ID: 0x8087, 0x0024 | | Location: Port_#0001.Hub_#0002 | | Matching Device ID: usb\class_09 | | Service: usbhub ---------------- Gameport Devices ---------------- ------------ PS/2 Devices 
------------ + Standard PS/2 Keyboard | Matching Device ID: *pnp0303 | Service: i8042prt | + Terminal Server Keyboard Driver | Matching Device ID: root\rdp_kbd | Upper Filters: kbdclass | Service: TermDD | + Synaptics PS/2 Port TouchPad | Matching Device ID: *dll04b6 | Upper Filters: SynTP | Service: i8042prt | + Terminal Server Mouse Driver | Matching Device ID: root\rdp_mou | Upper Filters: mouclass | Service: TermDD ------------------------ Disk & DVD/CD-ROM Drives ------------------------ Drive: C: Free Space: 26.2 GB Total Space: 122.0 GB File System: NTFS Model: M4-CT128M4SSD2 ATA Device Drive: D: Model: Optiarc DVDRWBD BC-5540H ATA Device Driver: c:\windows\system32\drivers\cdrom.sys, 6.01.7601.17514 (English), , 0 bytes -------------- System Devices -------------- Name: High Definition Audio Controller Device ID: PCI\VEN_8086&DEV_1C20&SUBSYS_04B61028&REV_05\3&11583659&0&D8 Driver: n/a Name: PCI standard host CPU bridge Device ID: PCI\VEN_8086&DEV_0104&SUBSYS_04B61028&REV_09\3&11583659&0&00 Driver: n/a Name: PCI standard PCI-to-PCI bridge Device ID: PCI\VEN_8086&DEV_1C1A&SUBSYS_04B61028&REV_B5\3&11583659&0&E5 Driver: n/a Name: PCI standard PCI-to-PCI bridge Device ID: PCI\VEN_8086&DEV_0101&SUBSYS_20108086&REV_09\3&11583659&0&08 Driver: n/a Name: PCI standard PCI-to-PCI bridge Device ID: PCI\VEN_8086&DEV_1C18&SUBSYS_04B61028&REV_B5\3&11583659&0&E4 Driver: n/a Name: Intel(R) Centrino(R) Advanced-N 6230 Device ID: PCI\VEN_8086&DEV_0091&SUBSYS_52218086&REV_34\4&2634DE8D&0&00E1 Driver: n/a Name: PCI standard ISA bridge Device ID: PCI\VEN_8086&DEV_1C4B&SUBSYS_04B61028&REV_05\3&11583659&0&F8 Driver: n/a Name: PCI standard PCI-to-PCI bridge Device ID: PCI\VEN_8086&DEV_1C16&SUBSYS_04B61028&REV_B5\3&11583659&0&E3 Driver: n/a Name: Realtek PCIe GBE Family Controller Device ID: PCI\VEN_10EC&DEV_8168&SUBSYS_04B61028&REV_06\4&109EAB2F&0&00E5 Driver: n/a Name: Intel(R) Management Engine Interface Device ID: PCI\VEN_8086&DEV_1C3A&SUBSYS_04B61028&REV_04\3&11583659&0&B0 Driver: n/a Name: PCI standard PCI-to-PCI bridge Device ID: PCI\VEN_8086&DEV_1C12&SUBSYS_04B61028&REV_B5\3&11583659&0&E1 Driver: n/a Name: NVIDIA GeForce GT 525M Device ID: PCI\VEN_10DE&DEV_0DF5&SUBSYS_04B61028&REV_A1\4&4DCA75F&0&0008 Driver: n/a Name: Standard Enhanced PCI to USB Host Controller Device ID: PCI\VEN_8086&DEV_1C2D&SUBSYS_04B61028&REV_05\3&11583659&0&D0 Driver: n/a Name: PCI standard PCI-to-PCI bridge Device ID: PCI\VEN_8086&DEV_1C10&SUBSYS_04B61028&REV_B5\3&11583659&0&E0 Driver: n/a Name: Standard Enhanced PCI to USB Host Controller Device ID: PCI\VEN_8086&DEV_1C26&SUBSYS_04B61028&REV_05\3&11583659&0&E8 Driver: n/a Name: Standard AHCI 1.0 Serial ATA Controller Device ID: PCI\VEN_8086&DEV_1C03&SUBSYS_04B61028&REV_05\3&11583659&0&FA Driver: n/a Name: SM Bus Controller Device ID: PCI\VEN_8086&DEV_1C22&SUBSYS_04B61028&REV_05\3&11583659&0&FB Driver: n/a Name: Intel(R) HD Graphics 3000 Device ID: PCI\VEN_8086&DEV_0126&SUBSYS_04B61028&REV_09\3&11583659&0&10 Driver: n/a Name: Renesas Electronics USB 3.0 Host Controller Device ID: PCI\VEN_1033&DEV_0194&SUBSYS_04B61028&REV_04\4&3494AC3A&0&00E3 Driver: n/a ------------------ DirectShow Filters ------------------ DirectShow Filters: WMAudio Decoder DMO,0x00800800,1,1,WMADMOD.DLL,6.01.7601.17514 WMAPro over S/PDIF DMO,0x00600800,1,1,WMADMOD.DLL,6.01.7601.17514 WMSpeech Decoder DMO,0x00600800,1,1,WMSPDMOD.DLL,6.01.7601.17514 MP3 Decoder DMO,0x00600800,1,1,mp3dmod.dll,6.01.7600.16385 Mpeg4s Decoder DMO,0x00800001,1,1,mp4sdecd.dll,6.01.7600.16385 WMV Screen decoder 
DMO,0x00600800,1,1,wmvsdecd.dll,6.01.7601.17514 WMVideo Decoder DMO,0x00800001,1,1,wmvdecod.dll,6.01.7601.17514 Mpeg43 Decoder DMO,0x00800001,1,1,mp43decd.dll,6.01.7600.16385 Mpeg4 Decoder DMO,0x00800001,1,1,mpg4decd.dll,6.01.7600.16385 DV Muxer,0x00400000,0,0,qdv.dll,6.06.7601.17514 Color Space Converter,0x00400001,1,1,quartz.dll,6.06.7601.17713 WM ASF Reader,0x00400000,0,0,qasf.dll,12.00.7601.17514 Screen Capture filter,0x00200000,0,1,wmpsrcwp.dll,12.00.7601.17514 AVI Splitter,0x00600000,1,1,quartz.dll,6.06.7601.17713 VGA 16 Color Ditherer,0x00400000,1,1,quartz.dll,6.06.7601.17713 SBE2MediaTypeProfile,0x00200000,0,0,sbe.dll,6.06.7601.17528 Microsoft DTV-DVD Video Decoder,0x005fffff,2,4,msmpeg2vdec.dll,6.01.7140.0000 AC3 Parser Filter,0x00600000,1,1,mpg2splt.ax,6.06.7601.17528 StreamBufferSink,0x00200000,0,0,sbe.dll,6.06.7601.17528 MJPEG Decompressor,0x00600000,1,1,quartz.dll,6.06.7601.17713 MPEG-I Stream Splitter,0x00600000,1,2,quartz.dll,6.06.7601.17713 SAMI (CC) Parser,0x00400000,1,1,quartz.dll,6.06.7601.17713 VBI Codec,0x00600000,1,4,VBICodec.ax,6.06.7601.17514 MPEG-2 Splitter,0x005fffff,1,0,mpg2splt.ax,6.06.7601.17528 Closed Captions Analysis Filter,0x00200000,2,5,cca.dll,6.06.7601.17514 SBE2FileScan,0x00200000,0,0,sbe.dll,6.06.7601.17528 Microsoft MPEG-2 Video Encoder,0x00200000,1,1,msmpeg2enc.dll,6.01.7601.17514 Internal Script Command Renderer,0x00800001,1,0,quartz.dll,6.06.7601.17713 MPEG Audio Decoder,0x03680001,1,1,quartz.dll,6.06.7601.17713 DV Splitter,0x00600000,1,2,qdv.dll,6.06.7601.17514 Video Mixing Renderer 9,0x00200000,1,0,quartz.dll,6.06.7601.17713 Microsoft MPEG-2 Encoder,0x00200000,2,1,msmpeg2enc.dll,6.01.7601.17514 ACM Wrapper,0x00600000,1,1,quartz.dll,6.06.7601.17713 Video Renderer,0x00800001,1,0,quartz.dll,6.06.7601.17713 MPEG-2 Video Stream Analyzer,0x00200000,0,0,sbe.dll,6.06.7601.17528 Line 21 Decoder,0x00600000,1,1,qdvd.dll,6.06.7601.17835 Video Port Manager,0x00600000,2,1,quartz.dll,6.06.7601.17713 Video Renderer,0x00400000,1,0,quartz.dll,6.06.7601.17713 VPS Decoder,0x00200000,0,0,WSTPager.ax,6.06.7601.17514 WM ASF Writer,0x00400000,0,0,qasf.dll,12.00.7601.17514 VBI Surface Allocator,0x00600000,1,1,vbisurf.ax,6.01.7601.17514 File writer,0x00200000,1,0,qcap.dll,6.06.7601.17514 iTV Data Sink,0x00600000,1,0,itvdata.dll,6.06.7601.17514 iTV Data Capture filter,0x00600000,1,1,itvdata.dll,6.06.7601.17514 DVD Navigator,0x00200000,0,3,qdvd.dll,6.06.7601.17835 Overlay Mixer2,0x00200000,1,1,qdvd.dll,6.06.7601.17835 AVI Draw,0x00600064,9,1,quartz.dll,6.06.7601.17713 RDP DShow Redirection Filter,0xffffffff,1,0,DShowRdpFilter.dll, Microsoft MPEG-2 Audio Encoder,0x00200000,1,1,msmpeg2enc.dll,6.01.7601.17514 WST Pager,0x00200000,1,1,WSTPager.ax,6.06.7601.17514 MPEG-2 Demultiplexer,0x00600000,1,1,mpg2splt.ax,6.06.7601.17528 DV Video Decoder,0x00800000,1,1,qdv.dll,6.06.7601.17514 SampleGrabber,0x00200000,1,1,qedit.dll,6.06.7601.17514 Null Renderer,0x00200000,1,0,qedit.dll,6.06.7601.17514 MPEG-2 Sections and Tables,0x005fffff,1,0,Mpeg2Data.ax,6.06.7601.17514 Microsoft AC3 Encoder,0x00200000,1,1,msac3enc.dll,6.01.7601.17514 StreamBufferSource,0x00200000,0,0,sbe.dll,6.06.7601.17528 Smart Tee,0x00200000,1,2,qcap.dll,6.06.7601.17514 Overlay Mixer,0x00200000,0,0,qdvd.dll,6.06.7601.17835 AVI Decompressor,0x00600000,1,1,quartz.dll,6.06.7601.17713 AVI/WAV File Source,0x00400000,0,2,quartz.dll,6.06.7601.17713 Wave Parser,0x00400000,1,1,quartz.dll,6.06.7601.17713 MIDI Parser,0x00400000,1,1,quartz.dll,6.06.7601.17713 Multi-file Parser,0x00400000,1,1,quartz.dll,6.06.7601.17713 File stream 
renderer,0x00400000,1,1,quartz.dll,6.06.7601.17713 Microsoft DTV-DVD Audio Decoder,0x005fffff,1,1,msmpeg2adec.dll,6.01.7140.0000 StreamBufferSink2,0x00200000,0,0,sbe.dll,6.06.7601.17528 AVI Mux,0x00200000,1,0,qcap.dll,6.06.7601.17514 Line 21 Decoder 2,0x00600002,1,1,quartz.dll,6.06.7601.17713 File Source (Async.),0x00400000,0,1,quartz.dll,6.06.7601.17713 File Source (URL),0x00400000,0,1,quartz.dll,6.06.7601.17713 Infinite Pin Tee Filter,0x00200000,1,1,qcap.dll,6.06.7601.17514 Enhanced Video Renderer,0x00200000,1,0,evr.dll,6.01.7601.17514 BDA MPEG2 Transport Information Filter,0x00200000,2,0,psisrndr.ax,6.06.7601.17669 MPEG Video Decoder,0x40000001,1,1,quartz.dll,6.06.7601.17713 WDM Streaming Tee/Splitter Devices: Tee/Sink-to-Sink Converter,0x00200000,1,1,ksproxy.ax,6.01.7601.17514 Video Compressors: WMVideo8 Encoder DMO,0x00600800,1,1,wmvxencd.dll,6.01.7600.16385 WMVideo9 Encoder DMO,0x00600800,1,1,wmvencod.dll,6.01.7600.16385 MSScreen 9 encoder DMO,0x00600800,1,1,wmvsencd.dll,6.01.7600.16385 DV Video Encoder,0x00200000,0,0,qdv.dll,6.06.7601.17514 MJPEG Compressor,0x00200000,0,0,quartz.dll,6.06.7601.17713 Cinepak Codec by Radius,0x00200000,1,1,qcap.dll,6.06.7601.17514 Intel IYUV codec,0x00200000,1,1,qcap.dll,6.06.7601.17514 Intel IYUV codec,0x00200000,1,1,qcap.dll,6.06.7601.17514 Microsoft RLE,0x00200000,1,1,qcap.dll,6.06.7601.17514 Microsoft Video 1,0x00200000,1,1,qcap.dll,6.06.7601.17514 Audio Compressors: WM Speech Encoder DMO,0x00600800,1,1,WMSPDMOE.DLL,6.01.7600.16385 WMAudio Encoder DMO,0x00600800,1,1,WMADMOE.DLL,6.01.7600.16385 IMA ADPCM,0x00200000,1,1,quartz.dll,6.06.7601.17713 PCM,0x00200000,1,1,quartz.dll,6.06.7601.17713 Microsoft ADPCM,0x00200000,1,1,quartz.dll,6.06.7601.17713 GSM 6.10,0x00200000,1,1,quartz.dll,6.06.7601.17713 CCITT A-Law,0x00200000,1,1,quartz.dll,6.06.7601.17713 CCITT u-Law,0x00200000,1,1,quartz.dll,6.06.7601.17713 MPEG Layer-3,0x00200000,1,1,quartz.dll,6.06.7601.17713 Audio Capture Sources: Microphone (High Definition Aud,0x00200000,0,0,qcap.dll,6.06.7601.17514 PBDA CP Filters: PBDA DTFilter,0x00600000,1,1,CPFilters.dll,6.06.7601.17528 PBDA ETFilter,0x00200000,0,0,CPFilters.dll,6.06.7601.17528 PBDA PTFilter,0x00200000,0,0,CPFilters.dll,6.06.7601.17528 Midi Renderers: Default MidiOut Device,0x00800000,1,0,quartz.dll,6.06.7601.17713 Microsoft GS Wavetable Synth,0x00200000,1,0,quartz.dll,6.06.7601.17713 WDM Streaming Capture Devices: HD Audio Microphone 2,0x00200000,1,1,ksproxy.ax,6.01.7601.17514 Integrated Webcam,0x00200000,1,2,ksproxy.ax,6.01.7601.17514 WDM Streaming Rendering Devices: HD Audio Headphone/Speakers,0x00200000,1,1,ksproxy.ax,6.01.7601.17514 HD Audio SPDIF out,0x00200000,1,1,ksproxy.ax,6.01.7601.17514 BDA Network Providers: Microsoft ATSC Network Provider,0x00200000,0,1,MSDvbNP.ax,6.06.7601.17514 Microsoft DVBC Network Provider,0x00200000,0,1,MSDvbNP.ax,6.06.7601.17514 Microsoft DVBS Network Provider,0x00200000,0,1,MSDvbNP.ax,6.06.7601.17514 Microsoft DVBT Network Provider,0x00200000,0,1,MSDvbNP.ax,6.06.7601.17514 Microsoft Network Provider,0x00200000,0,1,MSNP.ax,6.06.7601.17514 Video Capture Sources: Integrated Webcam,0x00200000,1,2,ksproxy.ax,6.01.7601.17514 Multi-Instance Capable VBI Codecs: VBI Codec,0x00600000,1,4,VBICodec.ax,6.06.7601.17514 BDA Transport Information Renderers: BDA MPEG2 Transport Information Filter,0x00600000,2,0,psisrndr.ax,6.06.7601.17669 MPEG-2 Sections and Tables,0x00600000,1,0,Mpeg2Data.ax,6.06.7601.17514 BDA CP/CA Filters: Decrypt/Tag,0x00600000,1,1,EncDec.dll,6.06.7601.17708 
Encrypt/Tag,0x00200000,0,0,EncDec.dll,6.06.7601.17708 PTFilter,0x00200000,0,0,EncDec.dll,6.06.7601.17708 XDS Codec,0x00200000,0,0,EncDec.dll,6.06.7601.17708 WDM Streaming Communication Transforms: Tee/Sink-to-Sink Converter,0x00200000,1,1,ksproxy.ax,6.01.7601.17514 Audio Renderers: Speakers (High Definition Audio,0x00200000,1,0,quartz.dll,6.06.7601.17713 Default DirectSound Device,0x00800000,1,0,quartz.dll,6.06.7601.17713 Default WaveOut Device,0x00200000,1,0,quartz.dll,6.06.7601.17713 Digital Audio (S/PDIF) (High De,0x00200000,1,0,quartz.dll,6.06.7601.17713 DirectSound: Digital Audio (S/PDIF) (High Definition Audio Device),0x00200000,1,0,quartz.dll,6.06.7601.17713 DirectSound: Speakers (High Definition Audio Device),0x00200000,1,0,quartz.dll,6.06.7601.17713 --------------- EVR Power Information --------------- Current Setting: {651288E5-A7ED-4076-A96B-6CC62D848FE1} (Balanced) Quality Flags: 2576 Enabled: Force throttling Allow half deinterlace Allow scaling Decode Power Usage: 100 Balanced Flags: 1424 Enabled: Force throttling Allow batching Force half deinterlace Force scaling Decode Power Usage: 50 PowerFlags: 1424 Enabled: Force throttling Allow batching Force half deinterlace Force scaling Decode Power Usage: 0

    Read the article

  • Oracle Data Integrator 11.1.1.5 Complex Files as Sources and Targets

    - by Alex Kotopoulis
    Overview ODI 11.1.1.5 adds the new Complex File technology for use with file sources and targets. The goal is to read or write file structures that are too complex to be parsed using the existing ODI File technology. This includes: Different record types in one list that use different parsing rules Hierarchical lists, for example customers with nested orders Parsing instructions in the file data, such as delimiter types, field lengths, type identifiers Complex headers such as multiple header lines or parseable information in header Skipping of lines  Conditional or choice fields Similar to the ODI File and XML File technologies, the complex file parsing is done through a JDBC driver that exposes the flat file as relational table structures. Complex files are mapped to one or more table structures, as opposed to the (simple) file technology, which always has a one-to-one relationship between file and table. The resulting set of tables follows the same concept as the ODI XML driver, table rows have additional PK-FK relationships to express hierarchy as well as order values to maintain the file order in the resulting table.   The parsing instruction format used for complex files is the nXSD (native XSD) format that is already in use with Oracle BPEL. This format extends the XML Schema standard by adding additional parsing instructions to each element. Using nXSD parsing technology, the native file is converted into an internal XML format. It is important to understand that the XML is streamed to improve performance; there is no size limitation of the native file based on memory size, the XML data is never fully materialized.  The internal XML is then converted to relational schema using the same mapping rules as the ODI XML driver. How to Create an nXSD file Complex file models depend on the nXSD schema for the given file. This nXSD file has to be created using a text editor or the Native Format Builder Wizard that is part of Oracle BPEL. BPEL is included in the ODI Suite, but not in standalone ODI Enterprise Edition. The nXSD format extends the standard XSD format through nxsd attributes. NXSD is a valid XML Schema, since the XSD standard allows extra attributes with their own namespaces. 
The following is a sample NXSD schema: <?xml version="1.0"?> <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd" elementFormDefault="qualified" xmlns:tns="http://xmlns.oracle.com/pcbpel/demoSchema/csv" targetNamespace="http://xmlns.oracle.com/pcbpel/demoSchema/csv" attributeFormDefault="unqualified" nxsd:encoding="US-ASCII" nxsd:stream="chars" nxsd:version="NXSD"> <xsd:element name="Root">         <xsd:complexType><xsd:sequence>       <xsd:element name="Header">                 <xsd:complexType><xsd:sequence>                         <xsd:element name="Branch" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy=","/>                         <xsd:element name="ListDate" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}"/>                         </xsd:sequence></xsd:complexType>                         </xsd:element>                 </xsd:sequence></xsd:complexType>         <xsd:element name="Customer" maxOccurs="unbounded">                 <xsd:complexType><xsd:sequence>                 <xsd:element name="Name" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy=","/>                         <xsd:element name="Street" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," />                         <xsd:element name="City" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" />                         </xsd:sequence></xsd:complexType>                         </xsd:element>                 </xsd:sequence></xsd:complexType> </xsd:element> </xsd:schema> The nXSD schema annotates elements to describe their position and delimiters within the flat text file. The schema above uses almost exclusively the nxsd:terminatedBy instruction to look for the next terminator chars. There are various constructs in nXSD to parse fixed length fields, look ahead in the document for string occurences, perform conditional logic, use variables to remember state, and many more. nXSD files can either be written manually using an XML Schema Editor or created using the Native Format Builder Wizard. Both Native Format Builder Wizard as well as the nXSD language are described in the Application Server Adapter Users Guide. The way to start the Native Format Builder in BPEL is to create a new File Adapter; in step 8 of the Adapter Configuration Wizard a new Schema for Native Format can be created:   The Native Format Builder guides through a number of steps to generate the nXSD based on a sample native file. If the format is complex, it is often a good idea to “approximate” it with a similar simple format and then add the complex components manually.  The resulting *.xsd file can be copied and used as the format for ODI, other BPEL constructs such as the file adapter definition are not relevant for ODI. Using this technique it is also possible to parse the same file format in SOA Suite and ODI, for example using SOA for small real-time messages, and ODI for large batches. This nXSD schema in this example describes a file with a header row containing data and 3 string fields per row delimited by commas, for example: Redwood City Downtown Branch, 06/01/2011 Ebeneezer Scrooge, Sandy Lane, Atherton Tiny Tim, Winton Terrace, Menlo Park The ODI Complex File JDBC driver exposes the file structure through a set of relational tables with PK-FK relationships. 
The tables for this example are:

Table ROOT (1 row):
- ROOTPK: primary key for the root element
- SNPSFILENAME: name of the file
- SNPSFILEPATH: path of the file
- SNPSLOADDATE: date of load

Table HEADER (1 row):
- ROOTFK: foreign key to the ROOT record
- ROWORDER: order of the row in the native document
- BRANCH: data
- BRANCHORDER: order of Branch within the row
- LISTDATE: data
- LISTDATEORDER: order of ListDate within the row

Table ADDRESS (2 rows):
- ROOTFK: foreign key to the ROOT record
- ROWORDER: order of the row in the native document
- NAME: data
- NAMEORDER: order of Name within the row
- STREET: data
- STREETORDER: order of Street within the row
- CITY: data
- CITYORDER: order of City within the row

Every table has PK and/or FK fields to reflect the document hierarchy through relationships. In this example this is trivial, since the HEADER and all CUSTOMER records point back to the PK of ROOT; deeper nested documents require this to identify parent elements. All tables also have a ROWORDER field to define the order of rows, as well as order fields for each column, in case the order of columns varies in the original document and needs to be maintained. If order is not relevant, these fields can be ignored. (A sample query against these tables is sketched at the end of this article.)
How to Create a Complex File Data Server in ODI
After creating the nXSD file and a test data file, and storing them on a file system accessible to ODI, you can go to the ODI Topology Navigator to create a Data Server and Physical Schema under the Complex File technology. This technology follows the conventions of other ODI technologies and is very similar to the XML technology. The parsing settings, such as the source native file, the nXSD schema file, the root element, as well as the external database, can be set in the JDBC URL (an illustrative URL appears just before the sample error message below). The use of an external database defined by dbprops is optional, but it is strongly recommended for production use; ideally, the staging database should be used for this. Also, when using a complex file exclusively for read purposes, it is recommended to use the ro=true property to ensure the file is not unnecessarily synchronized back from the database when the connection is closed. A data file is always required to be present at the filename path during design time. Without this file, operations like testing the connection, reading the model data, or reverse engineering the model will fail. All properties of the Complex File JDBC driver are documented in the Oracle Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator, Appendix C: Oracle Data Integrator Driver for Complex Files Reference. David Allan has created a great viewlet, Complex File Processing - 0 to 60, which shows the creation of a Complex File data server as well as a model based on this server.
How to Create Models Based on a Complex File Schema
Once the physical schema and logical schema have been created, the Complex File can be used to create a Model as if it were based on a database. When reverse-engineering the Model, data stores (tables) are created for each XSD element of complex type. Use of complex files as sources is straightforward; when using them as targets you have to make sure that all dependent tables have matching PK-FK pairs; the same applies to the XML driver as well.
Debugging and Error Handling
There are different ways to test an nXSD file. The Native Format Builder Wizard can be used even if the nXSD wasn't created in it; it will show issues related to the schema and/or test data. In ODI, the nXSD will be parsed and run against the existing test data file when testing a connection in the Data Server.
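By way of illustration, the parsing settings above are combined into a single driver URL. The exact property names should be taken from Appendix C of the driver reference; the following is only a hypothetical sketch of what such a URL can look like (f = native data file, d = nXSD schema, re = root element, ro = read-only):

jdbc:snps:complexfile?f=/data/customers.dat&d=/data/customers.xsd&re=Root&ro=true

Adding an external database properties file (the dbprops setting mentioned above) would redirect the relational storage from the built-in engine to the staging database.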
If either the nXSD has an error or the data is non-compliant with the schema, an error will be displayed. Sample error message:

Error while reading native data. [Line=1, Col=5] Not enough data available in the input, when trying to read data of length "19" for "element with name D1" from the specified position, using "style" as "fixedLength" and "length" as "". Ensure that there is enough data from the specified position in the input.

Complex File FAQ

Is the size of the native file limited by available memory?
No. Since the native data is streamed through the driver, only the available space in the staging database limits the size of the data. There are limits on individual field sizes, though; a single large object field needs to fit in memory.

Should I always use the complex file driver instead of the file driver in ODI now?
No, use the file technology for all simple file parsing tasks, for example any fixed-length or delimited files that just have one row format and can be mapped into a simple table. Because of its narrow assumptions the ODI file driver is easy to configure within ODI and can stream file data without writing it into a database. The complex file driver should be used whenever the use case cannot be handled through the file driver.

Are we generating XML out of flat files before we write it into a database?
We don't materialize any XML as part of parsing a flat file, either in memory or on disk. The data produced by the XML parser is streamed in Java objects that just use the XSD derived from the nXSD schema as their type system. We use the nXSD schema because it is the standard for describing complex flat file metadata in Oracle Fusion Middleware, and it enables users to share schemas across products.

Is the nXSD file interchangeable with SOA Suite?
Yes, ODI can use the same nXSD files as SOA Suite, allowing mixed use cases with the same data format.

Can I start the Native Format Builder from the ODI Studio?
No, the Native Format Builder has to be started from a JDeveloper instance with BPEL. You can get BPEL as part of the SOA Suite bundle. Users without SOA Suite can develop nXSD files manually using XSD editors.

When is the database data written back to the native file?
Data is synchronized using the SYNCHRONIZE and CREATE FILE commands, and when the JDBC connection is closed. It is recommended to set the ro or read_only property to true when a file is exclusively used for reading, so that no unnecessary write-backs occur.

Is the nXSD metadata part of the ODI Master or Work Repository?
No, the data server definition in the master repository contains only the JDBC URL with file paths; the nXSD files have to be accessible on the file systems where the JDBC driver is executed during production, either by copying them or by using a network file system.

Where can I find sample nXSD files?
The Application Server Adapter Users Guide contains nXSD samples for various different use cases.
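To tie the relational exposure together, here is a minimal query sketch against the example tables listed earlier (ROOT, HEADER and ADDRESS, with the column names described above); it is illustrative only and would be run against the Complex File data server, for example from a SQL client or as the source of an ODI interface:

-- Read the customer rows in file order, paired with the header fields,
-- by following the ROOTFK -> ROOTPK relationships the driver generates.
SELECT h.BRANCH, h.LISTDATE, a.NAME, a.STREET, a.CITY
FROM ROOT r
JOIN HEADER h ON h.ROOTFK = r.ROOTPK
JOIN ADDRESS a ON a.ROOTFK = r.ROOTPK
ORDER BY a.ROWORDER;

Against the sample file shown above, this should return the two customer rows in file order, each paired with the branch and list date parsed from the header line.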

    Read the article

  • The case of the phantom ADF developer (and other yarns)

    - by Chris Muir
A few years of ADF experience means I see common mistakes made by different developers, some of which I regularly make myself. This post is designed to help beginners to the Oracle JDeveloper Application Development Framework (ADF) avoid a common ADF pitfall: the case of the phantom ADF developer [add Scooby-Doo music here].
ADF Business Components - triggers, default table values and instead-of views
Oracle's JDeveloper tutorials help with the A-B-Cs of ADF development, typically built on the nice 'n safe demo schemas provided with the Oracle database, such as the HR demo schema. However it's not too long until ADF beginners, having built up some confidence from learning with the tutorials and vanilla demo schemas, start building ADF Business Components based upon their own existing database schema objects. This is where unexpected problems can sneak in.
The crime
Developers may encounter a surprising error at runtime when editing a record they just created or updated and committed to the database, based on their own existing tables, namely the error:
JBO-25014: Another user has changed the row with primary key oracle.jbo.Key[x]
...where x is the primary key value of the row at hand. In a production environment with multiple users this error may be legit: one of the other users has updated the row since you queried it. Yet in a development environment this error is just plain confusing. If developers are isolated in their own database, creating and editing records they know other users can't possibly be working with, or all the other developers have gone home for the day, how is this error possible? There are no other users? It must be the phantom ADF developer! [insert dramatic music here]
The following picture is what you'll see in the Business Component Browser, and you'll receive a similar error message via an ADF Faces page.
A false conclusion
What can possibly cause this issue if it isn't our phantom ADF developer? Doesn't ADF BC implement record locking, locking database records when the row is modified in the ADF middle tier by a user? How could our phantom ADF developer even take out a lock if that is the case? Maybe ADF has a bug, maybe ADF isn't implementing record locking at all? Shouldn't we see the error "JBO-26030: Failed to lock the record, another user holds the lock" as we attempt to modify the record? Why do we see JBO-25014 instead?
Let's verify that ADF is in fact issuing the correct SQL LOCK-FOR-UPDATE statement to the database. First we need to verify ADF's locking strategy. It is determined by the Application Module's jbo.locking.mode property. The default (as of JDev 11.1.1.4.0, if memory serves me correctly) and recommended value is optimistic, and the other valid value is pessimistic.
Next we need a mechanism to check that ADF is issuing the LOCK statements to the database. We could ask DBAs to monitor locks with OEM, but optimally we'd rather not involve overworked DBAs in this process, so instead we can use the ADF runtime setting -Djbo.debugoutput=console. At runtime this option turns on instrumentation within the ADF BC layer which, among a lot of extra detail displayed in the log window, will show the actual SQL statements issued to the database, including the LOCK statement we're looking to confirm.
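For example, both properties can be supplied as JVM options for a test run (jbo.locking.mode is normally set on the Application Module configuration; passing it as a system property here is just a quick way to experiment, and the exact mechanism depends on your run configuration):

-Djbo.debugoutput=console -Djbo.locking.mode=pessimistic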
Setting our locking mode to pessimistic, opening the Business Components Browser or a JSF page that allows us to edit a record, say the CHARGEABLE field within a BOOKINGS record where BOOKING_NO = 1206, upon editing the record we see, among others, the following log entries:

[421] Built select: 'SELECT BOOKING_NO, EVENT_NO, RESOURCE_CODE, CHARGEABLE, MADE_BY, QUANTITY, COST, STATUS, COMMENTS FROM BOOKINGS Bookings'
[422] Executing LOCK...SELECT BOOKING_NO, EVENT_NO, RESOURCE_CODE, CHARGEABLE, MADE_BY, QUANTITY, COST, STATUS, COMMENTS FROM BOOKINGS Bookings WHERE BOOKING_NO=:1 FOR UPDATE NOWAIT
[423] Where binding param 1: 1206

As can be seen on line 422, a LOCK-FOR-UPDATE is indeed issued to the database. Later, when we commit the record, we see:

[441] OracleSQLBuilder: SAVEPOINT 'BO_SP'
[442] OracleSQLBuilder Executing, Lock 1 DML on: BOOKINGS (Update)
[443] UPDATE buf Bookings>#u SQLStmtBufLen: 210, actual=62
[444] UPDATE BOOKINGS Bookings SET CHARGEABLE=:1 WHERE BOOKING_NO=:2
[445] Update binding param 1: N
[446] Where binding param 2: 1206
[447] BookingsView1 notify COMMIT ...
[448] _LOCAL_VIEW_USAGE_model_Bookings_ResourceTypesView1 notify COMMIT ...
[449] EntityCache close prepared statement

...and as a result the changes are saved to the database, and the lock is released.
Let's see what happens when we use the optimistic locking mode, this time to change the CHARGEABLE column of a BOOKINGS record again. As soon as we edit the record we see little activity in the logs, nothing to indicate any SQL statement, let alone a LOCK, has been issued for the row. However, when we save our record by issuing a commit, the following is recorded in the logs:

[509] OracleSQLBuilder: SAVEPOINT 'BO_SP'
[510] OracleSQLBuilder Executing doEntitySelect on: BOOKINGS (true)
[511] Built select: 'SELECT BOOKING_NO, EVENT_NO, RESOURCE_CODE, CHARGEABLE, MADE_BY, QUANTITY, COST, STATUS, COMMENTS FROM BOOKINGS Bookings'
[512] Executing LOCK...SELECT BOOKING_NO, EVENT_NO, RESOURCE_CODE, CHARGEABLE, MADE_BY, QUANTITY, COST, STATUS, COMMENTS FROM BOOKINGS Bookings WHERE BOOKING_NO=:1 FOR UPDATE NOWAIT
[513] Where binding param 1: 1205
[514] OracleSQLBuilder Executing, Lock 2 DML on: BOOKINGS (Update)
[515] UPDATE buf Bookings>#u SQLStmtBufLen: 210, actual=62
[516] UPDATE BOOKINGS Bookings SET CHARGEABLE=:1 WHERE BOOKING_NO=:2
[517] Update binding param 1: Y
[518] Where binding param 2: 1205
[519] BookingsView1 notify COMMIT ...
[520] _LOCAL_VIEW_USAGE_model_Bookings_ResourceTypesView1 notify COMMIT ...
[521] EntityCache close prepared statement

Again, even though we see the mid-tier delay the LOCK statement until commit time, it does in fact occur on line 512, and the lock is released as part of the commit issued on line 519. Therefore with either optimistic or pessimistic locking a lock is indeed issued.
Our conclusion at this point must be that, unless there's the unlikely case that the LOCK statement never really hits the database, or the even less likely case that the database has a bug, ADF does in fact take out a lock on the record before allowing the current user to update it. So there's no way our phantom ADF developer could even modify the record if he tried without at least someone receiving a lock error.
Hmm, we can only conclude the locking mode is a red herring and not the true cause of our problem.
Who is the phantom?
At this point we'll need to conclude that the error message "JBO-25014: Another user has changed the row" is somehow legit, even though we don't understand yet what's causing it.
This leads onto two further questions: how does ADF know another user has changed the row, and what has been changed anyway?
To answer the first question, the Fusion Guide's section 4.10.11 "How to Protect Against Losing Simultaneously Updated Data", which details the Entity Object Change-Indicator property, gives us the answer:
At runtime the framework provides automatic "lost update" detection for entity objects to ensure that a user cannot unknowingly modify data that another user has updated and committed in the meantime. Typically, this check is performed by comparing the original values of each persistent entity attribute against the corresponding current column values in the database at the time the underlying row is locked. Before updating a row, the entity object verifies that the row to be updated is still consistent with the current state of the database.
The guide further suggests how to make this detection more efficient:
You can make the lost update detection more efficient by identifying any attributes of your entity whose values you know will be updated whenever the entity is modified. Typical candidates include a version number column or an updated date column in the row... To detect whether the row has been modified since the user queried it in the most efficient way, select the Change Indicator option to compare only the change-indicator attribute values.
We now know that ADF BC doesn't use the locking mechanism at all to protect the current user against updates; rather it keeps a copy of the original record fetched, separate from the user-changed version of the record, and it compares the original record against the one in the database when the lock is taken out. If the values don't match, be it with the default compare-all-columns behaviour or the more efficient Change Indicator mechanism, ADF BC will throw the JBO-25014 error.
This leaves one last question. Now we know the mechanism by which ADF identifies a changed row, what we don't know is what's changed and who changed it.
The real culprit
What's changed? We know the record in the mid-tier has been changed by the user, however ADF doesn't use the changed record in the mid-tier to compare to the database record, but rather a copy of the original record before it was changed. This leaves us to conclude the database record has changed, but how and by whom? There are three potential causes:
Database triggers
The database trigger, among other uses, can be configured to fire PL/SQL code on a database table insert, update or delete. In particular, on an insert or update the trigger can override the value assigned to a particular column. The trigger execution is actioned by the database on behalf of the user initiating the insert or update action. Why this causes the issue specific to our ADF use is that when we insert or update a record in the database via ADF, ADF keeps a copy of the record written to the database. However the cached record is instantly out of date, as the database triggers have modified the record that was actually written to the database. Thus when we update the record we just inserted or updated a second time, ADF compares its original copy of the record to that in the database, and it detects the record has been changed, giving us JBO-25014. This is probably the most common cause of this problem.
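To make this concrete, here is a minimal sketch of such a trigger; the table and column names are purely hypothetical, but any trigger that rewrites a column value in this way will produce the behaviour described above:

-- Hypothetical audit trigger: the database silently overwrites two columns
-- on every insert or update, so the copy of the row that ADF cached when it
-- issued the DML is immediately out of date.
CREATE OR REPLACE TRIGGER my_table_biu
BEFORE INSERT OR UPDATE ON my_table
FOR EACH ROW
BEGIN
  :NEW.last_updated := SYSDATE;
  :NEW.updated_by   := USER;
END;
/

On the next update of the same row through ADF, the comparison between ADF's cached copy (holding the old LAST_UPDATED and UPDATED_BY values) and the current database row fails, and JBO-25014 is raised.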
Default values
A second reason this issue can occur is another database feature, default column values. When creating a database table the schema designer can define default values for specific columns. For example a created-date column could default to SYSDATE, or a flag column to Y or N. Default values are only used by the database when a user inserts a new record and the specific column is assigned NULL; the database in this case will overwrite the column with the default value. As per the database trigger section, it then becomes apparent why ADF chokes on this feature, though it can only occur in an insert-commit-update-commit scenario, not in an update-commit-update-commit scenario.
Instead-of-trigger views
I must admit I haven't double-checked this scenario but it seems plausible: that of the Oracle database's instead-of-trigger view (sometimes referred to as an instead-of view). A view in the database is based on a query and, depending on the query's complexity, may support insert, update and delete functionality to a limited degree. In order to support fully insertable, updateable and deletable views, Oracle introduced the instead-of view, which gives the view designer the ability to define not only the view query, but also a set of programmatic PL/SQL triggers in which the developer can define their own logic for inserts, updates and deletes. While this provides the database programmer a very powerful feature, it can cause issues for our ADF application. On inserting or updating a record through the instead-of view, the record and data that go in are not necessarily the data that comes out when ADF compares the records, as the view developer has the option to do practically anything with the incoming data, including throwing it away or pushing it to tables which aren't used by the view's underlying query for fetching the data.
Readers are at this point reminded that this article is specifically about how the JBO-25014 error occurs in the context of one developer on an isolated database. The article is not considering how the error occurs in a production environment where there are multiple users who can cause this error in a legitimate fashion. Assuming none of the above features is the cause of the problem, and optimistic locking is turned on (this error is not possible if pessimistic locking is the mode *and* none of the previous causes are present), JBO-25014 is quite feasible in a production ADF application if two users modify the same record.
At this point, under project timeline pressure, the obvious fix for developers is to drop both the database triggers and the default values from the underlying tables. However we must be careful that these legacy constructs aren't used and assumed to be in place by other legacy systems. Dropping database triggers or default values that existing Oracle Forms applications assume and require to be in place could cause unexpected behaviour and bugs in the Forms applications. Proficient software engineers would recognize that such a change may require a partial or full regression test of the existing legacy system, a potentially costly and time-consuming exercise; not ideal.
Solving the mystery once and for all
Luckily ADF has built-in functionality to deal with this issue, which is not a surprise, as Oracle, the author of ADF, also built the database and is fully aware of the Oracle database's feature set. The fix lies at the Entity Object attribute level, in the Refresh After Insert and Refresh After Update properties.
Simply selecting these instructs ADF BC, after inserting or updating a record in the database, to expect the database to modify the said attributes, and to read a copy of the changed attributes back into its cached mid-tier record. Thus the next time the developer modifies the current record, the comparison between the mid-tier record and the database record matches, and "JBO-25014: Another user has changed the row" is no longer an issue.
[Post edit - as per the comment from Oracle's Steven Davelaar below, who correctly points out that the above solution will not work for instead-of-trigger views, as it relies on the SQL RETURNING clause, which is incompatible with this type of view.]
Alternatively you can set the Change Indicator on one of the attributes. This will work as long as the column backing that attribute isn't itself inadvertently updated in the database. Even then you're possibly just masking the issue rather than solving it, because if another developer later changes or removes the Change Indicator the original issue will return.

    Read the article

  • CodePlex Daily Summary for Wednesday, November 23, 2011

    CodePlex Daily Summary for Wednesday, November 23, 2011Popular ReleasesVisual Leak Detector for Visual C++ 2008/2010: v2.2.1: Enhancements: * strdup and _wcsdup functions support added. * Preliminary support for VS 11 added. Bugs Fixed: * Low performance after upgrading from VLD v2.1. * Memory leaks with static linking fixed (disabled calloc support). * Runtime error R6002 fixed because of wrong memory dump format. * version.h fixed in installer. * Some PVS studio warning fixed.NetSqlAzMan - .NET SQL Authorization Manager: 3.6.0.10: 3.6.0.10 22-Nov-2011 Update: Removed PreEmptive Platform integration (PreEmptive analytics) Removed all PreEmptive attributes Removed PreEmptive.dll assembly references from all projects Added first support to ADAM/AD LDS Thanks to PatBea. Work Item 9775: http://netsqlazman.codeplex.com/workitem/9775Developer Team Article System Management: DTASM v1.3: ?? ??? ???? 3 ????? ???? ???? ????? ??? : - ????? ?????? ????? ???? ?? ??? ???? ????? ?? ??? ? ?? ???? ?????? ???? ?? ???? ????? ?? . - ??? ?? ???? ????? ???? ????? ???? ???? ?? ????? , ?????? ????? ????? ?? ??? . - ??? ??????? ??? ??? ???? ?? ????? ????? ????? .SharePoint 2010 FBA Pack: SharePoint 2010 FBA Pack 1.2.0: Web parts are now fully customizable via html templates (Issue #323) FBA Pack is now completely localizable using resource files. Thank you David Chen for submitting the code as well as Chinese translations of the FBA Pack! The membership request web part now gives the option of having the user enter the password and removing the captcha (Issue # 447) The FBA Pack will now work in a zone that does not have FBA enabled (Another zone must have FBA enabled, and the zone must contain the me...SharePoint 2010 Education Demo Project: Release SharePoint SP1 for Education Solutions: This release includes updates to the Content Packs for SharePoint SP1. All Content Packs have been updated to install successfully under SharePoint SP1SQL Monitor - tracking sql server activities: SQLMon 4.1 alpha 6: 1. improved support for schema 2. added find reference when right click on object list 3. added object rename supportBugNET Issue Tracker: BugNET 0.9.126: First stable release of version 0.9. Upgrades from 0.8 are fully supported and upgrades to future releases will also be supported. This release is now compiled against the .NET 4.0 framework and is a requirement. Because of this the web.config has significantly changed. After upgrading, you will need to configure the authentication settings for user registration and anonymous access again. Please see our installation / upgrade instructions for more details: http://wiki.bugnetproject.c...Anno 2070 Assistant: v0.1.0 (STABLE): Version 0.1.0 Features Production Chains Eco Production Chains (Complete) Tycoon Production Chains (Disabled - Incomplete) Tech Production Chains (Disabled - Incomplete) Supply (Disabled - Incomplete) Calculator (Disabled - Incomplete) Building Layouts Eco Building Layouts (Complete) Tycoon Building Layouts (Disabled - Incomplete) Tech Building Layouts (Disabled - Incomplete) Credits (Complete)Free SharePoint 2010 Sites Templates: SharePoint Server 2010 Sites Templates: here is the list of sites templates to be downloadedVsTortoise - a TortoiseSVN add-in for Microsoft Visual Studio: VsTortoise Build 30 Beta: Note: This release does not work with custom VsTortoise toolbars. These get removed every time when you shutdown Visual Studio. (#7940) Build 30 (beta)New: Support for TortoiseSVN 1.7 added. 
(the download contains both setups, for TortoiseSVN 1.6 and 1.7) New: OpenModifiedDocumentDialog displays conflicted files now. New: OpenModifiedDocument allows to group items by changelist now. Fix: OpenModifiedDocumentDialog caused Visual Studio 2010 to freeze sometimes. Fix: The installer didn...nopCommerce. Open source shopping cart (ASP.NET MVC): nopcommerce 2.30: Highlight features & improvements: • Performance optimization. • Back in stock notifications. • Product special price support. • Catalog mode (based on customer role) To see the full list of fixes and changes please visit the release notes page (http://www.nopCommerce.com/releasenotes.aspx).WPF Converters: WPF Converters V1.2.0.0: support for enumerations, value types, and reference types in the expression converter's equality operators the expression converter now handles DependencyProperty.UnsetValue as argument values correctly (#4062) StyleCop conformance (more or less)Json.NET: Json.NET 4.0 Release 4: Change - JsonTextReader.Culture is now CultureInfo.InvariantCulture by default Change - KeyValurPairConverter no longer cares about the order of the key and value properties Change - Time zone conversions now use new TimeZoneInfo instead of TimeZone Fix - Fixed boolean values sometimes being capitalized when converting to XML Fix - Fixed error when deserializing ConcurrentDictionary Fix - Fixed serializing some Uris returning the incorrect value Fix - Fixed occasional error when...Media Companion: MC 3.423b Weekly: Ensure .NET 4.0 Full Framework is installed. (Available from http://www.microsoft.com/download/en/details.aspx?id=17718) Ensure the NFO ID fix is applied when transitioning from versions prior to 3.416b. (Details here) Replaced 'Rebuild' with 'Refresh' throughout entire code. Rebuild will now be known as Refresh. mc_com.exe has been fully updated TV Show Resolutions... Resolved issue #206 - having to hit save twice when updating runtime manually Shrunk cache size and lowered loading times f...Delta Engine: Delta Engine Beta Preview v0.9.1: v0.9.1 beta release with lots of refactoring, fixes, new samples and support for iOS, Android and WP7 (you need a Marketplace account however). If you want a binary release for the games (like v0.9.0), just say so in the Forum or here and we will quickly prepare one. It is just not much different from v0.9.0, so I left it out this time. See http://DeltaEngine.net/Wiki.Roadmap for details.ASP.net Awesome Samples (Web-Forms): 1.0 samples: Demos and Tutorials for ASP.net Awesome VS2008 are in .NET 3.5 VS2010 are in .NET 4.0 (demos for the ASP.net Awesome jQuery Ajax Controls)SharpMap - Geospatial Application Framework for the CLR: SharpMap-0.9-AnyCPU-Trunk-2011.11.17: This is a build of SharpMap from the 0.9 development trunk as per 2011-11-17 For most applications the AnyCPU release is the recommended, but in case you need an x86 build that is included to. For some dataproviders (GDAL/OGR, SqLite, PostGis) you need to also referense the SharpMap.Extensions assembly For SqlServer Spatial you need to reference the SharpMap.SqlServerSpatial assemblyAJAX Control Toolkit: November 2011 Release: AJAX Control Toolkit Release Notes - November 2011 Release Version 51116November 2011 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4 - Binary – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 - Binary – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). 
Notes: - The current version of the AJAX Control Toolkit is not compatible with ASP.NET 2.0. The latest version that is compatible with ASP.NET 2.0 can be found h...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.36: Fix for issue #16908: string literals containing ASP.NET replacement syntax fail if the ASP.NET code contains the same character as the string literal delimiter. Also, we shouldn't be changing the delimiter for those literals or combining them with other literals; the developer may have specifically chosen the delimiter used because of possible content inserted by ASP.NET code. This logic is normally off; turn it on via the -aspnet command-line flag (or the Code.Settings.AllowEmbeddedAspNetBl...MVC Controls Toolkit: Mvc Controls Toolkit 1.5.5: Added: Now the DateRanteAttribute accepts complex expressions containing "Now" and "Today" as static minimum and maximum. Menu, MenuFor helpers capable of handling a "currently selected element". The developer can choose between using a standard nested menu based on a standard SimpleMenuItem class or specifying an item template based on a custom class. Added also helpers to build the tree structure containing all data items the menu takes infos from. Improved the pager. Now the developer ...New ProjectsActiveWorlds World Server Admin PowerShell SnapIn: The purpose of this PowerShell SnapIn is to provide a set of tools to administer the world server from PowerShell. It leverages the ActiveWorlds SDK .NET Wrapper to provide this functionality.Aigu: Enter special characters like you would on your mobile phone. For instance, if you want to type 'é', you just hold down 'e' and a menu will appear. Selected the desired character using the arrow keys and press 'enter'. Simple but powerful.Are you workaholic?: Are you a workaholic? Did your Doctor advice you not to stare at the computer monitor for a long time? Then this app is perfectly made for you. It runs in the background, and alerts you to take periodic rests for your eyes and body. What's more, It's open source (MS-PL).ATDIS PoC: privateAuto Version Web Assets: The AVWA project is an HTTP Module written in C# that is designed to allow for versioning of various web assets such as .CSS and .JS files. This allows you to publish new versions of these files without having to force the server or the client browsers to expire cache.Bachelor Thesis Algorithm Test Bed: Algorithm Test Bed for my Bachelor ThesisBase64: Simple application helps converting strings and files from or to Base64 string. You can use any encoding to convert while a sidebar previews decoded string for all other encodings.BoracayExpress: BoracayExpressC++ Framework for Test Driven Development: A testing framework for C++ written in C++.Class2Table: Class2Table aka Entity2Table. Easy tool that allows creation of SQL tables from .Net types.Code for Demos & Experiments: This is where I will post code from demos and presentationsCodeMaker: CodeMaker?????????: 1、?????????? 2、???? 3、????? 4、??Python????????? ConsoleCommand: ConsoleCommand provides certain .Net commands for access from javascript console engines. Included are commands to set the text and background colors, as well as list and extract resources compiled in a .Net dll. Converter: Character code conversion tools ???????? CryptoInator - self contained, self-encrypting, self-decrypting image viewer: Original developed to encrypt and store NemID images in Denmark. 
DAiBears: Something, something, botDelicious Notify Plugin: Lets you push a blog post straight to Delicious from Live WriterDeveloperFile: Compresses Javascripts using the YUI .NET project. Loops through the root folder and subfolders for files matching the debug extension and creates new files using the release extension. (File extensions must match exactly).DotNetNuke SharePoint File Explorer: A DotNetNuke SharePoint File ExplorerDouban FM: WP7 Douban FM appGame Lib: Game Library is a open-source game library to allow focusing on the fun part of a game. It is developed in C#, but will be ported to C++ and VB.net.Google reader notes to Delicious Export tool (WPF): Google reader discontinued note in reader features. Current google reader allows to export users old notes in JSON format, This App will parse the JSON file & upload it to it delicious , delicious is a good alternative for note in readerHtml Source Transmitter Control: This web control allows getting a source of a web page, that will displayed before submit. So, developer can store a view of the html page, that was before server exception. It helps to reproduce bugs and can be used with other logging systems.Ideopuzzle: A puzzle gameImageShack-Uploader: This project demonstrates how to upload files automated to imageshack.us and other image hosters with C#.Insert Acronym Tags: Lets you insert <acronym> and <abbr> tags into your blog entry more easily.Insert Quick Link: Allows you to paste a link into the Writer window and have the a window similar to the one in Writer where you can change what text is to appear, open in new window, etc.Insert Video Plugin: Allows you to insert a video into a blog entry from a multitude of different sitesIoCWrap: Provides a wrapper to the various IoC container implementations so that it is possible to switch to a different provider without changing any application code.kaveepoj: sharepoint projectKinect Quiz Engine: Fun quiz game for the Kinect.Klaverjas: Test application for testing different new technologies in .NET (WCF, DataServices, C# stuff, Entity...etc.)Man In The Middle: A cyberpunk themed action with puzzle and strategy elements. Made with XNA as part of a game development course at the IT University of Copenhagen by Bo Bendtsen, Jonas Flensbak, Daniel Kromand, Jess Rahbek & Darryl Woodford.MediaSelektor: Simple tool to select mediasMicajah Mindtouch Deki Wiki Copier: Small C# application to move data between 2 Deki Wiki installs or, more importantly, from a wik.is account to a locally installed systemMineFlagger: MineFlagger is a mine clearing game modeled after Microsoft’s Minesweeper. In addition to standard play, MineFlagger incorporates an AI for fun and training.myXbyqwrhjadsfasfhgf: myXbyqwrhjadsfasfhgfnatoop: natoopNauplius.KeyStore: Provides secure application key storage backed by SQL 2008 and Active Directory.ObjectDB: An object database written using C# 4 and Mono.Cecil.PaceR: PaceR is an attempt to encapsulate a lot of the common code functionality I use on different projects. Instead of recreating functionality from memory or worse, copying from older projects, I'd like to have a central location to maintain this common code. Parseq: Parseq is a Parser Combinator library written in C# (version 2.0).PowerShell Network Adapter Configuration module: PowerShell Network Adapter Configuration module is a PowerShell module which provides functions for managing network adapters using WMI.public traffic tracker: This is a university project for a .net course. 
We develop a public traffic tracker applications for Windows Phone 7 devices, that can give information about the actual positions of the nearest vehicle on a given line. The speciality is that we use only the GPS information of the users' WP7 devices, so this is a completely software solution without any hardware investment. The disatvantage is that for the real operation we would need a lot of active WP7 user.puyo: puyoRadioTroll: Projeto web Radio TrollRead Feed Community: Read Feed CommunityReviewer: Reviewer.dk - Dansk spil og anmeldelsessite.Rollout Sharepoint Solutions - ROSS: ROSS performs the following actions: - Delete sitecollection and restart services - 'Get Latest Version' from SourceSafe - Rebuild Solution - Install all wsp solutions - Create SiteCollections - Check for build en provisioning errors - Send email to developers if errors occurredSchool Management: school managementSQL File Executer: This project is a class library written in c# which is used for executing *.sql files in remote server. Simply one dll file. You include it in your web project, add using statement at the top of your page, pass the parameters inside. Rest, it will do.Startup Manager: Startup Manager launches all startup programs at a managed rate therefore meaning that your computer doesn't crash everytime it starts up and you can use it immediately.stetic: ...Test Infrastructure Guidance: The purpose of this project is to provide guidance to testers in using TFS effectively as an ALM solution. TFS is much more than a simple code repository. Used with Visual Studio it can form a powerful testing solution and remove a lot of pain in dealing with test infrastructure overhead.Tête-à-tête: Tete-a-tete is an address book with a built-in function to send electronic mail over the Internet.Tipeysh! - Add-in that helps you creating C/C++ header files on a single click: Are you also feel miserable when you need to create a new header file in your Visual Studio C/C++ project? Repeatedly choosing "new header file", then writing the annoying (but needed) "#ifndef" section, then writing the class name with it's "private", "protected" and "public" access modifiers... too much clicks and typewriting! Well, there is a solution: Tipeysh! is a simple, easy to use, very handy and configurable Visual Studio Add-In, compatible for both the 2005 and 2008 versions. Once ...UMN Dashboard Project: academic projUsersMOSS: UsersMOSS est une petite application permettant de consulter sur un serveur MOSS les sites web (SPWeb) les users (SPUser), et les groupes (SPGroup). Cette application utilise le modèle objet de MOSS pour inspecter le contenu des objets d'un serveur MOSS. Cette application est loin d'être professionnelle, ou même terminée, mais elle me rend très souvent service. Surtout ne l'utilisez pas sur un serveur de production car le gestion du GC n'est pas faite, ce qui peut provoquer des plantages de v...UtilityLibrary.Win32: UtilityLibrary.Win32UW iLearn: The iLearn activity inference platform is a suite of desktop and mobile tools for logging, modeling, and classifying sensor data for mobile devices. 
It was created at the University of Washington.VsDocGen: Dynamic javascript documentation generation directly from xml comment documented source code.Windows Live Spaces Photo Album plugin: This is going to be a plugin for Windows Live Writer that will allow you to browse a Windows Live Space Photo Album.Windows Live Writer Plugin for Amazon Books using CueCat: This Windows Live Writer Plugin is for users who use WLW and wish to use their CueCat to scan books. ItemLookups are run against Amazon via its AWS and book image, title, author, and publisher is returned. This project was first created by Scott Hanselman on MSDN's Coding4Fun! X7: X7 makes it easier for win7user to clean the system. You'll no longer have to delete useless stuff in your win7. It's developed in bat.xDT - Commander: Using this application, the user can assign shortcuts (short texts) for various links/URLs. These short texts will be typed into a Textbox to then launch/go to the target (similar to the "Run" program in Windows).

    Read the article

  • Another Marketing Conference, part one – the best morning sessions.

    - by Roger Hart
    Yesterday I went to Another Marketing Conference. I honestly can’t tell if the title is just tipping over into smug, but in the balance of things that doesn’t matter, because it was a good conference. There was an enjoyable blend of theoretical and practical, and enough inter-disciplinary spread to keep my inner dilettante grinning from ear to ear. Sure, there was a bumpy bit in the middle, with two back-to-back sales pitches and a rather thin overview of the state of the web. But the signal:noise ratio at AMC2012 was impressively high. Here’s the first part of my write-up of the sessions. It’s a bit of a mammoth. It’s also a bit of a mash-up of what was said and what I thought about it. I’ll add links to the videos and slides from the sessions as they become available. Although it was in the morning session, I’ve not included Vanessa Northam’s session on the power of internal comms to build brand ambassadors. It’ll be in the next roundup, as this is already pushing 2.5k words. First, the important stuff. I was keeping a tally, and nobody said “synergy” or “leverage”. I did, however, hear the term “marketeers” six times. Shame on you – you know who you are. 1 – Branding in a post-digital world, Graham Hales This initially looked like being a sales presentation for Interbrand, but Graham pulled it out of the bag a few minutes in. He introduced a model for brand management that was essentially Plan >> Do >> Check >> Act, with Do and Check rolled up together, and went on to stress that this looks like on overall business management model for a reason. Brand has to be part of your overall business strategy and metrics if you’re going to care about it at all. This was the first iteration of what proved to be one of the event’s emergent themes: do it throughout the stack or don’t bother. Graham went on to remind us that brands, in so far as they are owned at all, are owned by and co-created with our customers. Advertising can offer a message to customers, but they provide the expression of a brand. This was a preface to talking about an increasingly chaotic marketplace, with increasingly hard-to-manage purchase processes. Services like Amazon reviews and TripAdvisor (four presenters would make this point) saturate customers with information, and give them a kind of vigilante power to comment on and define brands. Consequentially, they experience a number of “moments of deflection” in our sales funnels. Our control is lessened, and failure to engage can negatively-impact buying decisions increasingly poorly. The clearest example given was the failure of NatWest’s “caring bank” campaign, where staff in branches, customer support, and online presences didn’t align. A discontinuity of experience basically made the campaign worthless, and disgruntled customers talked about it loudly on social media. This in turn presented an opportunity to engage and show caring, but that wasn’t taken. What I took away was that brand (co)creation is ongoing and needs monitoring and metrics. But reciprocally, given you get what you measure, strategy and metrics must include brand if any kind of branding is to work at all. Campaigns and messages must permeate product and service design. What that doesn’t mean (and Graham didn’t say it did) is putting Marketing at the top of the pyramid, and having them bawl demands at Product Management, Support, and Development like an entitled toddler. It’s going to have to be collaborative, and session 6 on internal comms handled this really well. 
The main thing missing here was substantiating data, and the main question I found myself chewing on was: if we’re building brands collaboratively and in the open, what about the cultural politics of trolling? 2 – Challenging our core beliefs about human behaviour, Mark Earls This was definitely the best show of the day. It was also some of the best content. Mark talked us through nudging, behavioural economics, and some key misconceptions around decision making. Basically, people aren’t rational, they’re petty, reactive, emotional sacks of meat, and they’ll go where they’re led. Comforting stuff. Examples given were the spread of the London Riots and the “discovery” of the mountains of Kong, and the popularity of Susan Boyle, which, in turn made me think about Per Mollerup’s concept of “social wayshowing”. Mark boiled his thoughts down into four key points which I completely failed to write down word for word: People do, then think – Changing minds to change behaviour doesn’t work. Post-rationalization rules the day. See also: mere exposure effects. Spock < Kirk - Emotional/intuitive comes first, then we rationalize impulses. The non-thinking, emotive, reactive processes run much faster than the deliberative ones. People are not really rational decision makers, so  intervening with information may not be appropriate. Maximisers or satisficers? – Related to the last point. People do not consistently, rationally, maximise. When faced with an abundance of choice, they prefer to satisfice than evaluate, and will often follow social leads rather than think. Things tend to converge – Behaviour trends to a consensus normal. When faced with choices people overwhelmingly just do what they see others doing. Humans are extraordinarily good at mirroring behaviours and receiving influence. People “outsource the cognitive load” of choices to the crowd. Mark’s headline quote was probably “the real influence happens at the table next to you”. Reference examples, word of mouth, and social influence are tremendously important, and so talking about product experiences may be more important than talking about products. This reminded me of Kathy Sierra’s “creating bad-ass users” concept of designing to make people more awesome rather than products they like. If we can expose user-awesome, and make sharing easy, we can normalise the behaviours we want. If we normalize the behaviours we want, people should make and post-rationalize the buying decisions we want.  Where we need to be: “A bigger boy made me do it” Where we are: “a wizard did it and ran away” However, it’s worth bearing in mind that some purchasing decisions are personal and informed rather than social and reactive. There’s a quadrant diagram, in fact. What was really interesting, though, towards the end of the talk, was some advice for working out how social your products might be. The standard technology adoption lifecycle graph is essentially about social product diffusion. So this idea isn’t really new. Geoffrey Moore’s “chasm” idea may not strictly apply. However, his concepts of beachheads and reference segments are exactly what is required to normalize and thus enable purchase decisions (behaviour change). The final thing is that in only very few categories does a better product actually affect purchase decision. Where the choice is personal and informed, this is true. But where it’s personal and impulsive, or in any way social, “better” is trumped by popularity, endorsement, or “point of sale salience”. 
UX, UCD, and e-commerce know this to be true. A better (and easier) experience will always beat “more features”. Easy to use, and easy to observe being used will beat “what the user says they want”. This made me think about the astounding stickiness of rational fallacies, “common sense” and the pathological willful simplifications of the media. Rational fallacies seem like they’re basically the heuristics we use for post-rationalization. If I were profoundly grimy and cynical, I’d suggest deploying a boat-load in our messaging, to see if they’re really as sticky and appealing as they look. 4 – Changing behaviour through communication, Stephen Donajgrodzki This was a fantastic follow up to Mark’s session. Stephen basically talked us through some tactics used in public information/health comms that implement the kind of behavioural theory Mark introduced. The session was largely about how to get people to do (good) things they’re predisposed not to do, and how communication can (and can’t) make positive interventions. A couple of things stood out, in particular “implementation intentions” and how they can be linked to goals. For example, in order to get people to check and test their smoke alarms (a goal intention, rarely actualized  an information campaign will attempt to link this activity to the clocks going back or forward (a strong implementation intention, well-actualized). The talk reinforced the idea that making behaviour changes easy and visible normalizes them and makes them more likely to succeed. To do this, they have to be embodied throughout a product and service cycle. Experiential disconnects undermine the normalization. So campaigns, products, and customer interactions must be aligned. This is underscored by the second section of the presentation, which talked about interventions and pre-conditions for change. Taking the examples of drug addiction and stopping smoking, Stephen showed us a framework for attempting (and succeeding or failing in) behaviour change. He noted that when the change is something people fundamentally want to do, and that is easy, this gets a to simpler. Coordinated, easily-observed environmental pressures create preconditions for change and build motivation. (price, pub smoking ban, ad campaigns, friend quitting, declining social acceptability) A triggering even leads to a change attempt. (getting a cold and panicking about how bad the cough is) Interventions can be made to enable an attempt (NHS services, public information, nicotine patches) If it succeeds – yay. If it fails, there’s strong negative enforcement. Triggering events seem largely personal, but messaging can intervene in the creation of preconditions and in supporting decisions. Stephen talked more about systems of thinking and “bounded rationality”. The idea being that to enable change you need to break through “automatic” thinking into “reflective” thinking. Disruption and emotion are great tools for this, but that is only the start of the process. It occurs to me that a great deal of market research is focused on determining triggers rather than analysing necessary preconditions. Although they are presumably related. The final section talked about setting goals. Marketing goals are often seen as deriving directly from business goals. However, marketing may be unable to deliver on these directly where decision and behaviour-change processes are involved. In those cases, marketing and communication goals should be to create preconditions. They should also consider priming and norms. 
Content marketing and brand awareness are good first steps here, as brands can be heuristics in decision making for choice-saturated consumers, or those seeking education. 5 – The power of engaged communities and how to build them, Harriet Minter (the Guardian) The meat of this was that you need to let communities define and establish themselves, and be quick to react to their needs. Harriet had been in charge of building the Guardian’s community sites, and learned a lot about how they come together, stabilize  grow, and react. Crucially, they can’t be about sales or push messaging. A community is not just an audience. It’s essential to start with what this particular segment or tribe are interested in, then what they want to hear. Eventually you can consider – in light of this – what they might want to buy, but you can’t start with the product. A community won’t cohere around one you’re pushing. Her tips for community building were (again, sorry, not verbatim): Set goals Have some targets. Community building sounds vague and fluffy, but you can have (and adjust) concrete goals. Think like a start-up This is the “lean” stuff. Try things, fail quickly, respond. Don’t restrict platforms Let the audience choose them, and be aware of their differences. For example, LinkedIn is very different to Twitter. Track your stats Related to the first point. Keeping an eye on the numbers lets you respond. They should be qualified, however. If you want a community of enterprise decision makers, headcount alone may be a bad metric – have you got CIOs, or just people who want to get jobs by mingling with CIOs? Build brand advocates Do things to involve people and make them awesome, and they’ll cheer-lead for you. The last part really got my attention. Little bits of drive-by kindness go a long way. But more than that, genuinely helping people turns them into powerful advocates. Harriet gave an example of the Guardian engaging with an aspiring journalist on its Q&A forums. Through a series of serendipitous encounters he became a BBC producer, and now enthusiastically speaks up for the Guardian community sites. Cultivating many small, authentic, influential voices may have a better pay-off than schmoozing the big guys. This could be particularly important in the context of Mark and Stephen’s models of social, endorsement-led, and example-led decision making. There’s a lot here I haven’t covered, and it may be worth some follow-up on community building. Thoughts I was quite sceptical of nudge theory and behavioural economics. First off it sounds too good to be true, and second it sounds too sinister to permit. But I haven’t done the background reading. So I’m going to, and if it seems to hold real water, and if it’s possible to do it ethically (Stephen’s presentations suggests it may be) then it’s probably worth exploring. The message seemed to be: change what people do, and they’ll work out why afterwards. Moreover, the people around them will do it too. Make the things you want them to do extraordinarily easy and very, very visible. Normalize and support the decisions you want them to make, and they’ll make them. In practice this means not talking about the thing, but showing the user-awesome. Glib? Perhaps. But it feels worth considering. Also, if I ever run a marketing conference, I’m going to ban speakers from using examples from Apple. Quite apart from not being consistently generalizable, it’s becoming an irritating cliché.

    Read the article

  • SQLAuthority News – Job Interviewing the Right Way (and for the Right Reasons) – Guest Post by Feodor Georgiev

    - by pinaldave
    Feodor Georgiev is a SQL Server database specialist with extensive experience of thinking both within and outside the box. He has wide experience of different systems and solutions in the fields of architecture, scalability, performance, etc. Feodor has experience with SQL Server 2000 and later versions, and is certified in SQL Server 2008. Feodor has written excellent article on Job Interviewing the Right Way. Here is his article in his own language. A while back I was thinking to start a blog post series on interviewing and employing IT personnel. At that time I had just read the ‘Smart and gets things done’ book (http://www.joelonsoftware.com/items/2007/06/05.html) and I was hyped up on some debatable topics regarding finding and employing the best people in the branch. I have no problem with hiring the best of the best; it’s just the definition of ‘the best of the best’ that makes things a bit more complicated. One of the fundamental books one can read on the topic of interviewing is the one mentioned above. If you have not read it, then you must do so; not because it contains the ultimate truth, and not because it gives the answers to most questions on the subject, but because the book contains an extensive set of questions about interviewing and employing people. Of course, a big part of these questions have different answers, depending on location, culture, available funds and so on. (What works in the US may not necessarily work in the Nordic countries or India, or it may work in a different way). The only thing that is valid regardless of any external factor is this: curiosity. In my belief there are two kinds of people – curious and not-so-curious; regardless of profession. Think about it – professional success is directly proportional to the individual’s curiosity + time of active experience in the field. (I say ‘active experience’ because vacations and any distractions do not count as experience :)  ) So, curiosity is the factor which will distinguish a good employee from the not-so-good one. But let’s shift our attention to something else for now: a few tips and tricks for successful interviews. Tip and trick #1: get your priorities straight. Your status usually dictates your priorities; for example, if the person looking for a job has just relocated to a new country, they might tend to ignore some of their priorities and overload others. In other words, setting priorities straight means to define the personal criteria by which the interview process is lead. For example, similar to the following questions can help define the criteria for someone looking for a job: How badly do I need a (any) job? Is it more important to work in a clean and quiet environment or is it important to get paid well (or both, if possible)? And so on… Furthermore, before going to the interview, the candidate should have a list of priorities, sorted by the most importance: e.g. I want a quiet environment, x amount of money, great helping boss, a desk next to a window and so on. Also it is a good idea to be prepared and know which factors can be compromised and to what extent. Tip and trick #2: the interview is a two-way street. A job candidate should not forget that the interview process is not a one-way street. What I mean by this is that while the employer is interviewing the potential candidate, the job seeker should not miss the chance to interview the employer. Usually, the employer and the candidate will meet for an interview and talk about a variety of topics. 
In a quality interview the candidate will be introduced to key members of the team and will have the opportunity to ask them questions. By asking the right questions, both parties form an opinion about each other. For example, if the candidate talks to one of the potential bosses during the interview process and notices that the potential manager has a hard time formulating a question, then it is up to the candidate to decide whether working with such a person is a red flag for them. There are as many interview processes out there as there are companies, and each one is different. Some bigger companies and corporations can afford pre-selection processes and three or even four stages of interviews; small companies usually settle for a single interview. Some companies even give cognitive tests during the interview. Why not? In his book Joel suggests that a good candidate should be pampered and spoiled beyond belief with a week-long vacation in New York, fancy hotels, food and who knows what. For all I can imagine, an interview might even take place at the top of the Eiffel Tower (right, Mr. Joel, right?). I doubt, however, that this is the optimal way to capture the attention of a good employee.

The 'curiosity' topic. What I have learned so far in my professional experience is that opinions can be subjective, and opinions on technology subjects are no exception. According to Joel, only hiring the best of the best is worth it. If you ask me, there is no such thing as the best of the best, simply because human nature (well, aside from some physical limitations, like putting your pants on through your head :) ) has no boundaries. And why would it have boundaries? I have seen many curious and interesting people who are naturally good at technology yet as uninterested in it as one can possibly be; I have also seen plenty of people interested in technology who (in an ideal world) should have stayed far away from it. At any rate, all of this comes down in the end to the 'supply and demand' factor. The big bang of the interview process boils down to this: if there is a mutual benefit for both the employer and the potential employee in working together, then everything sorts itself out nicely. If there is no benefit, then it is much harder to get to a common place.

Tip and trick #3: Word-of-mouth is worth a thousand words. Here I would just mention that the best thing a job candidate can get during the interview process is access to future team members or other employees of the new company. Nowadays the world has become quite small and everyone knows everyone. Look at LinkedIn, look at other professional networks, and you will realize how small the world really is. Knowing people is a good way to become more approachable and to approach them.

Tip and trick #4: Be confident. It is true that for some people confidence is as natural as breathing, while others have to work hard to express it. Confidence is, however, a key factor in convincing the other side (potential employer or employee) that there is a great chance of success in working together. But it cannot get you very far if it is not backed up by talent, curiosity and knowledge.

Tip and trick #5: The right reasons. What really bothers me in Sweden (and I am sure there are similar situations in other countries) is that there is a tendency to fill quotas and to filter out candidates by criteria other than their skill and knowledge. In job ads I quite often see the phrases 'positive thinker', 'team player' and many similar hints about personality features.
So my guess here is that discrimination has evolved to a new level. Let me clear up the definition of discrimination: 'unfair treatment of a person or group on the basis of prejudice'. And prejudice is 'partiality that prevents objective consideration of an issue or situation'. In other words, there is not much difference whether a job candidate is filtered out by race, gender, or personality features; it is all a bad habit. And in reality, there is no proven correlation between technology knowledge and skills on the one hand and personal features (gender, race, age, optimism) on the other. It is true that a significantly greater number of Darwin Awards have been given to men than to women, but I am sure that somewhere there is a paper or theory explaining the genetics behind this. :)

This topic actually brings to mind one of my favorite work-related stories. A while back I was working for a big company with many teams involved in its processes. One of the teams occupied two rooms: one housed the team members and was full of light, colorful posters, chit-chat and giggles, whereas the other room was dark, lit only by a single monitor with a quiet person in front of it. Later on I realized that the 'dark room' person was the guru and the ultimate problem-solving brain, who did not like the chat and giggles and hence sat in a separate room. In reality, all severe problems which the chatty and cheerful team members could not solve, and all emergencies, were directed to 'the dark room'. And thus all worked out well. The moral of the story: personality has nothing to do with technology knowledge and skills. End of story.

Summary: I'd like to stress that there is no ultimately perfect candidate for a job, and there is no such thing as 'best of the best'. From my personal experience, the main criterion by which I measure people (co-workers and bosses) is the curiosity factor; I know from experience that the more curious and inventive a person is, the better the chances are for great achievements in their field.

Related stories (for extra credit):

1) Get your priorities straight. A while back, as a consultant, I was working a few days at a time at different offices and for different clients, and so I was able to compare and analyze the work environments. There were two different places which I compared, and recently I asked a friend of mine the following question: "Which one would you prefer as a work environment: a noisy office full of people, or a quiet office full of foul smells because the office is rarely cleaned?" My friend was puzzled for a while, thought about it and said: "Hmm, you are talking about two different kinds of pollution… I will probably choose the second, since I can clean the workplace myself a bit…"

2) The interview is a two-way street. One time, during a job interview, I met a potential boss who had a hard time phrasing a question. At that particular time it was clear to me that I would not have liked to work under this person. According to my work religion, a properly asked question contains at least half of the answer. And if I work with someone who cannot ask a question… then I'd be doing double or triple work. At another interview, after the technical part with the team leader of the department, I was introduced to one of the team members and we were left alone for 5 minutes.
I immediately jumped at the opportunity and asked the blunt question: 'What have you learned here over the past year, and how do you like your job?' The team member looked at me and said, 'Nothing really. I like playing with my cats at home, so I am out of here at 5 pm and I don't have time for much.' I was disappointed at the time and did not take the job offer. I wasn't that shocked when, a few months later, the company went bankrupt.

3) The right reasons to take a job: personality check. A while back I was asked to serve as a job reference for a coworker. I agreed, and after some weeks I got a phone call from the company where my colleague was applying for a job. The conversation started with the manager's questions about my colleague's personality and social skills. (You can probably guess what my internal reaction was… :) ) So, after 30 minutes of pouring common sense into the interviewer's head, we finally agreed that a shy or quiet personality has nothing to do with work skills and knowledge. Some years down the road, my former colleague took the manager's position when the manager was demoted to a different department.

Reference: Feodor Georgiev, Pinal Dave (http://blog.SQLAuthority.com)

Filed under: PostADay, Readers Contribution, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

< Previous Page | 375 376 377 378 379 380 381 382  | Next Page >