Search Results

Search found 14910 results on 597 pages for 'programs and features'.

  • What strategy do you use for package naming in Java projects and why?

    - by Tim Visher
    I thought about this awhile ago and it recently resurfaced as my shop is doing its first real Java web app. As an intro, I see two main package naming strategies. (To be clear, I'm not referring to the whole 'domain.company.project' part of this, I'm talking about the package convention beneath that.) Anyway, the package naming conventions that I see are as follows: Functional: Naming your packages according to their function architecturally rather than their identity according to the business domain. Another term for this might be naming according to 'layer'. So, you'd have a *.ui package and a *.domain package and a *.orm package. Your packages are horizontal slices rather than vertical. This is much more common than logical naming. In fact, I don't believe I've ever seen or heard of a project that does this. This of course makes me leery (sort of like thinking that you've come up with a solution to an NP problem) as I'm not terribly smart and I assume everyone must have great reasons for doing it the way they do. On the other hand, I'm not opposed to people just missing the elephant in the room and I've never heard a an actual argument for doing package naming this way. It just seems to be the de facto standard. Logical: Naming your packages according to their business domain identity and putting every class that has to do with that vertical slice of functionality into that package. I have never seen or heard of this, as I mentioned before, but it makes a ton of sense to me. I tend to approach systems vertically rather than horizontally. I want to go in and develop the Order Processing system, not the data access layer. Obviously, there's a good chance that I'll touch the data access layer in the development of that system, but the point is that I don't think of it that way. What this means, of course, is that when I receive a change order or want to implement some new feature, it'd be nice to not have to go fishing around in a bunch of packages in order to find all the related classes. Instead, I just look in the X package because what I'm doing has to do with X. From a development standpoint, I see it as a major win to have your packages document your business domain rather than your architecture. I feel like the domain is almost always the part of the system that's harder to grok where as the system's architecture, especially at this point, is almost becoming mundane in its implementation. The fact that I can come to a system with this type of naming convention and instantly from the naming of the packages know that it deals with orders, customers, enterprises, products, etc. seems pretty darn handy. It seems like this would allow you to take much better advantage of Java's access modifiers. This allows you to much more cleanly define interfaces into subsystems rather than into layers of the system. So if you have an orders subsystem that you want to be transparently persistent, you could in theory just never let anything else know that it's persistent by not having to create public interfaces to its persistence classes in the dao layer and instead packaging the dao class in with only the classes it deals with. Obviously, if you wanted to expose this functionality, you could provide an interface for it or make it public. It just seems like you lose a lot of this by having a vertical slice of your system's features split across multiple packages. I suppose one disadvantage that I can see is that it does make ripping out layers a little bit more difficult. 
Instead of just deleting or renaming a package and dropping a new one in its place with an alternate technology, you have to go in and change all of the classes in all of the packages. However, I don't see this as a big deal. It may be from a lack of experience, but I have to imagine that the number of times you swap out technologies pales in comparison to the number of times you go in and edit vertical feature slices within your system. So I guess the question goes out to you: how do you name your packages, and why? Please understand that I don't necessarily think I've stumbled onto the golden goose here. I'm pretty new to all this, with mostly academic experience, but I can't spot the holes in my reasoning, so I'm hoping you all can so that I can move on. Thanks in advance!
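
    To make the contrast concrete, here is a minimal sketch in Java of the feature-oriented ("logical") layout described above, with the layer-oriented alternative shown in comments; all package and class names are invented for illustration. The point of interest is the package-private DAO, which is exactly the access-modifier advantage mentioned in the question.

        package com.example.app.orders;

        // Feature-oriented layout: everything for the Orders feature shares one package,
        // so the persistence class can stay package-private.
        public class OrderService {
            private final OrderDao dao = new OrderDao();   // reachable: same package

            public Order find(long id) {
                return dao.load(id);
            }
        }

        class Order {
            private final long id;
            Order(long id) { this.id = id; }
            public long getId() { return id; }
        }

        class OrderDao {   // package-private: persistence stays an implementation detail of the feature
            Order load(long id) {
                // JDBC/ORM access would go here
                return new Order(id);
            }
        }

        // Layer-oriented ("functional") layout, for contrast -- the same classes split horizontally:
        //   com.example.app.ui.OrderScreen
        //   com.example.app.domain.Order
        //   com.example.app.orm.OrderDao   // has to be public so the other layers can reach it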

    Read the article

  • [SOLVED] iPhone NSXMLParser NSCFString memory leak

    - by atticusalien
    I am building an app that parses an rss feed. In the app there are two different types of feeds with different names for the elements in the feed, so I have created an NSXMLParser NSObject that takes the name of the elements of each feed before parsing. Here is my code: NewsFeedParser.h #import @interface NewsFeedParser : NSObject { NSInteger NewsSelectedCategory; NSXMLParser *NSXMLNewsParser; NSMutableArray *newsCategories; NSMutableDictionary *NewsItem; NSMutableString *NewsCurrentElement, *NewsCurrentElement1, *NewsCurrentElement2, *NewsCurrentElement3; NSString *NewsItemType, *NewsElement1, *NewsElement2, *NewsElement3; NSInteger NewsNumElements; } - (void) parseXMLFileAtURL:(NSString *)URL; @property(nonatomic, retain) NSString *NewsItemType; @property(nonatomic, retain) NSString *NewsElement1; @property(nonatomic, retain) NSString *NewsElement2; @property(nonatomic, retain) NSString *NewsElement3; @property(nonatomic, retain) NSMutableArray *newsCategories; @property(assign, nonatomic) NSInteger NewsNumElements; @end NewsFeedParser.m #import "NewsFeedParser.h" @implementation NewsFeedParser @synthesize NewsItemType; @synthesize NewsElement1; @synthesize NewsElement2; @synthesize NewsElement3; @synthesize newsCategories; @synthesize NewsNumElements; - (void)parserDidStartDocument:(NSXMLParser *)parser{ } - (void)parseXMLFileAtURL:(NSString *)URL { newsCategories = [[NSMutableArray alloc] init]; URL = [URL stringByReplacingOccurrencesOfString:@" " withString:@""]; URL = [URL stringByReplacingOccurrencesOfString:@"\n" withString:@""]; URL = [URL stringByReplacingOccurrencesOfString:@" " withString:@""]; //you must then convert the path to a proper NSURL or it won't work NSURL *xmlURL = [NSURL URLWithString:URL]; // here, for some reason you have to use NSClassFromString when trying to alloc NSXMLParser, otherwise you will get an object not found error // this may be necessary only for the toolchain [[NSURLCache sharedURLCache] setMemoryCapacity:0]; [[NSURLCache sharedURLCache] setDiskCapacity:0]; NSXMLNewsParser = [[NSXMLParser alloc] initWithContentsOfURL:xmlURL]; // Set self as the delegate of the parser so that it will receive the parser delegate methods callbacks. [NSXMLNewsParser setDelegate:self]; // Depending on the XML document you're parsing, you may want to enable these features of NSXMLParser. [NSXMLNewsParser setShouldProcessNamespaces:NO]; [NSXMLNewsParser setShouldReportNamespacePrefixes:NO]; [NSXMLNewsParser setShouldResolveExternalEntities:NO]; [NSXMLNewsParser parse]; [NSXMLNewsParser release]; } - (void)parser:(NSXMLParser *)parser parseErrorOccurred:(NSError *)parseError { NSString * errorString = [NSString stringWithFormat:@"Unable to download story feed from web site (Error code %i )", [parseError code]]; NSLog(@"error parsing XML: %@", errorString); UIAlertView * errorAlert = [[UIAlertView alloc] initWithTitle:@"Error loading content" message:errorString delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil]; [errorAlert show]; [errorAlert release]; [errorString release]; } - (void)parser:(NSXMLParser *)parser didStartElement:(NSString *)elementName namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName attributes:(NSDictionary *)attributeDict{ NewsCurrentElement = [elementName copy]; if ([elementName isEqualToString:NewsItemType]) { // clear out our story item caches... 
NewsItem = [[NSMutableDictionary alloc] init]; NewsCurrentElement1 = [[NSMutableString alloc] init]; NewsCurrentElement2 = [[NSMutableString alloc] init]; if(NewsNumElements == 3) { NewsCurrentElement3 = [[NSMutableString alloc] init]; } } } - (void)parser:(NSXMLParser *)parser didEndElement:(NSString *)elementName namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName{ if ([elementName isEqualToString:NewsItemType]) { // save values to an item, then store that item into the array... [NewsItem setObject:NewsCurrentElement1 forKey:NewsElement1]; [NewsItem setObject:NewsCurrentElement2 forKey:NewsElement2]; if(NewsNumElements == 3) { [NewsItem setObject:NewsCurrentElement3 forKey:NewsElement3]; } [newsCategories addObject:[[NewsItem copy] autorelease]]; [NewsCurrentElement release]; [NewsCurrentElement1 release]; [NewsCurrentElement2 release]; if(NewsNumElements == 3) { [NewsCurrentElement3 release]; } [NewsItem release]; } } - (void)parser:(NSXMLParser *)parser foundCharacters:(NSString *)string { //NSLog(@"found characters: %@", string); // save the characters for the current item... if ([NewsCurrentElement isEqualToString:NewsElement1]) { [NewsCurrentElement1 appendString:string]; } else if ([NewsCurrentElement isEqualToString:NewsElement2]) { [NewsCurrentElement2 appendString:string]; } else if (NewsNumElements == 3 && [NewsCurrentElement isEqualToString:NewsElement3]) { [NewsCurrentElement3 appendString:string]; } } - (void)dealloc { [super dealloc]; [newsCategories release]; [NewsItemType release]; [NewsElement1 release]; [NewsElement2 release]; [NewsElement3 release]; } When I create an instance of the class I do like so: NewsFeedParser *categoriesParser = [[NewsFeedParser alloc] init]; if(newsCat == 0) { categoriesParser.NewsItemType = @"article"; categoriesParser.NewsElement1 = @"category"; categoriesParser.NewsElement2 = @"catid"; } else { categoriesParser.NewsItemType = @"article"; categoriesParser.NewsElement1 = @"category"; categoriesParser.NewsElement2 = @"feedUrl"; } [categoriesParser parseXMLFileAtURL:feedUrl]; newsCategories = [[NSMutableArray alloc] initWithArray:categoriesParser.newsCategories copyItems:YES]; [self.tableView reloadData]; [categoriesParser release]; If I run the app with the leaks instrument, the leaks point to the [NSXMLNewsParser parse] call in the NewsFeedParser.m. Here is a screen shot of the Leaks instrument with the NSCFStrings leaking: http://img139.imageshack.us/img139/3997/leaks.png For the life of me I can't figure out where these leaks are coming from. Any help would be greatly appreciated.
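
    For comparison, the same accumulate-per-element pattern looks roughly like this in Java with SAX (the class name, feed-URL handling, and element names below are placeholders, not taken from the question). It is only a sketch of the parsing pattern; because Java is garbage-collected it has none of the retain/release bookkeeping that the leak hunt above is about.

        import java.net.URL;
        import java.util.ArrayList;
        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;
        import javax.xml.parsers.SAXParserFactory;
        import org.xml.sax.Attributes;
        import org.xml.sax.helpers.DefaultHandler;

        // Collects one Map per <article> element, keyed by the child elements we care about.
        public class NewsFeedHandler extends DefaultHandler {
            private static final String ITEM_ELEMENT = "article";                    // placeholder
            private static final List<String> FIELDS = List.of("category", "catid"); // placeholder

            private final List<Map<String, String>> items = new ArrayList<>();
            private Map<String, String> currentItem;
            private StringBuilder currentText = new StringBuilder();

            @Override
            public void startElement(String uri, String local, String qName, Attributes atts) {
                if (ITEM_ELEMENT.equals(qName)) {
                    currentItem = new HashMap<>();
                }
                currentText = new StringBuilder();   // fresh buffer for each element's text
            }

            @Override
            public void characters(char[] ch, int start, int length) {
                currentText.append(ch, start, length);
            }

            @Override
            public void endElement(String uri, String local, String qName) {
                if (currentItem != null && FIELDS.contains(qName)) {
                    currentItem.put(qName, currentText.toString());
                } else if (ITEM_ELEMENT.equals(qName) && currentItem != null) {
                    items.add(currentItem);          // commit the finished item
                    currentItem = null;
                }
            }

            public static List<Map<String, String>> parse(String feedUrl) throws Exception {
                NewsFeedHandler handler = new NewsFeedHandler();
                SAXParserFactory.newInstance().newSAXParser()
                                .parse(new URL(feedUrl).openStream(), handler);
                return handler.items;
            }
        }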

    Read the article

  • How-to configure Spring Social via XML

    - by Matthias Steiner
    I spend a few hours trying to get Twitter integration to work with Spring Social using the XML configuration approach. All the examples I could find on the web (and on stackoverflow) always use the @Config approach as shown in the samples For whatever reason the bean definition to get an instance to the twitter API throws an AOP exception: Caused by: java.lang.IllegalStateException: Cannot create scoped proxy for bean 'scopedTarget.twitter': Target type could not be determined at the time of proxy creation. Here's the complete config file I have: <?xml version="1.0" encoding="UTF-8"?> <beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:jaxrs="http://cxf.apache.org/jaxrs" xmlns:context="http://www.springframework.org/schema/context" xmlns:util="http://www.springframework.org/schema/util" xmlns:cxf="http://cxf.apache.org/core" xmlns:aop="http://www.springframework.org/schema/aop" xmlns:jee="http://www.springframework.org/schema/jee" xmlns:mvc="http://www.springframework.org/schema/mvc" xmlns:jdbc="http://www.springframework.org/schema/jdbc" xsi:schemaLocation=" http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd http://cxf.apache.org/jaxrs http://cxf.apache.org/schemas/jaxrs.xsd http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util-3.1.xsd http://cxf.apache.org/core http://cxf.apache.org/schemas/core.xsd http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop-3.1.xsd http://www.springframework.org/schema/jee http://www.springframework.org/schema/jee/spring-jee-3.1.xsd http://www.springframework.org/schema/mvc http://www.springframework.org/schema/mvc/spring-mvc-3.1.xsd http://www.springframework.org/schema/jdbc http://www.springframework.org/schema/jdbc/spring-jdbc-3.1.xsd"> <import resource="classpath:META-INF/cxf/cxf.xml" /> <import resource="classpath:META-INF/cxf/cxf-servlet.xml" /> <jee:jndi-lookup id="dataSource" jndi-name="java:comp/env/jdbc/DefaultDB" /> <!-- initialize DB required to store user auth tokens --> <jdbc:initialize-database data-source="dataSource" ignore-failures="ALL"> <jdbc:script location="classpath:/org/springframework/social/connect/jdbc/JdbcUsersConnectionRepository.sql"/> </jdbc:initialize-database> <bean id="connectionFactoryLocator" class="org.springframework.social.connect.support.ConnectionFactoryRegistry"> <property name="connectionFactories"> <list> <ref bean="twitterConnectFactory" /> </list> </property> </bean> <bean id="twitterConnectFactory" class="org.springframework.social.twitter.connect.TwitterConnectionFactory"> <constructor-arg value="xyz" /> <constructor-arg value="xzy" /> </bean> <bean id="usersConnectionRepository" class="org.springframework.social.connect.jdbc.JdbcUsersConnectionRepository"> <constructor-arg ref="dataSource" /> <constructor-arg ref="connectionFactoryLocator" /> <constructor-arg ref="textEncryptor" /> </bean> <bean id="connectionRepository" factory-method="createConnectionRepository" factory-bean="usersConnectionRepository" scope="request"> <constructor-arg value="#{request.userPrincipal.name}" /> <aop:scoped-proxy proxy-target-class="false" /> </bean> <bean id="twitter" factory-method="?ndPrimaryConnection" factory-bean="connectionRepository" scope="request" depends-on="connectionRepository"> <constructor-arg 
value="org.springframework.social.twitter.api.Twitter" /> <aop:scoped-proxy proxy-target-class="false" /> </bean> <bean id="textEncryptor" class="org.springframework.security.crypto.encrypt.Encryptors" factory-method="noOpText" /> <bean id="connectController" class="org.springframework.social.connect.web.ConnectController"> <constructor-arg ref="connectionFactoryLocator"/> <constructor-arg ref="connectionRepository"/> <property name="applicationUrl" value="https://socialscn.int.netweaver.ondemand.com/socialspringdemo" /> </bean> <bean id="signInAdapter" class="com.sap.netweaver.cloud.demo.social.SimpleSignInAdapter" /> </beans> What puzzles me is that the connectionRepositoryinstantiation works perfectly fine (I commented-out the twitter bean and tested the code!) ?!? It uses the same features: request scope and interface AOP proxy and works, but the twitter bean instantiation fails ?!? The spring social config code looks as follows (I can not see any differences, can you?): @Configuration public class SocialConfig { @Inject private Environment environment; @Inject private DataSource dataSource; @Bean @Scope(value="singleton", proxyMode=ScopedProxyMode.INTERFACES) public ConnectionFactoryLocator connectionFactoryLocator() { ConnectionFactoryRegistry registry = new ConnectionFactoryRegistry(); registry.addConnectionFactory(new TwitterConnectionFactory(environment.getProperty("twitter.consumerKey"), environment.getProperty("twitter.consumerSecret"))); return registry; } @Bean @Scope(value="singleton", proxyMode=ScopedProxyMode.INTERFACES) public UsersConnectionRepository usersConnectionRepository() { return new JdbcUsersConnectionRepository(dataSource, connectionFactoryLocator(), Encryptors.noOpText()); } @Bean @Scope(value="request", proxyMode=ScopedProxyMode.INTERFACES) public ConnectionRepository connectionRepository() { Authentication authentication = SecurityContextHolder.getContext().getAuthentication(); if (authentication == null) { throw new IllegalStateException("Unable to get a ConnectionRepository: no user signed in"); } return usersConnectionRepository().createConnectionRepository(authentication.getName()); } @Bean @Scope(value="request", proxyMode=ScopedProxyMode.INTERFACES) public Twitter twitter() { Connection<Twitter> twitter = connectionRepository().findPrimaryConnection(Twitter.class); return twitter != null ? twitter.getApi() : new TwitterTemplate(); } @Bean public ConnectController connectController() { ConnectController connectController = new ConnectController(connectionFactoryLocator(), connectionRepository()); connectController.addInterceptor(new PostToWallAfterConnectInterceptor()); connectController.addInterceptor(new TweetAfterConnectInterceptor()); return connectController; } @Bean public ProviderSignInController providerSignInController(RequestCache requestCache) { return new ProviderSignInController(connectionFactoryLocator(), usersConnectionRepository(), new SimpleSignInAdapter(requestCache)); } } Any help/pointers would be appreciated!!! Cheers, Matthias

    Read the article

  • Why do HTML frames behave differently in Firefox and IE8?

    - by Frank
    I use html frame on my webiste, it's been running for I while, usually I only use Firefox to surf the net, my site looks and functions ok, but today I suddenly found IE8 has a problem with the frame on my site, if I click on the top menu items, it's supposed to display the content in the lower part of the frame, it does this correctly in Firefox, but in IE8, it displays the content in the upper part of the frame and replaces the menu items. In order to give more details, I'll include a simplified version of my html pages, at the top level there are two items, an index.html page and a file directory, all the pages except the index.html are in the directory, so it looks like this : index.html Dir_Docs 00_Home.html 00_Install_Java.html 00_Top_Menu.html 01_Home_Menu.html 01_Install_Java_Menu.html 10_Home_Welcome.html 10_How_To_Install_Java.html [ index.html ] <Html> <Head><Title>Java Applications : Tv_Panel, Java_Sound, Biz Manager and Web Academy</Title></Head> <Frameset Rows="36,*" Border=5 Bordercolor=#006B9F> <Frame Src=Dir_Docs/00_Top_Menu.html Frameborder=YES Scrolling=no Marginheight=1 Marginwidth=1> <Frame Src=Dir_Docs/00_Home.html Name=lower_frame Marginheight=1 Marginwidth=1> </Frameset> </Html> [ 00_Home.html ] <Html> <Head><Title>NMJava Application Development</Title></Head> <Frameset Cols="217,*" Align=center BorderColor="#006B9F"> <Frame Src=01_Home_Menu.html Frameborder=YES Name=side_menu Marginheight=1 Marginwidth=1> <Frame Src=10_Home_Welcome.html Name=content Marginheight=1 Marginwidth=1> </Frameset> </Html> [ 00_Install_Java.html ] <Html> <Head> <Title>Install Java</Title> </Head> <Frameset Cols="217,*" Align=center BorderColor="#006B9F"> <Frame Src=01_Install_Java_Menu.html Frameborder=YES Name=side_menu Marginheight=1 Marginwidth=1> <Frame Src=10_How_To_Install_Java.html Name=content Marginheight=1 Marginwidth=1> </Frameset> </Html> [ 00_Top_Menu.html ] <Html> <Head>Top Menu</Head> <Body> <Center> <Base target=lower_frame> <Table Border=1 Cellpadding=3 Width=100%> <Tr> <Td Align=Center Bgcolor=#3366FF><A Href=00_Home.html><Font Size=4 Color=White>Home</Font></A></Td> <Td Align=Center Bgcolor=#3366FF><A Href=00_Install_Java.html><Font Size=4 Color=White>Install Java</Font></A></Td> </Tr> </Table> </Center> </Body> </Html> [ 01_Home_Menu.html ] <Html> <Head><Title>Home Menu</Title></Head> <Base Target=content> <Body Bgcolor=#7799DD> <Center> <Table Border=1 Width=100%> <Tr><Td Align=center Bgcolor=#66AAFF><A Href=10_Home_Welcome.html>Welcome</A></Td></Tr> </Table> </Center> </Body> </Html> [ 01_Install_Java_Menu.html ] <Html> <Head><Title>Install Java</Title></Head> <Base Target=content> <Body Bgcolor=#7799DD> <Center> <Table Border=1 Width=100%> <Tr><Td Align=Center Bgcolor=#66AAFF><A Href=10_How_To_Install_Java.html>How To Install Java ?</A></Td></Tr> </Table> </Center> </Body> </Html> [ 10_Home_Welcome.html ] <Html> <Head><Title>NMJava For Software Development</Title></Head> <Body> <Center> <P> <Font Size=5 Color=blue>Welcome To NMJava For Software Development</Font> <P> </Center> </Body> </Html> [ 10_How_To_Install_Java.html ] <Html> <Head> <Title>Install Java</Title> </Head> <Body> <Center> <Br> <Font Size=5 Color=#0022AE><A Href=http://java.com/en/download/index.jsp>How To Install Java ?</A></Font> <Br> <P> <Table Width=90% Cellspacing=5 Cellpadding=5> <Tr><Td><Font Color=#0022AE> You need JRE 6 (Java Runtime Environment) to run the programs on this site. 
You may or may not have Java already installed on your PC, you can find out by going to the following site, if you don't have the latest version, you can install/upgrade it, it's free from Sun/Oracle at :<Font Size=4> <A Href=http://java.com/en/download/index.jsp>http://java.com/en/download/index.jsp</A></Font>.<P> </Font></Td></Tr> </Table> </Center> </Body> </Html> What's wrong with them, why do the two browsers behave differently, and how can I fix this? My site is at http://nmjava.com, in case you want to see more details. Frank

    Read the article

  • Am I crazy? (How) should I create a jQuery content editor?

    - by Brendon Muir
    Ok, so I created a CMS mainly aimed at Primary Schools. It's getting fairly popular in New Zealand but the one thing I hate with a passion is the largely bad quality of in browser WYSIWYG editors. I've been using KTML (made by InterAKT which was purchased by Adobe a few years ago). In my opinion this editor does a lot of great things (image editing/management, thumbnailing and pretty good content editing). Unfortunately time has had its nasty way with this product and new browsers are beginning to break features and generally degrade the performance of this tool. It's also quite scary basing my livelihood on a defunct product! I've been hunting, in fact I regularly hunt around to see if anything has changed in the WYSIWYG arena. The closest thing I've seen that excites me is the WYSIHAT framework, but they've decided to ignore a pretty relevant editing paradigm which I'm going to outline below. This is the idea for my proposed editor, and I don't know of any existing products that can do this properly: Right, so the traditional model for editing let's say a Page in a CMS is to log into a 'back end' and click edit on the page. This will then load another screen with the editor in it and perhaps a few other fields. More advanced CMS's will maybe have several editing boxes that are for different portions of the page. Anyway, the big problem with this way of doing things is that the user is editing a document outside of the final context it will appear in. In the simplest terms, this means the page template. Many things can be wrong, e.g. the with of the editing area might be different to the width of the actual template area. The height is nearly always fixed because existing editors always seem to use IFRAMES for backward compatibility. And there are plenty of other beefs which I'm sure you're quite aware of if you're in this development area. Here's my editor utopia: You click 'Edit Page': The actual page (with its actual template) displays. Portions of the page have been marked as editable via a class name. You click on one of these areas (in my case it'd just be the big 'body' area in the middle of the template) and a editing bar drops down from the top of the screen with all your standard controls (bold, italic, insert image etc...). Iframes are never used, instead we rely on setting contentEditable to true on the DIV's in question. Firefox 2 and IE6 can go away, let's move on. You can edit the page knowing exactly how it will look when you save it. Because all the styles for this template are loaded, your headings will look correct, everything will be just dandy. Is this such a radical concept? Why are we still content with TinyMCE and that other editor that is too embarrassing to use because it sounds like a swear word!? Let's face the facts: I'm a JavaScript novice. I did once play around in this area using the Javascript Anthology from SitePoint as a guide. It was quite a cool learning experience, but they of course used the IFRAME to make their lives easier. I tried to go a different route and just use contentEditable and even tried to sidestep the native content editing routines (execCommand) and instead wrote my own. They kind of worked but there were always issues. Now we have jQuery, and a few libraries that abstract things like IE's lack of Range support. I'm wondering, am I crazy, or is it actually a good idea to try and build an editor around this editing paradigm using jQuery and relevant plugins to make the job easier? My actual questions: Where would you start? 
What plugins do you know of that would help the most? Is it worth it, or is there a magical project that already exists that I should join in on? What are the biggest hurdles to overcome in a project like this? Am I crazy? I hope this question has been posted on the right board. I figured it is a technical question, since I want to know the specific hurdles and pitfalls to watch out for and whether it is technically feasible with today's technology. Looking forward to hearing people's thoughts and opinions.

    Read the article

  • Approaches for generic, compile-time safe lazy-load methods

    - by Aaronaught
    Suppose I have created a wrapper class like the following: public class Foo : IFoo { private readonly IFoo innerFoo; public Foo(IFoo innerFoo) { this.innerFoo = innerFoo; } public int? Bar { get; set; } public int? Baz { get; set; } } The idea here is that the innerFoo might wrap data-access methods or something similarly expensive, and I only want its GetBar and GetBaz methods to be invoked once. So I want to create another wrapper around it, which will save the values obtained on the first run. It's simple enough to do this, of course: int IFoo.GetBar() { if ((Bar == null) && (innerFoo != null)) Bar = innerFoo.GetBar(); return Bar ?? 0; } int IFoo.GetBaz() { if ((Baz == null) && (innerFoo != null)) Baz = innerFoo.GetBaz(); return Baz ?? 0; } But it gets pretty repetitive if I'm doing this with 10 different properties and 30 different wrappers. So I figured, hey, let's make this generic: T LazyLoad<T>(ref T prop, Func<IFoo, T> loader) { if ((prop == null) && (innerFoo != null)) prop = loader(innerFoo); return prop; } Which almost gets me where I want, but not quite, because you can't ref an auto-property (or any property at all). In other words, I can't write this: int IFoo.GetBar() { return LazyLoad(ref Bar, f => f.GetBar()); // <--- Won't compile } Instead, I'd have to change Bar to have an explicit backing field and write explicit getters and setters. Which is fine, except for the fact that I end up writing even more redundant code than I was writing in the first place. Then I considered the possibility of using expression trees: T LazyLoad<T>(Expression<Func<T>> propExpr, Func<IFoo, T> loader) { var memberExpression = propExpr.Body as MemberExpression; if (memberExpression != null) { // Use Reflection to inspect/set the property } } This plays nice with refactoring - it'll work great if I do this: return LazyLoad(f => f.Bar, f => f.GetBar()); But it's not actually safe, because someone less clever (i.e. myself in 3 days from now when I inevitably forget how this is implemented internally) could decide to write this instead: return LazyLoad(f => 3, f => f.GetBar()); Which is either going to crash or result in unexpected/undefined behaviour, depending on how defensively I write the LazyLoad method. So I don't really like this approach either, because it leads to the possibility of runtime errors which would have been prevented in the first attempt. It also relies on Reflection, which feels a little dirty here, even though this code is admittedly not performance-sensitive. Now I could also decide to go all-out and use DynamicProxy to do method interception and not have to write any code, and in fact I already do this in some applications. But this code is residing in a core library which many other assemblies depend on, and it seems horribly wrong to be introducing this kind of complexity at such a low level. Separating the interceptor-based implementation from the IFoo interface by putting it into its own assembly doesn't really help; the fact is that this very class is still going to be used all over the place, must be used, so this isn't one of those problems that could be trivially solved with a little DI magic. The last option I've already thought of would be to have a method like: T LazyLoad<T>(Func<T> getter, Action<T> setter, Func<IFoo, T> loader) { ... } This option is very "meh" as well - it avoids Reflection but is still error-prone, and it doesn't really reduce the repetition that much. It's almost as bad as having to write explicit getters and setters for each property. 
Maybe I'm just being incredibly nit-picky, but this application is still in its early stages, and it's going to grow substantially over time, and I really want to keep the code squeaky-clean. Bottom line: I'm at an impasse, looking for other ideas. Question: Is there any way to clean up the lazy-loading code at the top, such that the implementation will: Guarantee compile-time safety, like the ref version; Actually reduce the amount of code repetition, like the Expression version; and Not take on any significant additional dependencies? In other words, is there a way to do this just using regular C# language features and possibly a few small helper classes? Or am I just going to have to accept that there's a trade-off here and strike one of the above requirements from the list?
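
    For what it's worth, here is how the memoized-loader idea (the spirit of that last option) can be written in Java without ref parameters or reflection; the names are invented and the sketch is not thread-safe, but each property costs one field and one delegating line, and everything is checked at compile time.

        import java.util.function.Supplier;

        // A tiny memoizing wrapper: the loader runs at most once, on first access.
        final class Lazy<T> {
            private final Supplier<T> loader;
            private T value;
            private boolean loaded;

            Lazy(Supplier<T> loader) { this.loader = loader; }

            T get() {
                if (!loaded) {
                    value = loader.get();
                    loaded = true;
                }
                return value;
            }
        }

        interface Foo {
            int getBar();
            int getBaz();
        }

        // The caching decorator wires one Lazy per property; the repetition shrinks to one line each.
        final class CachingFoo implements Foo {
            private final Lazy<Integer> bar;
            private final Lazy<Integer> baz;

            CachingFoo(Foo inner) {
                bar = new Lazy<>(inner::getBar);
                baz = new Lazy<>(inner::getBaz);
            }

            @Override public int getBar() { return bar.get(); }
            @Override public int getBaz() { return baz.get(); }
        }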

    Read the article

  • What happens when you click a button using Webrat under Cucumber?

    - by Peter Tillemans
    I am trying to login to a Java web application. The login page has the following html : <html> <head><title>Login Page</title></head> <body onload='document.f.j_username.focus();'> <h3>Login with Username and Password</h3> <form name='f' action='/ui/j_spring_security_check' method='POST'> <table> <tr><td>User:</td><td><input type='text' name='j_username' value=''></td></tr> <tr><td>Password:</td><td><input type='password' name='j_password'/></td></tr> <tr> <td><input type='checkbox' name='_spring_security_remember_me'/> </td> <td>Remember me on this computer.</td> </tr> <tr><td colspan='2'><input name="submit" type="submit"/></td></tr> <tr><td colspan='2'><input name="reset" type="reset"/></td></tr> </table> </form> </body> </html> I use the following script: Given /^I am logged in as (.*) with password (.*)$/ do | user, password | visit "http://localhost:8080/ui" click_link "Projects" puts "Response Body:" puts response.body assert_contain "User:" fill_in "j_username", :with => user fill_in "j_password", :with => password puts "Response Body:" puts response.body click_button puts "Response Body:" puts response.body end This gives the following in the log file : [INFO] Response Body: [INFO] <html><head><title>Login Page</title></head><body onload='document.f.j_username.focus();'> [INFO] <h3>Login with Username and Password</h3><form name='f' action='/ui/j_spring_security_check' method='POST'> [INFO] <table> [INFO] <tr><td>User:</td><td><input type='text' name='j_username' value=''></td></tr> [INFO] <tr><td>Password:</td><td><input type='password' name='j_password'/></td></tr> [INFO] <tr><td><input type='checkbox' name='_spring_security_remember_me'/></td><td>Remember me on this computer.</td></tr> [INFO] <tr><td colspan='2'><input name="submit" type="submit"/></td></tr> [INFO] <tr><td colspan='2'><input name="reset" type="reset"/></td></tr> [INFO] </table> [INFO] </form></body></html> [INFO] Response Body: [INFO] <html><head><title>Login Page</title></head><body onload='document.f.j_username.focus();'> [INFO] <h3>Login with Username and Password</h3><form name='f' action='/ui/j_spring_security_check' method='POST'> [INFO] <table> [INFO] <tr><td>User:</td><td><input type='text' name='j_username' value=''></td></tr> [INFO] <tr><td>Password:</td><td><input type='password' name='j_password'/></td></tr> [INFO] <tr><td><input type='checkbox' name='_spring_security_remember_me'/></td><td>Remember me on this computer.</td></tr> [INFO] <tr><td colspan='2'><input name="submit" type="submit"/></td></tr> [INFO] <tr><td colspan='2'><input name="reset" type="reset"/></td></tr> [INFO] </table> [INFO] </form></body></html> [INFO] Response Body: [INFO] [INFO] Given I am logged in as pti with password ptipti # features/step_definitions/authentication_tests.rb:2 So apparently the response.body disappeared after clicking the submit button. I can see from the server log files that the script does not arrive on the Project page. I am new to webrat and quite new to ruby and I am now thoroughly confused. I have no idea why the response.body is gone. I have no idea where I am. I speculated that I had to wait for the page request, but all documentation says that webrat nicely waits till all redirects, pageloads, etc are finished. (At least I think I read that). Besides I find no method to wait for the page in the webrat API. Can someone give some tips on how to proceed with debugging this?

    Read the article

  • Authoritative sources about Database vs. Flatfile decision

    - by FastAl
    <tldr>looking for a reference to a book or other undeniably authoritative source that gives reasons when you should choose a database vs. when you should choose other storage methods. I have provided an un-authoritative list of reasons about 2/3 of the way down this post.</tldr> I have a situation at my company where a database is being used where it would be better to use another solution (in this case, an auto-generated piece of source code that contains a static lookup table, searched by binary sort). Normally, a database would be an OK solution even though the problem does not require a database, e.g, none of the elements of ACID are needed, as it is read-only data, updated about every 3-5 years (also requiring other sourcecode changes), and fits in memory, and can be keyed into via binary search (a tad faster than db, but speed is not an issue). The problem is that this code runs on our enterprise server, but is shared with several PC platforms (some disconnected, some use a central DB, etc.), and parts of it are managed by multiple programming units, parts by the DBAs, parts even by mathematicians in another department, etc. These hit their own platform’s version of their databases (containing their own copy of the static data). What happens is that every implementation, every little change, something different goes wrong. There are many other issues as well. I can’t even use a flatfile, because one mode of running on our enterprise server does not have permission to read files (only databases, and of course, its own literal storage, e.g., in-source table). Of course, other parts of the system use databases in proper, less obscure manners; there is no problem with those parts. So why don’t we just change it? I don’t have administrative ability to force a change. But I’m affected because sometimes I have to help fix the problems, but mostly because it causes outages and tons of extra IT time by other programmers and d*mmit that makes me mad! The reason neither management, nor the designers of the system, can see the problem is that they propose a solution that won’t work: increase communication; implement more safeguards and standards; etc. But every time, in a different part of the already-pared-down but still multi-step processes, a few different diligent, hard-working, top performing IT personnel make a unique subtle error that causes it to fail, sometimes after the last round of testing! And in general these are not single-person failures, but understandable miscommunications. And communication at our company is actually better than most. People just don't think that's the case because they haven't dug into the matter. However, I have it on very good word from somebody with extensive formal study of sociology and psychology that the relatively small amount of less-than-proper database usage in this gigantic cross-platform multi-source, multi-language project is bureaucratically un-maintainable. Impossible. No chance. At least with Human Beings in the loop, and it can’t be automated. In addition, the management and developers who could change this, though intelligent and capable, don’t understand the rigidity of this ‘how humans are’ issue, and are not convincible on the matter. 
The reason putting the static data in sourcecode will solve the problem is, although the solution is less sexy than a database, it would function with no technical drawbacks; and since the sharing of sourcecode already works very well, you basically erase any database-related effort from this section of the project, along with all the drawbacks of it that are causing problems. OK, that’s the background, for the curious. I won’t be able to convince management that this is an unfixable sociological problem, and that the real solution is coding around these limits of human nature, just as you would code around a bug in a 3rd party component that you can’t change. So what I have to do is exploit the unsuitableness of the database solution, and not do it using logic, but rather authority. I am aware of many reasons, and posts on this site giving reasons for one over the other; I’m not looking for lists of reasons like these (although you can add a comment if I've miss a doozy): WHY USE A DATABASE? instead of flatfile/other DB vs. file: if you need... Random Read / Transparent search optimization Advanced / varied / customizable Searching and sorting capabilities Transaction/rollback Locks, semaphores Concurrency control / Shared users Security 1-many/m-m is easier Easy modification Scalability Load Balancing Random updates / inserts / deletes Advanced query Administrative control of design, etc. SQL / learning curve Debugging / Logging Centralized / Live Backup capabilities Cached queries / dvlp & cache execution plans Interleaved update/read Referential integrity, avoid redundant/missing/corrupt/out-of-sync data Reporting (from on olap or oltp db) / turnkey generation tools [Disadvantages:] Important to get right the first time - professional design - but only b/c it's meant to last s/w & h/w cost Usu. over a network, speed issue (best vs. best design vs. local=even then a separate process req's marshalling/netwk layers/inter-p comm) indicies and query processing can stand in the way of simple processing (vs. flatfile) WHY USE FLATFILE: If you only need... Sequential Row processing only Limited usage append only (no reading, no master key/update) Only Update the record you're reading (fixed length recs only) Too big to fit into memory If Local disk / read-ahead network connection Portability / small system Email / cut & Paste / store as document by novice - simple format Low design learning curve but high cost later WHY USE IN-MEMORY/TABLE (tables, arrays, etc.): if you need... Processing a single db/ff record that was imported Known size of data Static data if hardcoding the table Narrow, unchanging use (e.g., one program or proc) -includes a class that will be shared, but encapsulates its data manipulation Extreme speed needed / high transaction frequency Random access - but search is dependent on implementation Following are some other posts about the topic: http://stackoverflow.com/questions/1499239/database-vs-flat-text-file-what-are-some-technical-reasons-for-choosing-one-over http://stackoverflow.com/questions/332825/are-flat-file-databases-any-good http://stackoverflow.com/questions/2356851/database-vs-flat-files http://stackoverflow.com/questions/514455/databases-vs-plain-text/514530 What I’d like to know is if anybody could recommend a hard, authoritative source containing these reasons. I’m looking for a paper book I can buy, or a reputable website with whitepapers about the issue (e.g., Microsoft, IBM), not counting the user-generated content on those sites. 
This will have a greater chance of eliciting the change I’m looking for: less wasted programmer time and more reliable programs. Thanks very much for your help. You win a prize for reading such a large post!
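
    For concreteness, the "auto-generated piece of source code that contains a static lookup table, searched by binary sort" could look roughly like this in Java; the keys and values below are invented placeholders.

        import java.util.Arrays;

        // Generated file -- regenerate when the source data changes (every 3-5 years in the scenario above).
        public final class RateTable {

            // KEYS must be kept sorted ascending; Arrays.binarySearch depends on it.
            private static final String[] KEYS   = { "AAA", "AAB", "ABC" };   // placeholder data
            private static final double[] VALUES = { 0.10,  0.15,  0.20 };

            private RateTable() { }

            /** Returns the value for a key, or throws if the key is unknown. */
            public static double lookup(String key) {
                int i = Arrays.binarySearch(KEYS, key);
                if (i < 0) {
                    throw new IllegalArgumentException("Unknown key: " + key);
                }
                return VALUES[i];
            }
        }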

    Read the article

  • How do I use texture-mapping in a simple ray tracer?

    - by fastrack20
    I am attempting to add features to a ray tracer in C++. Namely, I am trying to add texture mapping to the spheres. For simplicity, I am using an array to store the texture data. I obtained the texture data by using a hex editor and copying the correct byte values into an array in my code. This was just for my testing purposes. When the values of this array correspond to an image that is simply red, it appears to work close to what is expected except there is no shading. The bottom right of the image shows what a correct sphere should look like. This sphere's colour using one set colour, not a texture map. Another problem is that when the texture map is of something other than just one colour pixels, it turns white. My test image is a picture of water, and when it maps, it shows only one ring of bluish pixels surrounding the white colour. When this is done, it simply appears as this: Here are a few code snippets: Color getColor(const Object *object,const Ray *ray, float *t) { if (object->materialType == TEXTDIF || object->materialType == TEXTMATTE) { float distance = *t; Point pnt = ray->origin + ray->direction * distance; Point oc = object->center; Vector ve = Point(oc.x,oc.y,oc.z+1) - oc; Normalize(&ve); Vector vn = Point(oc.x,oc.y+1,oc.z) - oc; Normalize(&vn); Vector vp = pnt - oc; Normalize(&vp); double phi = acos(-vn.dot(vp)); float v = phi / M_PI; float u; float num1 = (float)acos(vp.dot(ve)); float num = (num1 /(float) sin(phi)); float theta = num /(float) (2 * M_PI); if (theta < 0 || theta == NAN) {theta = 0;} if (vn.cross(ve).dot(vp) > 0) { u = theta; } else { u = 1 - theta; } int x = (u * IMAGE_WIDTH) -1; int y = (v * IMAGE_WIDTH) -1; int p = (y * IMAGE_WIDTH + x)*3; return Color(TEXT_DATA[p+2],TEXT_DATA[p+1],TEXT_DATA[p]); } else { return object->color; } }; I call the colour code here in Trace: if (object->materialType == MATTE) return getColor(object, ray, &t); Ray shadowRay; int isInShadow = 0; shadowRay.origin.x = pHit.x + nHit.x * bias; shadowRay.origin.y = pHit.y + nHit.y * bias; shadowRay.origin.z = pHit.z + nHit.z * bias; shadowRay.direction = light->object->center - pHit; float len = shadowRay.direction.length(); Normalize(&shadowRay.direction); float LdotN = shadowRay.direction.dot(nHit); if (LdotN < 0) return 0; Color lightColor = light->object->color; for (int k = 0; k < numObjects; k++) { if (Intersect(objects[k], &shadowRay, &t) && !objects[k]->isLight) { if (objects[k]->materialType == GLASS) lightColor *= getColor(objects[k], &shadowRay, &t); // attenuate light color by glass color else isInShadow = 1; break; } } lightColor *= 1.f/(len*len); return (isInShadow) ? 0 : getColor(object, &shadowRay, &t) * lightColor * LdotN; } I left out the rest of the code as to not bog down the post, but it can be seen here. Any help is greatly appreciated. The only portion not included in the code, is where I define the texture data, which as I said, is simply taken straight from a bitmap file of the above image. Thanks.
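
    As a point of reference, a common way to write the sphere UV mapping that avoids the acos/sin division (and the NaN guard) in the snippet above is to use atan2 and asin on the normalized vector from the sphere's centre to the hit point; a small Java sketch of that formula follows, with a stand-in Vec3 type rather than the tracer's own classes. Separately, if the textured sphere still looks unshaded, it is worth checking that the textured-material branch multiplies the sampled colour by the diffuse lighting term instead of returning it directly.

        // Spherical UV mapping: u runs around the equator, v from pole to pole, both in [0, 1).
        public final class SphereUv {

            // Minimal stand-in for the tracer's vector class.
            public record Vec3(double x, double y, double z) { }

            /** d must be the normalized direction from the sphere centre to the hit point. */
            public static double[] uv(Vec3 d) {
                double u = 0.5 + Math.atan2(d.z(), d.x()) / (2.0 * Math.PI);
                double v = 0.5 - Math.asin(d.y()) / Math.PI;
                return new double[] { u, v };
            }

            /** Maps (u, v) to a byte offset in a packed width x height, 3-bytes-per-pixel texture. */
            public static int texelOffset(double u, double v, int width, int height) {
                int x = Math.min((int) (u * width),  width  - 1);
                int y = Math.min((int) (v * height), height - 1);
                return (y * width + x) * 3;
            }
        }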

    Read the article

  • How do I prove I should put a table of values in source code instead of a database table?

    - by FastAl
    <tldr>looking for a reference to a book or other undeniably authoritative source that gives reasons when you should choose a database vs. when you should choose other storage methods. I have provided an un-authoritative list of reasons about 2/3 of the way down this post.</tldr> I have a situation at my company where a database is being used where it would be better to use another solution (in this case, an auto-generated piece of source code that contains a static lookup table, searched by binary sort). Normally, a database would be an OK solution even though the problem does not require a database, e.g, none of the elements of ACID are needed, as it is read-only data, updated about every 3-5 years (also requiring other sourcecode changes), and fits in memory, and can be keyed into via binary search (a tad faster than db, but speed is not an issue). The problem is that this code runs on our enterprise server, but is shared with several PC platforms (some disconnected, some use a central DB, etc.), and parts of it are managed by multiple programming units, parts by the DBAs, parts even by mathematicians in another department, etc. These hit their own platform’s version of their databases (containing their own copy of the static data). What happens is that every implementation, every little change, something different goes wrong. There are many other issues as well. I can’t even use a flatfile, because one mode of running on our enterprise server does not have permission to read files (only databases, and of course, its own literal storage, e.g., in-source table). Of course, other parts of the system use databases in proper, less obscure manners; there is no problem with those parts. So why don’t we just change it? I don’t have administrative ability to force a change. But I’m affected because sometimes I have to help fix the problems, but mostly because it causes outages and tons of extra IT time by other programmers and d*mmit that makes me mad! The reason neither management, nor the designers of the system, can see the problem is that they propose a solution that won’t work: increase communication; implement more safeguards and standards; etc. But every time, in a different part of the already-pared-down but still multi-step processes, a few different diligent, hard-working, top performing IT personnel make a unique subtle error that causes it to fail, sometimes after the last round of testing! And in general these are not single-person failures, but understandable miscommunications. And communication at our company is actually better than most. People just don't think that's the case because they haven't dug into the matter. However, I have it on very good word from somebody with extensive formal study of sociology and psychology that the relatively small amount of less-than-proper database usage in this gigantic cross-platform multi-source, multi-language project is bureaucratically un-maintainable. Impossible. No chance. At least with Human Beings in the loop, and it can’t be automated. In addition, the management and developers who could change this, though intelligent and capable, don’t understand the rigidity of this ‘how humans are’ issue, and are not convincible on the matter. 
The reason putting the static data in sourcecode will solve the problem is, although the solution is less sexy than a database, it would function with no technical drawbacks; and since the sharing of sourcecode already works very well, you basically erase any database-related effort from this section of the project, along with all the drawbacks of it that are causing problems. OK, that’s the background, for the curious. I won’t be able to convince management that this is an unfixable sociological problem, and that the real solution is coding around these limits of human nature, just as you would code around a bug in a 3rd party component that you can’t change. So what I have to do is exploit the unsuitableness of the database solution, and not do it using logic, but rather authority. I am aware of many reasons, and posts on this site giving reasons for one over the other; I’m not looking for lists of reasons like these (although you can add a comment if I've miss a doozy): WHY USE A DATABASE? instead of flatfile/other DB vs. file: if you need... Random Read / Transparent search optimization Advanced / varied / customizable Searching and sorting capabilities Transaction/rollback Locks, semaphores Concurrency control / Shared users Security 1-many/m-m is easier Easy modification Scalability Load Balancing Random updates / inserts / deletes Advanced query Administrative control of design, etc. SQL / learning curve Debugging / Logging Centralized / Live Backup capabilities Cached queries / dvlp & cache execution plans Interleaved update/read Referential integrity, avoid redundant/missing/corrupt/out-of-sync data Reporting (from on olap or oltp db) / turnkey generation tools [Disadvantages:] Important to get right the first time - professional design - but only b/c it's meant to last s/w & h/w cost Usu. over a network, speed issue (best vs. best design vs. local=even then a separate process req's marshalling/netwk layers/inter-p comm) indicies and query processing can stand in the way of simple processing (vs. flatfile) WHY USE FLATFILE: If you only need... Sequential Row processing only Limited usage append only (no reading, no master key/update) Only Update the record you're reading (fixed length recs only) Too big to fit into memory If Local disk / read-ahead network connection Portability / small system Email / cut & Paste / store as document by novice - simple format Low design learning curve but high cost later WHY USE IN-MEMORY/TABLE (tables, arrays, etc.): if you need... Processing a single db/ff record that was imported Known size of data Static data if hardcoding the table Narrow, unchanging use (e.g., one program or proc) -includes a class that will be shared, but encapsulates its data manipulation Extreme speed needed / high transaction frequency Random access - but search is dependent on implementation Following are some other posts about the topic: http://stackoverflow.com/questions/1499239/database-vs-flat-text-file-what-are-some-technical-reasons-for-choosing-one-over http://stackoverflow.com/questions/332825/are-flat-file-databases-any-good http://stackoverflow.com/questions/2356851/database-vs-flat-files http://stackoverflow.com/questions/514455/databases-vs-plain-text/514530 What I’d like to know is if anybody could recommend a hard, authoritative source containing these reasons. I’m looking for a paper book I can buy, or a reputable website with whitepapers about the issue (e.g., Microsoft, IBM), not counting the user-generated content on those sites. 
This will have a greater chance of eliciting the change I’m looking for: less wasted programmer time and more reliable programs. Thanks very much for your help. You win a prize for reading such a large post!

    Read the article

  • A new method of supporting FOSS?

    - by James
    I have been kicking an idea around for sometime and wondered if something of it's nature hadn't already been invented. The premise is a website that integrates code management, project/team management, and micro-transactions. Donations, in and of themselves, are a sporadic, and unreliable method of supporting developers. Furthermore most free software that accepts donations is started by programmers ,be it to learn, because of a hobby, or because they saw a niche that needed to be filled. There is no method in place of of saying "hay, the FOSS community needs this kind of software, will someone develop it, and accept donations!?" Programmers should be programming, not busy begging for money. Basically the idea is people can go to the site in question, and start a project or make a request. Anyone signed up with the site can start a request. Each member account is free to support or "upvote" a project request. Requests and the associated number of votes let programmers in the community know the needs of the community. When a project is started a request for developers can be put forth. Developers have a ranking based on commits to other projects. The project founder can send invites to known Developers, or accept invites from members based on developer ranking. Once the project has at least one team-member, an objectives sheet or "draft" can be put out, listing design, goals, and features. The founding member and each team-member may contribute to this sheet. Each "milestone", or "Feature" is represented by an article. An article is any unit of a draft that can be voted on by The Project Founder, Team-members, and contributors...which brings me to the next half of this idea. --Microtransactions-- People signed up with this hypothetical website can purchase credits which then can be transfered to projects they would like to support. Anyone who transfers credits to a project is known as a contributor to that project. At anytime a Founder, or the lead team-member may submit an article, or a design (multiple articles) for consideration. All team-members, as well as the Founder, can vote once for each article freely. Contributors may vote yes or no on a number of articles (independent of any given meeting where a particular design or article is considered) equal to the number of credits they have placed into a contributors fund for that particular project. A contributors fund is a proxy between a sites credit account, and a projects credit account. It is sort of like a promise to contribute, instead of an actual contribution. Contributers may place constraints on particular articles such that if those constraints (a yes or no vote) are satisfied then a manually specified amount of credits is automatically transfered to the project account. This allows a project to develop based on the needs of those who may (in the future) financially rely on the project. --- Code commits & milestones --- When a team-member makes a commit, they may specify if it's a minor commit, a bug fix, a compatibility patch (i.e. for a new platform), or a milestone (an article voted on previously). People signed up with the website, may download the updated project and test it to see if the programmer's assertion is true about the commit. A report may then be filed on a small form, giving a one or two paragraphs, and a positive or negative confirmation of the programmer's goal for that particular commit. After all milestones for a particular draft are complete, a new draft is submitted for voting. 
Also funds may withdrawn by each team-member based on the proportion of commits and milestones confirmed (fulfilled the stated purpose) for each programmer. --- voting --- Members, contributor, and non-contributor, may make priority requests for particular articles of a draft. The project founder may or may not opt to fill those requests based on the volume of upvotes. A fulfilled priority request means that any team-member that makes a community-confirmed commit for an article is, when all articles for the draft are fulfilled, granted a portion of project credits in proportion to the average priority of all the articles he committed. ---- Notes --- While this is horribly prone to design-by-committee the one saving grace is that the lead team-member may place constraints on a draft such that some, or ALL articles must be voted yes. Commits may not begin until a draft satisfying said constraints is approved. What does SO think, is this idea feasible? Does anyone see major problems with this? Is there any insights, or improvements that could be made?

    Read the article

  • Watching setTimeout loops so that only one is running at a time.

    - by DA
    I'm creating a content rotator in jQuery. 5 items total. Item 1 fades in, pauses 10 seconds, fades out, then item 2 fades in. Repeat. Simple enough. Using setTimeout I can call a set of functions that create a loop and will repeat the process indefinitely. I now want to add the ability to interrupt this rotator at any time by clicking on a navigation element to jump directly to one of the content items. I originally started going down the path of pinging a variable constantly (say every half second) that would check to see if a navigation element was clicked and, if so, abandon the loop, then restart the loop based on the item that was clicked. The challenge I ran into was how to actually ping a variable via a timer. The solution is to dive into JavaScript closures...which are a little over my head but definitely something I need to delve into more. However, in the process of that, I came up with an alternative option that actually seems to be better performance-wise (theoretically, at least). I have a sample running here: http://jsbin.com/uxupi/14 (It's using console.log so have fireBug running) Sample script: $(document).ready(function(){ var loopCount = 0; $('p#hello').click(function(){ loopCount++; doThatThing(loopCount); }) function doThatOtherThing(currentLoopCount) { console.log('doThatOtherThing-'+currentLoopCount); if(currentLoopCount==loopCount){ setTimeout(function(){doThatThing(currentLoopCount)},5000) } } function doThatThing(currentLoopCount) { console.log('doThatThing-'+currentLoopCount); if(currentLoopCount==loopCount){ setTimeout(function(){doThatOtherThing(currentLoopCount)},5000); } } }) The logic being that every click of the trigger element will kick off the loop passing into itself a variable equal to the current value of the global variable. That variable gets passed back and forth between the functions in the loop. Each click of the trigger also increments the global variable so that subsequent calls of the loop have a unique local variable. Then, within the loop, before the next step of each loop is called, it checks to see if the variable it has still matches the global variable. If not, it knows that a new loop has already been activated so it just ends the existing loop. Thoughts on this? Valid solution? Better options? Caveats? Dangers? UPDATE: I'm using John's suggestion below via the clearTimeout option. However, I can't quite get it to work. The logic is as such: var slideNumber = 0; var timeout = null; function startLoop(slideNumber) { ...do stuff here to set up the slide based on slideNumber... slideFadeIn() } function continueCheck(){ if (timeout != null) { // cancel the scheduled task. clearTimeout(timeout); timeout = null; return false; }else{ return true; } }; function slideFadeIn() { if (continueCheck){ // a new loop hasn't been called yet so proceed... // fade in the LI $currentListItem.fadeIn(fade, function() { if(multipleFeatures){ timeout = setTimeout(slideFadeOut,display); } }); }; function slideFadeOut() { if (continueLoop){ // a new loop hasn't been called yet so proceed... slideNumber=slideNumber+1; if(slideNumber==features.length) { slideNumber = 0; }; timeout = setTimeout(function(){startLoop(slideNumber)},100); }; startLoop(slideNumber); The above kicks of the looping. I then have navigation items that, when clicked, I want the above loop to stop, then restart with a new beginning slide: $(myNav).click(function(){ clearTimeout(timeout); timeout = null; startLoop(thisItem); }) If I comment out 'startLoop...' 
from the click event, it does indeed stop the initial loop. However, if I leave that last line in, the initial loop doesn't actually stop. Why? Both loops seem to run in parallel for a period, even though clicking my navigation calls clearTimeout, which should clear the pending timeout.
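
    Editor's note: the question is jQuery/JavaScript, but the "generation counter" idea it describes is language-agnostic. Below is a minimal sketch of that pattern in Java (the language used elsewhere in this document), with ScheduledExecutorService standing in for setTimeout. All names (Rotator, restartAt, SLIDE_COUNT) are illustrative and do not come from the question's code; treat it as a sketch of the idea, not a drop-in fix.

        import java.util.concurrent.Executors;
        import java.util.concurrent.ScheduledExecutorService;
        import java.util.concurrent.TimeUnit;
        import java.util.concurrent.atomic.AtomicInteger;

        public class Rotator {
            private static final int SLIDE_COUNT = 5;   // "5 items total" in the question

            private final ScheduledExecutorService timer =
                    Executors.newSingleThreadScheduledExecutor();
            // Bumped on every navigation click; stale loops notice the change and stop.
            private final AtomicInteger generation = new AtomicInteger();

            /** Start (or restart) the rotation at the given slide. */
            public void restartAt(int slide) {
                int myGeneration = generation.incrementAndGet();
                showSlide(slide, myGeneration);
            }

            private void showSlide(int slide, int myGeneration) {
                // The equivalent of "if (currentLoopCount == loopCount)" in the question:
                // a newer loop has been started, so this one quietly dies.
                if (myGeneration != generation.get()) {
                    return;
                }
                System.out.println("showing slide " + slide);
                // Schedule the next step; it re-checks the generation before doing anything.
                timer.schedule(
                        () -> showSlide((slide + 1) % SLIDE_COUNT, myGeneration),
                        10, TimeUnit.SECONDS);
            }
        }

    The alternative described in the update (keeping a handle to the pending timeout and cancelling it) maps here to holding the ScheduledFuture returned by schedule() and calling cancel(false) on it; both approaches are valid, the counter simply avoids having to track the handle.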

    Read the article

  • How can I map "insert='false' update='false'" on a composite-id key-property which is also used in a one-to-many FK?

    - by Gweebz
    I am working on a legacy code base with an existing DB schema. The existing code uses SQL and PL/SQL to execute queries on the DB. We have been tasked with making a small part of the project database-engine agnostic (at first, change everything eventually). We have chosen to use Hibernate 3.3.2.GA and "*.hbm.xml" mapping files (as opposed to annotations). Unfortunately, it is not feasible to change the existing schema because we cannot regress any legacy features. The problem I am encountering is when I am trying to map a uni-directional, one-to-many relationship where the FK is also part of a composite PK. Here are the classes and mapping file... CompanyEntity.java public class CompanyEntity { private Integer id; private Set<CompanyNameEntity> names; ... } CompanyNameEntity.java public class CompanyNameEntity implements Serializable { private Integer id; private String languageId; private String name; ... } CompanyNameEntity.hbm.xml <?xml version="1.0"?> <!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 3.0//EN" "http://www.jboss.org/dtd/hibernate/hibernate-mapping-3.0.dtd"> <hibernate-mapping package="com.example"> <class name="com.example.CompanyEntity" table="COMPANY"> <id name="id" column="COMPANY_ID"/> <set name="names" table="COMPANY_NAME" cascade="all-delete-orphan" fetch="join" batch-size="1" lazy="false"> <key column="COMPANY_ID"/> <one-to-many entity-name="vendorName"/> </set> </class> <class entity-name="companyName" name="com.example.CompanyNameEntity" table="COMPANY_NAME"> <composite-id> <key-property name="id" column="COMPANY_ID"/> <key-property name="languageId" column="LANGUAGE_ID"/> </composite-id> <property name="name" column="NAME" length="255"/> </class> </hibernate-mapping> This code works just fine for SELECT and INSERT of a Company with names. I encountered a problem when I tried to update and existing record. I received a BatchUpdateException and after looking through the SQL logs I saw Hibernate was trying to do something stupid... update COMPANY_NAME set COMPANY_ID=null where COMPANY_ID=? Hibernate was trying to dis-associate child records before updating them. The problem is that this field is part of the PK and not-nullable. I found the quick solution to make Hibernate not do this is to add "not-null='true'" to the "key" element in the parent mapping. SO now may mapping looks like this... CompanyNameEntity.hbm.xml <?xml version="1.0"?> <!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 3.0//EN" "http://www.jboss.org/dtd/hibernate/hibernate-mapping-3.0.dtd"> <hibernate-mapping package="com.example"> <class name="com.example.CompanyEntity" table="COMPANY"> <id name="id" column="COMPANY_ID"/> <set name="names" table="COMPANY_NAME" cascade="all-delete-orphan" fetch="join" batch-size="1" lazy="false"> <key column="COMPANY_ID" not-null="true"/> <one-to-many entity-name="vendorName"/> </set> </class> <class entity-name="companyName" name="com.example.CompanyNameEntity" table="COMPANY_NAME"> <composite-id> <key-property name="id" column="COMPANY_ID"/> <key-property name="languageId" column="LANGUAGE_ID"/> </composite-id> <property name="name" column="NAME" length="255"/> </class> </hibernate-mapping> This mapping gives the exception... org.hibernate.MappingException: Repeated column in mapping for entity: companyName column: COMPANY_ID (should be mapped with insert="false" update="false") My problem now is that I have tryed to add these attributes to the key-property element but that is not supported by the DTD. 
I have also tried changing it to a key-many-to-one element, but that didn't work either. So: how can I map "insert='false' update='false'" on a composite-id key-property that is also used in a one-to-many FK?
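
    Editor's note: the question is committed to hbm.xml mappings, but it may help to see the annotation counterpart of "insert='false' update='false'" on a repeated column, since that is the general pattern being asked about. The sketch below (composite id plus a read-only association over the shared COMPANY_ID column) is illustrative only: the embeddable id class and field names are made up, and it is not version-matched to Hibernate 3.3.2.

        import java.io.Serializable;
        import java.util.Set;
        import javax.persistence.*;

        @Embeddable
        class CompanyNameId implements Serializable {
            @Column(name = "COMPANY_ID")  Integer companyId;
            @Column(name = "LANGUAGE_ID") String languageId;
        }

        @Entity
        @Table(name = "COMPANY")
        class CompanyEntity {
            @Id
            @Column(name = "COMPANY_ID")
            Integer id;

            @OneToMany(mappedBy = "company", cascade = CascadeType.ALL)
            Set<CompanyNameEntity> names;
        }

        @Entity
        @Table(name = "COMPANY_NAME")
        class CompanyNameEntity implements Serializable {
            @EmbeddedId
            CompanyNameId pk;

            // COMPANY_ID is already written through the embedded id, so the association
            // over the same column is marked read-only: the annotation counterpart of
            // insert="false" update="false", which the XML key-property cannot express.
            @ManyToOne
            @JoinColumn(name = "COMPANY_ID", insertable = false, updatable = false)
            CompanyEntity company;

            @Column(name = "NAME", length = 255)
            String name;
        }

    On the XML side, key-many-to-one (which the asker tried) and, in JPA 2.0, @MapsId are the other ways of saying "this key column is also the FK"; whether they work here depends on the exact Hibernate version in use.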

    Read the article

  • template class: ctor versus function -> new C++ standard

    - by Oops
    Hi in this question: http://stackoverflow.com/questions/2779155/template-point2-double-point3-double Dennis and Michael noticed the unreasonable foolishly implemented constructor. They were right, I didn't consider this at that moment. But I found out that a constructor does not help very much for a template class like this one, instead a function is here much more convenient and safe namespace point { template < unsigned int dims, typename T > struct Point { T X[ dims ]; std::string str() { std::stringstream s; s << "{"; for ( int i = 0; i < dims; ++i ) { s << " X" << i << ": " << X[ i ] << (( i < dims -1 )? " |": " "); } s << "}"; return s.str(); } Point<dims, int> toint() { Point<dims, int> ret; std::copy( X, X+dims, ret.X ); return ret; } }; template < typename T > Point< 2, T > Create( T X0, T X1 ) { Point< 2, T > ret; ret.X[ 0 ] = X0; ret.X[ 1 ] = X1; return ret; } template < typename T > Point< 3, T > Create( T X0, T X1, T X2 ) { Point< 3, T > ret; ret.X[ 0 ] = X0; ret.X[ 1 ] = X1; ret.X[ 2 ] = X2; return ret; } template < typename T > Point< 4, T > Create( T X0, T X1, T X2, T X3 ) { Point< 4, T > ret; ret.X[ 0 ] = X0; ret.X[ 1 ] = X1; ret.X[ 2 ] = X2; ret.X[ 3 ] = X3; return ret; } }; int main( void ) { using namespace point; Point< 2, double > p2d = point::Create( 12.3, 34.5 ); Point< 3, double > p3d = point::Create( 12.3, 34.5, 56.7 ); Point< 4, double > p4d = point::Create( 12.3, 34.5, 56.7, 78.9 ); //Point< 3, double > p1d = point::Create( 12.3, 34.5 ); //no suitable user defined conversion exists //Point< 3, int > p1i = p4d.toint(); //no suitable user defined conversion exists Point< 2, int > p2i = p2d.toint(); Point< 3, int > p3i = p3d.toint(); Point< 4, int > p4i = p4d.toint(); std::cout << p2d.str() << std::endl; std::cout << p3d.str() << std::endl; std::cout << p4d.str() << std::endl; std::cout << p2i.str() << std::endl; std::cout << p3i.str() << std::endl; std::cout << p4i.str() << std::endl; char c; std::cin >> c; } has the new C++ standard any new improvements, language features or simplifications regarding this aspect of ctor of a template class? what do you think about the implementation of the combination of namespace, stuct and Create function? many thanks in advance Oops
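
    Editor's note: the underlying reason Create() is more convenient than a constructor is that template arguments are deduced for function templates but, in C++03, not for a class template's constructor. The "new" standard at the time (C++11) added variadic templates and std::initializer_list, which help with this kind of N-argument construction, while deduction of class template arguments from a constructor call only arrived much later, in C++17. Since this document's other examples are Java, here is the same constructor-versus-factory trade-off sketched in Java generics, purely as an illustrative parallel (the Point class and of() factory are made up):

        import java.util.Arrays;

        public final class Point<T> {
            private final T[] coords;              // fixed-length storage, like "T X[dims]"

            private Point(T[] coords) {
                this.coords = coords;
            }

            // Type arguments are inferred here, just as the C++ Create() function
            // templates deduce T; a plain constructor call could not do that for the
            // class's own type parameters (pre-diamond in Java, pre-C++17 in C++).
            @SafeVarargs
            public static <T> Point<T> of(T... coords) {
                return new Point<>(coords.clone());
            }

            @Override
            public String toString() {
                return Arrays.toString(coords);
            }

            public static void main(String[] args) {
                Point<Double>  p2d = Point.of(12.3, 34.5);
                Point<Integer> p3i = Point.of(1, 2, 3);
                System.out.println(p2d + " " + p3i);
            }
        }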

    Read the article

  • $.post is not working

    - by BEBO
    i am trying to post data to Mysql using jquery $.post and php page. my code is not running and nothing is added to the mysql table. I am not sure if the path i am creating is wrong but any help would be appreciated. Jquery location: f_js/tasks/TaskTest.js <script type="text/javascript"> $(document).ready(function(){ $("#AddTask").click(function(){ var acct = $('#acct').val(); var quicktask = $('#quicktask').val(); var user = $('#user').val(); $.post('addTask.php',{acct:acct,quicktask:quicktask,user:user}, function(data){ $('#result').fadeIn('slow').html(data); }); }); }); </script> addTask.php (runs the jqeury code) <?php include 'dbconnect.php'; include 'sessions.php'; $acct = $_POST['acct']; $task = $_POST['quicktask']; $taskstatus = 'Active'; //get task Creator $user = $_POST['user']; //query task creator from users table $allusers = mysql_query("SELECT * FROM users WHERE username = '$user'"); while ($rows = mysql_fetch_array($allusers)) { //get first and last name for task creator $taskOwner = $rows['user_firstname']; $taskOwnerLast = $rows['user_lastname']; $taskOwnerFull = $taskOwner." ".$taskOwnerLast; mysql_query("INSERT INTO tasks (taskresource, tasktitle, taskdetail, taskstatus, taskowner, taskOwnerFullName) VALUES ('$acct', '$task', '$task', '$taskstatus', '$user', '$taskOwnerFull' )"); echo "inserted"; } ?> Accountview.php finally the front page <html> <div class="input-cont "> <input type="text" class="form-control col-lg-12" placeholder="Add a quick Task..." name ="quicktask" id="quicktask"> </div> <div class="form-group"> <div class="pull-right chat-features"> <a href="javascript:;"> <i class="icon-camera"></i> </a> <a href="javascript:;"> <i class="icon-link"></i> </a> <input type="button" class="btn btn-danger" name="AddTask" id="AddTask" value="Add" /> <input type="hidden" name="acct" id="acct" value="<?php echo $_REQUEST['acctname']?>"/> <input type="hidden" name="user" id="user" value="<?php $username = $_SESSION['username']; echo $username?>"/> <div id="result">result</div> </div> </div> <!-- js placed at the end of the document so the pages load faster --> <script src="js/jquery.js"></script> <script src="f_js/tasks/TaskTest.js"></script> <!--common script for all pages--> <script src="js/common-scripts.js"></script> <script type="text/javascript" src="assets/gritter/js/jquery.gritter.js"></script> <script src="js/gritter.js" type="text/javascript"></script> <script> </html> Firebug reponse: Response Headers Connection Keep-Alive Content-Length 0 Content-Type text/html Date Fri, 08 Nov 2013 21:48:50 GMT Keep-Alive timeout=5, max=100 Server Apache/2.4.4 (Win32) OpenSSL/0.9.8y PHP/5.4.16 X-Powered-By PHP/5.4.16 refresh 5; URL=index.php Request Headers Accept */* Accept-Encoding gzip, deflate Accept-Language en-US,en;q=0.5 Content-Length 13 Content-Type application/x-www-form-urlencoded; charset=UTF-8 Cookie PHPSESSID=6gufl3guiiddreg8cdlc0htnc6 Host localhost Referer http://localhost/betahtml/AccountView.php?acctname=client%201 User-Agent Mozilla/5.0 (Windows NT 6.3; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0 X-Requested-With XMLHttpRequest
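
    Editor's note: the question's stack is jQuery on the client and PHP/MySQL on the server, and the Firebug capture (Content-Length 0 plus a refresh to index.php) suggests the endpoint returns nothing before the insert runs, so the relative 'addTask.php' URL and the session check are the first things to look at. Purely as an illustrative parallel in Java (the language used elsewhere in this document), the sketch below shows the same endpoint pattern: read the POSTed fields, insert the task, echo a snippet for #result. It also uses a parameterized query rather than concatenating request values into SQL. The class name, URL mapping, JDBC URL and credentials are all made up.

        import java.io.IOException;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;
        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        // Hypothetical equivalent of addTask.php.
        public class AddTaskServlet extends HttpServlet {

            @Override
            protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                    throws ServletException, IOException {
                String acct = req.getParameter("acct");
                String task = req.getParameter("quicktask");
                String user = req.getParameter("user");

                try (Connection con = DriverManager.getConnection(
                            "jdbc:mysql://localhost/mydb", "dbuser", "dbpass");
                     PreparedStatement ps = con.prepareStatement(
                            "INSERT INTO tasks (taskresource, tasktitle, taskdetail, "
                            + "taskstatus, taskowner) VALUES (?, ?, ?, 'Active', ?)")) {
                    // Placeholders instead of string concatenation: the driver escapes values.
                    ps.setString(1, acct);
                    ps.setString(2, task);
                    ps.setString(3, task);
                    ps.setString(4, user);
                    ps.executeUpdate();
                    resp.getWriter().print("inserted");   // becomes the content of #result
                } catch (SQLException e) {
                    resp.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR, e.getMessage());
                }
            }
        }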

    Read the article

  • Problems installing Memcache (PECL extension)

    - by Petrus
    I have installed memcached fine, and now I will need to install PECL extension memcache. Im running RedHat x86_64 es5. The installation gives me this: downloading memcache-2.2.6.tgz ... Starting to download memcache-2.2.6.tgz (35,957 bytes) ..........done: 35,957 bytes 11 source files, building running: phpize Configuring for: PHP Api Version: 20090626 Zend Module Api No: 20090626 Zend Extension Api No: 220090626 Enable memcache session handler support? [yes] : Notice: Use of undefined constant STDIN - assumed 'STDIN' in PEAR/Frontend/CLI.php on line 304 Warning: fgets() expects parameter 1 to be resource, string given in PEAR/Frontend/CLI.php on line 304 Warning: fgets() expects parameter 1 to be resource, string given in /usr/lib/php/PEAR/Frontend/CLI.php on line 304 building in /root/tmp/pear-build-root/memcache-2.2.6 running: /root/tmp/pear/memcache/configure --enable-memcache-session=yes checking for egrep... grep -E checking for a sed that does not truncate output... /bin/sed checking for cc... cc checking for C compiler default output file name... a.out checking whether the C compiler works... yes checking whether we are cross compiling... no checking for suffix of executables... checking for suffix of object files... o checking whether we are using the GNU C compiler... yes checking whether cc accepts -g... yes checking for cc option to accept ANSI C... none needed checking how to run the C preprocessor... cc -E checking for icc... no checking for suncc... no checking whether cc understands -c and -o together... yes checking for system library directory... lib checking if compiler supports -R... no checking if compiler supports -Wl,-rpath,... yes checking build system type... x86_64-unknown-linux-gnu checking host system type... x86_64-unknown-linux-gnu checking target system type... x86_64-unknown-linux-gnu checking for PHP prefix... /usr checking for PHP includes... -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib checking for PHP extension directory... /usr/lib/php/extensions/no-debug-non-zts-20090626 checking for PHP installed headers prefix... /usr/include/php checking if debug is enabled... no checking if zts is enabled... no checking for re2c... re2c checking for re2c version... invalid configure: WARNING: You will need re2c 0.13.4 or later if you want to regenerate PHP parsers. checking for gawk... gawk checking whether to enable memcache support... yes, shared checking whether to enable memcache session handler support... yes checking for the location of ZLIB... no checking for the location of zlib... /usr checking for session includes... /usr/include/php checking for memcache session support... enabled checking for ld used by cc... /usr/bin/ld checking if the linker (/usr/bin/ld) is GNU ld... yes checking for /usr/bin/ld option to reload object files... -r checking for BSD-compatible nm... /usr/bin/nm -B checking whether ln -s works... yes checking how to recognize dependent libraries... pass_all checking for ANSI C header files... yes checking for sys/types.h... yes checking for sys/stat.h... yes checking for stdlib.h... yes checking for string.h... yes checking for memory.h... yes checking for strings.h... yes checking for inttypes.h... yes checking for stdint.h... yes checking for unistd.h... yes checking dlfcn.h usability... yes checking dlfcn.h presence... yes checking for dlfcn.h... yes checking the maximum length of command line arguments... 
98304 checking command to parse /usr/bin/nm -B output from cc object... ok checking for objdir... .libs checking for ar... ar checking for ranlib... ranlib checking for strip... strip checking if cc supports -fno-rtti -fno-exceptions... no checking for cc option to produce PIC... -fPIC checking if cc PIC flag -fPIC works... yes checking if cc static flag -static works... yes checking if cc supports -c -o file.o... yes checking whether the cc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes checking whether -lc should be explicitly linked in... no checking dynamic linker characteristics... GNU/Linux ld.so checking how to hardcode library paths into programs... immediate checking whether stripping libraries is possible... yes checking if libtool supports shared libraries... yes checking whether to build shared libraries... yes checking whether to build static libraries... no creating libtool appending configuration tag "CXX" to libtool configure: creating ./config.status config.status: creating config.h running: make /bin/sh /root/tmp/pear-build-root/memcache-2.2.6/libtool --mode=compile cc -I/usr/include/php -I. -I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache.c -o memcache.lo mkdir .libs cc -I/usr/include/php -I. -I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache.c -fPIC -DPIC -o .libs/memcache.o /bin/sh /root/tmp/pear-build-root/memcache-2.2.6/libtool --mode=compile cc -I/usr/include/php -I. -I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache_queue.c -o memcache_queue.lo cc -I/usr/include/php -I. -I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache_queue.c -fPIC -DPIC -o .libs/memcache_queue.o /bin/sh /root/tmp/pear-build-root/memcache-2.2.6/libtool --mode=compile cc -I/usr/include/php -I. -I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache_standard_hash.c -o memcache_standard_hash.lo cc -I/usr/include/php -I. 
-I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache_standard_hash.c -fPIC -DPIC -o .libs/memcache_standard_hash.o /bin/sh /root/tmp/pear-build-root/memcache-2.2.6/libtool --mode=compile cc -I/usr/include/php -I. -I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache_consistent_hash.c -o memcache_consistent_hash.lo cc -I/usr/include/php -I. -I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache_consistent_hash.c -fPIC -DPIC -o .libs/memcache_consistent_hash.o /bin/sh /root/tmp/pear-build-root/memcache-2.2.6/libtool --mode=compile cc -I/usr/include/php -I. -I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache_session.c -o memcache_session.lo cc -I/usr/include/php -I. 
-I/root/tmp/pear/memcache -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -c /root/tmp/pear/memcache/memcache_session.c -fPIC -DPIC -o .libs/memcache_session.o /bin/sh /root/tmp/pear-build-root/memcache-2.2.6/libtool --mode=link cc -DPHP_ATOM_INC -I/root/tmp/pear-build-root/memcache-2.2.6/include -I/root/tmp/pear-build-root/memcache-2.2.6/main -I/root/tmp/pear/memcache -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib -DHAVE_CONFIG_H -g -O2 -o memcache.la -export-dynamic -avoid-version -prefer-pic -module -rpath /root/tmp/pear-build-root/memcache-2.2.6/modules memcache.lo memcache_queue.lo memcache_standard_hash.lo memcache_consistent_hash.lo memcache_session.lo cc -shared .libs/memcache.o .libs/memcache_queue.o .libs/memcache_standard_hash.o .libs/memcache_consistent_hash.o .libs/memcache_session.o -Wl,-soname -Wl,memcache.so -o .libs/memcache.so creating memcache.la (cd .libs && rm -f memcache.la && ln -s ../memcache.la memcache.la) /bin/sh /root/tmp/pear-build-root/memcache-2.2.6/libtool --mode=install cp ./memcache.la /root/tmp/pear-build-root/memcache-2.2.6/modules cp ./.libs/memcache.so /root/tmp/pear-build-root/memcache-2.2.6/modules/memcache.so cp ./.libs/memcache.lai /root/tmp/pear-build-root/memcache-2.2.6/modules/memcache.la PATH="$PATH:/sbin" ldconfig -n /root/tmp/pear-build-root/memcache-2.2.6/modules ---------------------------------------------------------------------- Libraries have been installed in: /root/tmp/pear-build-root/memcache-2.2.6/modules If you ever happen to want to link against installed libraries in a given directory, LIBDIR, you must either use libtool, and specify the full pathname of the library, or use the `-LLIBDIR' flag during linking and do at least one of the following: - add LIBDIR to the `LD_LIBRARY_PATH' environment variable during execution - add LIBDIR to the `LD_RUN_PATH' environment variable during linking - use the `-Wl,--rpath -Wl,LIBDIR' linker flag - have your system administrator add LIBDIR to `/etc/ld.so.conf' See any operating system documentation about shared libraries for more information, such as the ld(1) and ld.so(8) manual pages. ---------------------------------------------------------------------- Build complete. Don't forget to run 'make test'. 
running: make INSTALL_ROOT="/root/tmp/pear-build-root/install-memcache-2.2.6" install Installing shared extensions: /root/tmp/pear-build-root/install-memcache-2.2.6/usr/lib/php/extensions/no-debug-non-zts-20090626/ running: find "/root/tmp/pear-build-root/install-memcache-2.2.6" | xargs ls -dils 361232 4 drwxr-xr-x 3 root root 4096 Jan 28 10:47 /root/tmp/pear-build-root/install-memcache-2.2.6 361263 4 drwxr-xr-x 3 root root 4096 Jan 28 10:47 /root/tmp/pear-build-root/install-memcache-2.2.6/usr 361264 4 drwxr-xr-x 3 root root 4096 Jan 28 10:47 /root/tmp/pear-build-root/install-memcache-2.2.6/usr/lib 361265 4 drwxr-xr-x 3 root root 4096 Jan 28 10:47 /root/tmp/pear-build-root/install-memcache-2.2.6/usr/lib/php 361266 4 drwxr-xr-x 3 root root 4096 Jan 28 10:47 /root/tmp/pear-build-root/install-memcache-2.2.6/usr/lib/php/extensions 361267 4 drwxr-xr-x 2 root root 4096 Jan 28 10:47 /root/tmp/pear-build-root/install-memcache-2.2.6/usr/lib/php/extensions/no-debug-non-zts-20090626 361262 236 -rwxr-xr-x 1 root root 235575 Jan 28 10:47 /root/tmp/pear-build-root/install-memcache-2.2.6/usr/lib/php/extensions/no-debug-non-zts-20090626/memcache.so Build process completed successfully Installing '/usr/lib/php/extensions/no-debug-non-zts-20090626/memcache.so' install ok: channel://pecl.php.net/memcache-2.2.6 Extension memcache enabled in php.ini The memcache.so object is not in /usr/local/lib/php/extensions/no-debug-non-zts-20090626 I tried as well to install this extension "memcached 1.0.2 (PHP extension for interfacing with memcached via libmemcached library)" but it failed: downloading memcached-1.0.2.tgz ... Starting to download memcached-1.0.2.tgz (22,724 bytes) ........done: 22,724 bytes 4 source files, building running: phpize Configuring for: PHP Api Version: 20090626 Zend Module Api No: 20090626 Zend Extension Api No: 220090626 building in /root/tmp/pear-build-root/memcached-1.0.2 running: /root/tmp/pear/memcached/configure checking for egrep... grep -E checking for a sed that does not truncate output... /bin/sed checking for cc... cc checking for C compiler default output file name... a.out checking whether the C compiler works... yes checking whether we are cross compiling... no checking for suffix of executables... checking for suffix of object files... o checking whether we are using the GNU C compiler... yes checking whether cc accepts -g... yes checking for cc option to accept ANSI C... none needed checking how to run the C preprocessor... cc -E checking for icc... no checking for suncc... no checking whether cc understands -c and -o together... yes checking for system library directory... lib checking if compiler supports -R... no checking if compiler supports -Wl,-rpath,... yes checking build system type... x86_64-unknown-linux-gnu checking host system type... x86_64-unknown-linux-gnu checking target system type... x86_64-unknown-linux-gnu checking for PHP prefix... /usr checking for PHP includes... -I/usr/include/php -I/usr/include/php/main -I/usr/include/php/TSRM -I/usr/include/php/Zend -I/usr/include/php/ext -I/usr/include/php/ext/date/lib checking for PHP extension directory... /usr/lib/php/extensions/no-debug-non-zts-20090626 checking for PHP installed headers prefix... /usr/include/php checking if debug is enabled... no checking if zts is enabled... no checking for re2c... re2c checking for re2c version... invalid configure: WARNING: You will need re2c 0.13.4 or later if you want to regenerate PHP parsers. checking for gawk... gawk checking whether to enable memcached support... 
yes, shared checking for libmemcached... yes, shared checking whether to enable memcached session handler support... yes checking whether to enable memcached igbinary serializer support... no checking for ZLIB... yes, shared checking for zlib location... /usr checking for session includes... /usr/include/php checking for memcached session support... enabled checking for memcached igbinary support... disabled checking for libmemcached location... configure: error: memcached support requires libmemcached. Use --with-libmemcached-dir= to specify the prefix where libmemcached headers and library are located ERROR: `/root/tmp/pear/memcached/configure' failed The memcached.so object is not in /usr/local/lib/php/extensions/no-debug-non-zts-20090626 Is there a kind soul out there that can solve this puzzle?

    Read the article

  • Apt-Get Update: failure to fetch; can't connect to any sources

    - by weberc2
    I realize there are dozens of "apt-get update: failure to fetch" questions (I read through all I could find), but my present circumstance is unique to 12.04 and it affects all sources; not just launchpad. Additionally, I've tried several different servers in Europe and the U.S. as well as the "main server" (wherever that is) and they all yield the same result: I can't connect to any software sources. Additionally, I'm fairly certain the problem stems from the upgrade from 11.10-12.04 I performed this morning, as updates worked immediately before. Updates from the Update Manager worked fine and I could download some things (mutter) from the Software Center without incident, which makes me think I can connect to some subset of the Ubuntu servers (however, several other Ubuntu servers--like extras--and some canonical servers are listed as 'unable to connect'). Here is the output from sudo apt-get update: sudo apt-get update Ign http://ftp.u-picardie.fr precise InRelease Ign http://archive.canonical.com precise InRelease Ign http://ftp.u-picardie.fr precise-updates InRelease Ign http://ftp.u-picardie.fr precise-backports InRelease Err http://ftp.u-picardie.fr precise-security InRelease Err http://ftp.u-picardie.fr precise Release.gpg Unable to connect to ftp.u-picardie.fr:http: Err http://ftp.u-picardie.fr precise-updates Release.gpg Unable to connect to ftp.u-picardie.fr:http: Err http://ftp.u-picardie.fr precise-backports Release.gpg Unable to connect to ftp.u-picardie.fr:http: Err http://ftp.u-picardie.fr precise-security Release.gpg Unable to connect to ftp.u-picardie.fr:http: Hit http://archive.canonical.com precise Release.gpg Hit http://archive.canonical.com precise Release Hit http://archive.canonical.com precise/partner i386 Packages Ign http://archive.canonical.com precise/partner TranslationIndex Ign http://dl.google.com stable InRelease Ign http://dl.google.com stable InRelease Err http://archive.canonical.com precise/partner Translation-en_US Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] Err http://archive.canonical.com precise/partner Translation-en Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] Ign http://extras.ubuntu.com precise InRelease Get:1 http://dl.google.com stable Release.gpg [198 B] Err http://extras.ubuntu.com precise Release.gpg Could not connect to extras.ubuntu.com:80 (91.189.88.33). 
- connect (111: Connection refused) Ign http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Get:2 http://dl.google.com stable Release.gpg [198 B] Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Get:3 http://dl.google.com stable Release [1,347 B] Get:4 http://dl.google.com stable Release [1,347 B] Get:5 http://dl.google.com stable/main i386 Packages [1,268 B] Ign http://dl.google.com stable/main TranslationIndex Get:6 http://dl.google.com stable/main i386 Packages [769 B] Ign http://dl.google.com stable/main TranslationIndex Ign http://dl.google.com stable/main Translation-en_US Ign http://dl.google.com stable/main Translation-en Ign http://dl.google.com stable/main Translation-en_US Ign http://dl.google.com stable/main Translation-en Fetched 5,127 B in 7s (673 B/s) Reading package lists... 
Done W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise-security/InRelease W: Failed to fetch http://ppa.launchpad.net/elementary-os/stable/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/elementaryart/elementary-dev/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/midori/ppa/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/nemequ/sqlheavy/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/ricotz/docky/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/sgringwe/beatbox/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/webupd8team/y-ppa-manager/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/yorba/ppa/ubuntu/dists/precise/InRelease W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise/Release.gpg Unable to connect to ftp.u-picardie.fr:http: W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise-updates/Release.gpg Unable to connect to ftp.u-picardie.fr:http: W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise-backports/Release.gpg Unable to connect to ftp.u-picardie.fr:http: W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise-security/Release.gpg Unable to connect to ftp.u-picardie.fr:http: W: Failed to fetch http://archive.canonical.com/ubuntu/dists/precise/partner/i18n/Translation-en_US Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] W: Failed to fetch http://archive.canonical.com/ubuntu/dists/precise/partner/i18n/Translation-en Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] W: Failed to fetch http://extras.ubuntu.com/ubuntu/dists/precise/Release.gpg Could not connect to extras.ubuntu.com:80 (91.189.88.33). - connect (111: Connection refused) W: Failed to fetch http://ppa.launchpad.net/caffeine-developers/ppa/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/elementary-os/stable/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/elementaryart/elementary-dev/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/midori/ppa/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/nemequ/sqlheavy/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/ricotz/docky/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/sgringwe/beatbox/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/webupd8team/y-ppa-manager/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/yorba/ppa/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Some index files failed to download. They have been ignored, or old ones used instead. 
W: Duplicate sources.list entry http://ppa.launchpad.net/nemequ/sqlheavy/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/ppa.launchpad.net_nemequ_sqlheavy_ubuntu_dists_precise_main_binary-i386_Packages) W: Duplicate sources.list entry http://ppa.launchpad.net/sgringwe/beatbox/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/ppa.launchpad.net_sgringwe_beatbox_ubuntu_dists_precise_main_binary-i386_Packages) Contents of /etc/apt/sources.list: # deb cdrom:[Ubuntu 11.10 _Oneiric Ocelot_ - Release i386 (20111012)]/ oneiric main restricted deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise main restricted #Added by software-properties # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to # newer versions of the distribution. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise main restricted deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise multiverse universe #Added by software-properties ## Major bug fix updates produced after the final release of the ## distribution. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-updates main restricted deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-updates restricted main multiverse universe #Added by software-properties ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team. Also, please note that software in universe WILL NOT receive any ## review or updates from the Ubuntu security team. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise universe deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-updates universe ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team, and may not be under a free licence. Please satisfy yourself as to ## your rights to use the software. Also, please note that software in ## multiverse WILL NOT receive any review or updates from the Ubuntu ## security team. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise multiverse deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-updates multiverse ## N.B. software from this repository may not have been tested as ## extensively as that contained in the main release, although it includes ## newer versions of some applications which may provide useful features. ## Also, please note that software in backports WILL NOT receive any review ## or updates from the Ubuntu security team. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-backports main restricted universe multiverse deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-backports main restricted universe multiverse #Added by software-properties deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-security main restricted deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-security restricted main multiverse universe #Added by software-properties deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-security universe deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-security multiverse ## Uncomment the following two lines to add software from Canonical's ## 'partner' repository. ## This software is not part of Ubuntu, but is offered by Canonical and the ## respective vendors as a service to Ubuntu users. # deb http://archive.canonical.com/ubuntu oneiric partner # deb-src http://archive.canonical.com/ubuntu oneiric partner ## This software is not part of Ubuntu, but is offered by third-party ## developers who want to ship their latest software. 
deb http://extras.ubuntu.com/ubuntu precise main deb-src http://extras.ubuntu.com/ubuntu precise main Testing Alternate sources.list file These are the steps I followed to produce the following output: Please backup your sources.list: sudo cp /etc/apt/sources.list /etc/apt/sources.list.backup and then replace the contents of /etc/apt/sources.list with the below lines and run apt-get update: deb http://archive.ubuntu.com/ubuntu/ precise main restricted universe multiverse deb http://archive.ubuntu.com/ubuntu/ precise-updates main restricted universe multiverse deb http://archive.ubuntu.com/ubuntu/ precise-backports main restricted universe multiverse deb http://security.ubuntu.com/ubuntu precise-security main restricted universe multiverse deb http://archive.canonical.com/ubuntu precise partner deb http://extras.ubuntu.com/ubuntu precise main Output: someone@someone-UBook:~$ sudo apt-get update Ign http://archive.canonical.com precise InRelease Hit http://archive.canonical.com precise Release.gpg Hit http://archive.canonical.com precise Release Ign http://archive.ubuntu.com precise InRelease Ign http://extras.ubuntu.com precise InRelease Ign http://archive.ubuntu.com precise-updates InRelease Hit http://archive.canonical.com precise/partner i386 Packages Hit http://extras.ubuntu.com precise Release.gpg Ign http://archive.ubuntu.com precise-backports InRelease Ign http://archive.canonical.com precise/partner TranslationIndex Err http://archive.canonical.com precise/partner Translation-en_US Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] Err http://archive.canonical.com precise/partner Translation-en Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] Hit http://extras.ubuntu.com precise Release Get:1 http://archive.ubuntu.com precise Release.gpg [198 B] Ign http://dl.google.com stable InRelease Err http://dl.google.com stable InRelease Err http://dl.google.com stable Release.gpg Unable to connect to dl.google.com:http: [IP: 173.194.34.38 80] Err http://dl.google.com stable Release.gpg Unable to connect to dl.google.com:http: [IP: 173.194.34.38 80] Get:2 http://archive.ubuntu.com precise-updates Release.gpg [198 B] Hit http://extras.ubuntu.com precise/main i386 Packages Get:3 http://archive.ubuntu.com precise-backports Release.gpg [198 B] Ign http://security.ubuntu.com precise-security InRelease Ign http://extras.ubuntu.com precise/main TranslationIndex Err http://extras.ubuntu.com precise/main Translation-en_US Unable to connect to extras.ubuntu.com:http: Err http://extras.ubuntu.com precise/main Translation-en Unable to connect to extras.ubuntu.com:http: Get:4 http://security.ubuntu.com precise-security Release.gpg [198 B] Get:5 http://archive.ubuntu.com precise Release [49.6 kB] Get:6 http://security.ubuntu.com precise-security Release [49.6 kB] Get:7 http://archive.ubuntu.com precise-updates Release [49.6 kB] Get:8 http://archive.ubuntu.com precise-backports Release [49.6 kB] Get:9 http://security.ubuntu.com precise-security/main i386 Packages [32.9 kB] Get:10 http://archive.ubuntu.com precise/main i386 Packages [1,274 kB] Get:11 http://security.ubuntu.com precise-security/restricted i386 Packages [14 B] Get:12 http://security.ubuntu.com precise-security/universe i386 Packages [8,594 B] Get:13 http://security.ubuntu.com precise-security/multiverse i386 Packages [1,393 B] Get:14 http://security.ubuntu.com precise-security/main TranslationIndex [73 B] Get:15 http://security.ubuntu.com precise-security/multiverse TranslationIndex [71 B] Get:16 
http://security.ubuntu.com precise-security/restricted TranslationIndex [70 B] Get:17 http://security.ubuntu.com precise-security/universe TranslationIndex [72 B] Get:18 http://security.ubuntu.com precise-security/main Translation-en [13.6 kB] Get:19 http://security.ubuntu.com precise-security/multiverse Translation-en [587 B] Get:20 http://security.ubuntu.com precise-security/restricted Translation-en [14 B] Get:21 http://security.ubuntu.com precise-security/universe Translation-en [6,261 B] Get:22 http://archive.ubuntu.com precise/restricted i386 Packages [8,431 B] Get:23 http://archive.ubuntu.com precise/universe i386 Packages [4,796 kB] Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Get:24 http://ppa.launchpad.net precise Release.gpg [316 B] Get:25 http://ppa.launchpad.net precise Release.gpg [316 B] Get:26 http://ppa.launchpad.net precise Release.gpg [316 B] Ign http://ppa.launchpad.net precise Release.gpg Get:27 http://ppa.launchpad.net precise Release.gpg [316 B] Hit http://ppa.launchpad.net precise Release.gpg Get:28 http://ppa.launchpad.net precise Release.gpg [316 B] Get:29 http://ppa.launchpad.net precise Release.gpg [316 B] Hit http://ppa.launchpad.net precise Release.gpg Get:30 http://ppa.launchpad.net precise Release.gpg [316 B] Hit http://ppa.launchpad.net precise Release.gpg Get:31 http://ppa.launchpad.net precise Release [11.9 kB] Get:32 http://ppa.launchpad.net precise Release [11.9 kB] Get:33 http://archive.ubuntu.com precise/multiverse i386 Packages [121 kB] Get:34 http://ppa.launchpad.net precise Release [11.9 kB] Ign http://ppa.launchpad.net precise Release Get:35 http://ppa.launchpad.net precise Release [11.9 kB] Hit http://archive.ubuntu.com precise/main TranslationIndex Hit http://archive.ubuntu.com precise/multiverse TranslationIndex Hit http://ppa.launchpad.net precise Release Hit http://archive.ubuntu.com precise/restricted TranslationIndex Get:36 http://ppa.launchpad.net precise Release [11.9 kB] Hit http://archive.ubuntu.com precise/universe TranslationIndex Get:37 http://ppa.launchpad.net precise Release [11.9 kB] Get:38 http://archive.ubuntu.com precise-updates/main i386 Packages [96.5 kB] Hit http://ppa.launchpad.net precise Release Get:39 http://ppa.launchpad.net precise Release [11.9 kB] Get:40 http://archive.ubuntu.com precise-updates/restricted i386 Packages [770 B] Hit http://ppa.launchpad.net precise Release Get:41 http://archive.ubuntu.com precise-updates/universe i386 Packages [27.7 kB] Get:42 http://ppa.launchpad.net precise/main Sources [524 B] Get:43 http://archive.ubuntu.com precise-updates/multiverse i386 Packages [1,393 B] Get:44 http://ppa.launchpad.net precise/main i386 Packages [507 B] Hit http://archive.ubuntu.com precise-updates/main TranslationIndex Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-updates/multiverse TranslationIndex Hit http://archive.ubuntu.com precise-updates/restricted TranslationIndex Get:45 http://ppa.launchpad.net precise/main Sources [932 B] Hit http://archive.ubuntu.com precise-updates/universe 
TranslationIndex Get:46 http://ppa.launchpad.net precise/main i386 Packages [1,017 B] Get:47 http://archive.ubuntu.com precise-backports/main i386 Packages [559 B] Ign http://ppa.launchpad.net precise/main TranslationIndex Get:48 http://archive.ubuntu.com precise-backports/restricted i386 Packages [14 B] Get:49 http://archive.ubuntu.com precise-backports/universe i386 Packages [1,391 B] Get:50 http://ppa.launchpad.net precise/main Sources [1,402 B] Get:51 http://archive.ubuntu.com precise-backports/multiverse i386 Packages [14 B] Hit http://archive.ubuntu.com precise-backports/main TranslationIndex Get:52 http://ppa.launchpad.net precise/main i386 Packages [1,605 B] Hit http://archive.ubuntu.com precise-backports/multiverse TranslationIndex Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-backports/restricted TranslationIndex Hit http://archive.ubuntu.com precise-backports/universe TranslationIndex Hit http://archive.ubuntu.com precise/main Translation-en Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise/multiverse Translation-en Get:53 http://ppa.launchpad.net precise/main Sources [931 B] Hit http://archive.ubuntu.com precise/restricted Translation-en Get:54 http://ppa.launchpad.net precise/main i386 Packages [1,079 B] Hit http://archive.ubuntu.com precise/universe Translation-en Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-updates/main Translation-en Hit http://ppa.launchpad.net precise/main Sources Hit http://archive.ubuntu.com precise-updates/multiverse Translation-en Hit http://ppa.launchpad.net precise/main i386 Packages Hit http://archive.ubuntu.com precise-updates/restricted Translation-en Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-updates/universe Translation-en Get:55 http://ppa.launchpad.net precise/main Sources [3,611 B] Hit http://archive.ubuntu.com precise-backports/main Translation-en Get:56 http://ppa.launchpad.net precise/main i386 Packages [2,468 B] Hit http://archive.ubuntu.com precise-backports/multiverse Translation-en Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-backports/restricted Translation-en Hit http://archive.ubuntu.com precise-backports/universe Translation-en Get:57 http://ppa.launchpad.net precise/main Sources [1,524 B] Get:58 http://ppa.launchpad.net precise/main i386 Packages [2,719 B] Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://ppa.launchpad.net precise/main Sources Hit http://ppa.launchpad.net precise/main i386 Packages Ign http://ppa.launchpad.net precise/main TranslationIndex Get:59 http://ppa.launchpad.net precise/main Sources [1,052 B] Get:60 http://ppa.launchpad.net precise/main i386 Packages [1,388 B] Ign http://ppa.launchpad.net precise/main TranslationIndex Get:61 http://ppa.launchpad.net precise/main Sources [1,185 B] Get:62 http://ppa.launchpad.net precise/main i386 Packages [1,698 B] Ign http://ppa.launchpad.net precise/main TranslationIndex Err http://ppa.launchpad.net precise/main Sources 404 Not Found Err http://ppa.launchpad.net precise/main i386 Packages 404 Not Found Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign 
http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Fetched 6,699 kB in 15s (445 kB/s) Reading package lists... Done W: Failed to fetch http://dl.google.com/linux/talkplugin/deb/dists/stable/InRelease W: Failed to fetch http://archive.canonical.com/ubuntu/dists/precise/partner/i18n/Translation-en_US Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] W: Failed to fetch http://archive.canonical.com/ubuntu/dists/precise/partner/i18n/Translation-en Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] W: Failed to fetch http://dl.google.com/linux/chrome/deb/dists/sta

    Read the article

  • Why is Java EE 6 better than Spring?

    - by arungupta
    Java EE 6 was released over 2 years ago and now there are 14 compliant application servers. In all my talks around the world, a question that is frequently asked is Why should I use Java EE 6 instead of Spring ? There are already several blogs covering that topic: Java EE wins over Spring by Bill Burke Why will I use Java EE instead of Spring in new Enterprise Java projects in 2012 ? by Kai Waehner (more discussion on TSS) Spring to Java EE migration (Part 1 and 2, 3 and 4 coming as well) by David Heffelfinger Spring to Java EE - A Migration Experience by Lincoln Baxter Migrating Spring to Java EE 6 by Bert Ertman and Paul Bakker at NLJUG Moving from Spring to Java EE 6 - The Age of Frameworks is Over at TSS Java EE vs Spring Shootout by Rohit Kelapure and Reza Rehman at JavaOne 2011 Java EE 6 and the Ewoks by Murat Yener Definite excuse to avoid Spring forever - Bert Ertman and Arun Gupta I will try to share my perspective in this blog. First of all, I'd like to start with a note: Thank you Spring framework for filling the interim gap and providing functionality that is now included in the mainstream Java EE 6 application servers. The Java EE platform has evolved over the years learning from frameworks like Spring and provides all the functionality to build an enterprise application. Thank you very much Spring framework! While Spring was revolutionary in its time and is still very popular and quite main stream in the same way Struts was circa 2003, it really is last generation's framework - some people are even calling it legacy. However my theory is "code is king". So my approach is to build/take a simple Hello World CRUD application in Java EE 6 and Spring and compare the deployable artifacts. I started looking at the official tutorial Developing a Spring Framework MVC Application Step-by-Step but it is using the older version 2.5. I wasn't able to find any updated version in the current 3.1 release. Next, I downloaded Spring Tool Suite and thought that would provide some template samples to get started. A least a quick search did not show any handy tutorials - either video or text-based. So I searched and found a link to their SVN repository at src.springframework.org/svn/spring-samples/. I tried the "mvc-basic" sample and the generated WAR file was 4.43 MB. While it was named a "basic" sample it seemed to come with 19 different libraries bundled but it was what I could find: ./WEB-INF/lib/aopalliance-1.0.jar./WEB-INF/lib/hibernate-validator-4.1.0.Final.jar./WEB-INF/lib/jcl-over-slf4j-1.6.1.jar./WEB-INF/lib/joda-time-1.6.2.jar./WEB-INF/lib/joda-time-jsptags-1.0.2.jar./WEB-INF/lib/jstl-1.2.jar./WEB-INF/lib/log4j-1.2.16.jar./WEB-INF/lib/slf4j-api-1.6.1.jar./WEB-INF/lib/slf4j-log4j12-1.6.1.jar./WEB-INF/lib/spring-aop-3.0.5.RELEASE.jar./WEB-INF/lib/spring-asm-3.0.5.RELEASE.jar./WEB-INF/lib/spring-beans-3.0.5.RELEASE.jar./WEB-INF/lib/spring-context-3.0.5.RELEASE.jar./WEB-INF/lib/spring-context-support-3.0.5.RELEASE.jar./WEB-INF/lib/spring-core-3.0.5.RELEASE.jar./WEB-INF/lib/spring-expression-3.0.5.RELEASE.jar./WEB-INF/lib/spring-web-3.0.5.RELEASE.jar./WEB-INF/lib/spring-webmvc-3.0.5.RELEASE.jar./WEB-INF/lib/validation-api-1.0.0.GA.jar And it is not even using any database! The app deployed fine on GlassFish 3.1.2 but the "@Controller Example" link did not work as it was missing the context root. With a bit of tweaking I could deploy the application and assume that the account got created because no error was displayed in the browser or server log. 
Next I generated the WAR for "mvc-ajax" and the 5.1 MB WAR had 20 JARs (1 removed, 2 added): ./WEB-INF/lib/aopalliance-1.0.jar./WEB-INF/lib/hibernate-validator-4.1.0.Final.jar./WEB-INF/lib/jackson-core-asl-1.6.4.jar./WEB-INF/lib/jackson-mapper-asl-1.6.4.jar./WEB-INF/lib/jcl-over-slf4j-1.6.1.jar./WEB-INF/lib/joda-time-1.6.2.jar./WEB-INF/lib/jstl-1.2.jar./WEB-INF/lib/log4j-1.2.16.jar./WEB-INF/lib/slf4j-api-1.6.1.jar./WEB-INF/lib/slf4j-log4j12-1.6.1.jar./WEB-INF/lib/spring-aop-3.0.5.RELEASE.jar./WEB-INF/lib/spring-asm-3.0.5.RELEASE.jar./WEB-INF/lib/spring-beans-3.0.5.RELEASE.jar./WEB-INF/lib/spring-context-3.0.5.RELEASE.jar./WEB-INF/lib/spring-context-support-3.0.5.RELEASE.jar./WEB-INF/lib/spring-core-3.0.5.RELEASE.jar./WEB-INF/lib/spring-expression-3.0.5.RELEASE.jar./WEB-INF/lib/spring-web-3.0.5.RELEASE.jar./WEB-INF/lib/spring-webmvc-3.0.5.RELEASE.jar./WEB-INF/lib/validation-api-1.0.0.GA.jar 2 more JARs for just doing Ajax. Anyway, deploying this application gave the following error: Caused by: java.lang.NoSuchMethodError: org.codehaus.jackson.map.SerializationConfig.<init>(Lorg/codehaus/jackson/map/ClassIntrospector;Lorg/codehaus/jackson/map/AnnotationIntrospector;Lorg/codehaus/jackson/map/introspect/VisibilityChecker;Lorg/codehaus/jackson/map/jsontype/SubtypeResolver;)V    at org.springframework.samples.mvc.ajax.json.ConversionServiceAwareObjectMapper.<init>(ConversionServiceAwareObjectMapper.java:20)    at org.springframework.samples.mvc.ajax.json.JacksonConversionServiceConfigurer.postProcessAfterInitialization(JacksonConversionServiceConfigurer.java:40)    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsAfterInitialization(AbstractAutowireCapableBeanFactory.java:407) Seems like some incorrect repos in the "pom.xml". Next one is "mvc-showcase" and the 6.49 MB WAR now has 28 JARs as shown below: ./WEB-INF/lib/aopalliance-1.0.jar./WEB-INF/lib/aspectjrt-1.6.10.jar./WEB-INF/lib/commons-fileupload-1.2.2.jar./WEB-INF/lib/commons-io-2.0.1.jar./WEB-INF/lib/el-api-2.2.jar./WEB-INF/lib/hibernate-validator-4.1.0.Final.jar./WEB-INF/lib/jackson-core-asl-1.8.1.jar./WEB-INF/lib/jackson-mapper-asl-1.8.1.jar./WEB-INF/lib/javax.inject-1.jar./WEB-INF/lib/jcl-over-slf4j-1.6.1.jar./WEB-INF/lib/jdom-1.0.jar./WEB-INF/lib/joda-time-1.6.2.jar./WEB-INF/lib/jstl-api-1.2.jar./WEB-INF/lib/jstl-impl-1.2.jar./WEB-INF/lib/log4j-1.2.16.jar./WEB-INF/lib/rome-1.0.0.jar./WEB-INF/lib/slf4j-api-1.6.1.jar./WEB-INF/lib/slf4j-log4j12-1.6.1.jar./WEB-INF/lib/spring-aop-3.1.0.RELEASE.jar./WEB-INF/lib/spring-asm-3.1.0.RELEASE.jar./WEB-INF/lib/spring-beans-3.1.0.RELEASE.jar./WEB-INF/lib/spring-context-3.1.0.RELEASE.jar./WEB-INF/lib/spring-context-support-3.1.0.RELEASE.jar./WEB-INF/lib/spring-core-3.1.0.RELEASE.jar./WEB-INF/lib/spring-expression-3.1.0.RELEASE.jar./WEB-INF/lib/spring-web-3.1.0.RELEASE.jar./WEB-INF/lib/spring-webmvc-3.1.0.RELEASE.jar./WEB-INF/lib/validation-api-1.0.0.GA.jar The app at least deployed and showed results this time. But still no database! 
Next I tried building "jpetstore" and got the error: [ERROR] Failed to execute goal on project org.springframework.samples.jpetstore:Could not resolve dependencies for project org.springframework.samples:org.springframework.samples.jpetstore:war:1.0.0-SNAPSHOT: Failed to collect dependencies for [commons-fileupload:commons-fileupload:jar:1.2.1 (compile), org.apache.struts:com.springsource.org.apache.struts:jar:1.2.9 (compile), javax.xml.rpc:com.springsource.javax.xml.rpc:jar:1.1.0 (compile), org.apache.commons:com.springsource.org.apache.commons.dbcp:jar:1.2.2.osgi (compile), commons-io:commons-io:jar:1.3.2 (compile), hsqldb:hsqldb:jar:1.8.0.7 (compile), org.apache.tiles:tiles-core:jar:2.2.0 (compile), org.apache.tiles:tiles-jsp:jar:2.2.0 (compile), org.tuckey:urlrewritefilter:jar:3.1.0 (compile), org.springframework:spring-webmvc:jar:3.0.0.BUILD-SNAPSHOT (compile), org.springframework:spring-orm:jar:3.0.0.BUILD-SNAPSHOT (compile), org.springframework:spring-context-support:jar:3.0.0.BUILD-SNAPSHOT (compile), org.springframework.webflow:spring-js:jar:2.0.7.RELEASE (compile), org.apache.ibatis:com.springsource.com.ibatis:jar:2.3.4.726 (runtime), com.caucho:com.springsource.com.caucho:jar:3.2.1 (compile), org.apache.axis:com.springsource.org.apache.axis:jar:1.4.0 (compile), javax.wsdl:com.springsource.javax.wsdl:jar:1.6.1 (compile), javax.servlet:jstl:jar:1.2 (runtime), org.aspectj:aspectjweaver:jar:1.6.5 (compile), javax.servlet:servlet-api:jar:2.5 (provided), javax.servlet.jsp:jsp-api:jar:2.1 (provided), junit:junit:jar:4.6 (test)]: Failed to read artifact descriptor for org.springframework:spring-webmvc:jar:3.0.0.BUILD-SNAPSHOT: Could not transfer artifact org.springframework:spring-webmvc:pom:3.0.0.BUILD-SNAPSHOT from/to JBoss repository (http://repository.jboss.com/maven2): Access denied to: http://repository.jboss.com/maven2/org/springframework/spring-webmvc/3.0.0.BUILD-SNAPSHOT/spring-webmvc-3.0.0.BUILD-SNAPSHOT.pom It appears the sample is broken - maybe I was pulling from the wrong repository - would be great if someone were to point me at a good target to use here. With a 50% hit on samples in this repository, I started searching through numerous blogs, most of which have either outdated information (using XML-heavy Spring 2.5), some piece of configuration (which is a typical "feature" of Spring) is missing, or too much complexity in the sample. I finally found this blog that worked like a charm. This blog creates a trivial Spring MVC 3 application using Hibernate and MySQL. This application performs CRUD operations on a single table in a database using typical Spring technologies.  I downloaded the sample code from the blog, deployed it on GlassFish 3.1.2 and could CRUD the "person" entity. The source code for this application can be downloaded here. More details on the application statistics below. And then I built a similar CRUD application in Java EE 6 using NetBeans wizards in a couple of minutes. The source code for the application can be downloaded here and the WAR here. The Spring Source Tool Suite may also offer similar wizard-driven capabilities but this blog focus primarily on comparing the runtimes. The lack of STS tutorials was slightly disappointing as well. NetBeans however has tons of text-based and video tutorials and tons of material even by the community. One more bit on the download size of tools bundle ... 
NetBeans 7.1.1 "All" is 211 MB (which includes GlassFish and Tomcat) Spring Tool Suite  2.9.0 is 347 MB (~ 65% bigger) This blog is not about the tooling comparison so back to the Java EE 6 version of the application .... In order to run the Java EE version on GlassFish, copy the MySQL Connector/J to glassfish3/glassfish/domains/domain1/lib/ext directory and create a JDBC connection pool and JDBC resource as: ./bin/asadmin create-jdbc-connection-pool --datasourceclassname \\ com.mysql.jdbc.jdbc2.optional.MysqlDataSource --restype \\ javax.sql.DataSource --property \\ portNumber=3306:user=mysql:password=mysql:databaseName=mydatabase \\ myConnectionPool ./bin/asadmin create-jdbc-resource --connectionpoolid myConnectionPool jdbc/myDataSource I generated WARs for the two projects and the table below highlights some differences between them: Java EE 6 Spring WAR File Size 0.021030 MB 10.87 MB (~516x) Number of files 20 53 (> 2.5x) Bundled libraries 0 36 Total size of libraries 0 12.1 MB XML files 3 5 LoC in XML files 50 (11 + 15 + 24) 129 (27 + 46 + 16 + 11 + 19) (~ 2.5x) Total .properties files 1 Bundle.properties 2 spring.properties, log4j.properties Cold Deploy 5,339 ms 11,724 ms Second Deploy 481 ms 6,261 ms Third Deploy 528 ms 5,484 ms Fourth Deploy 484 ms 5,576 ms Runtime memory ~73 MB ~101 MB Some points worth highlighting from the table ... 516x WAR file, 10x deployment time - With 12.1 MB of libraries (for a very basic application) bundled in your application, the WAR file size and the deployment time will naturally go higher. The WAR file for Spring-based application is 516x bigger and the deployment time is double during the first deployment and ~ 10x during subsequent deployments. The Java EE 6 application is fully portable and will run on any Java EE 6 compliant application server. 36 libraries in the WAR - There are 14 Java EE 6 compliant application servers today. Each of those servers provide all the functionality like transactions, dependency injection, security, persistence, etc typically required of an enterprise or web application. There is no need to bundle 36 libraries worth 12.1 MB for a trivial CRUD application. These 14 compliant application servers provide all the functionality baked in. Now you can also deploy these libraries in the container but then you don't get the "portability" offered by Spring in that case. Does your typical Spring deployment actually do that ? 3x LoC in XML - The number of XML files is about 1.6x and the LoC is ~ 2.5x. So much XML seems circa 2003 when the Java language had no annotations. The XML files can be further reduced, e.g. faces-config.xml can be replaced without providing i18n, but I just want to compare stock applications. Memory usage - Both the applications were deployed on default GlassFish 3.1.2 installation and any additional memory consumed as part of deployment/access was attributed to the application. This is by no means scientific but at least provides an initial ballpark. This area definitely needs more investigation. Another table that compares typical Java EE 6 compliant application servers and the custom-stack created for a Spring application ... Java EE 6 Spring Web Container ? 53 MB (tcServer 2.6.3 Developer Edition) Security ? 12 MB (Spring Security 3.1.0) Persistence ? 6.3 MB (Hibernate 4.1.0, required) Dependency Injection ? 5.3 MB (Framework) Web Services ? 796 KB (Spring WS 2.0.4) Messaging ? 3.4 MB (RabbitMQ Server 2.7.1) 936 KB (Java client 936) OSGi ? 
1.3 MB (Spring OSGi 1.2.1) GlassFish and WebLogic (starting at 33 MB) 83.3 MB There are differentiating factors on both the stacks. But most of the functionality like security, persistence, and dependency injection is baked in a Java EE 6 compliant application server but needs to be individually managed and patched for a Spring application. This very quickly leads to a "stack explosion". The Java EE 6 servers are tested extensively on a variety of platforms in different combinations whereas a Spring application developer is responsible for testing with different JDKs, Operating Systems, Versions, Patches, etc. Oracle has both the leading OSS lightweight server with GlassFish and the leading enterprise Java server with WebLogic Server, both Java EE 6 and both with lightweight deployment options. The Web Container offered as part of a Java EE 6 application server not only deploys your enterprise Java applications but also provide operational management, diagnostics, and mission-critical capabilities required by your applications. The Java EE 6 platform also introduced the Web Profile which is a subset of the specifications from the entire platform. It is targeted at developers of modern web applications offering a reasonably complete stack, composed of standard APIs, and is capable out-of-the-box of addressing the needs of a large class of Web applications. As your applications grow, the stack can grow to the full Java EE 6 platform. The GlassFish Server Web Profile starting at 33MB (smaller than just the non-standard tcServer) provides most of the functionality typically required by a web application. WebLogic provides battle-tested functionality for a high throughput, low latency, and enterprise grade web application. No individual managing or patching, all tested and commercially supported for you! Note that VMWare does have a server, tcServer, but it is non-standard and not even certified to the level of the standard Web Profile most customers expect these days. Customers who choose this risk proprietary lock-in since VMWare does not seem to want to formally certify with either Java EE 6 Enterprise Platform or with Java EE 6 Web Profile but of course it would be great if they were to join the community and help their customers reduce the risk of deploying on VMWare software. Some more points to help you decide choose between Java EE 6 and Spring ... Freedom to choose container - There are 14 Java EE 6 compliant application servers today, with a variety of open source and commercial offerings. A Java EE 6 application can be deployed on any of those containers. So if you deployed your application on GlassFish today and would like to scale up with your demands then you can deploy the same application to WebLogic. And because of the portability of a Java EE 6 application, you can even take it a different vendor altogether. Spring requires a runtime which could be any of these app servers as well. But why use Spring when all the required functionality is already baked into the application server itself ? Spring also has a different definition of portability where they claim to bundle all the libraries in the WAR file and move to any application server. But we saw earlier how bloated that archive could be. The equivalent features in Spring runtime offerings (mainly tcServer) are not all open source, not as mature, and often require manual assembly.  
Vendor choice - The Java EE 6 platform is created through the Java Community Process, where all the big players like Oracle, IBM, RedHat, and Apache are contributing to make the platform successful. Each application server provides the base Java EE 6 platform compliance plus its own competitive offerings. This lets you choose an application server for deploying your Java EE 6 applications, and if you are not happy with the support or features of one vendor, you can move your application to a different vendor because of the portability promise offered by the platform. Spring is a set of products from a single company: one price book, one support organization, one sustaining organization, one sales organization, etc. If any of those cause a customer headache, where do you go? Java EE, backed by multiple vendors, is a safer bet for those who are risk averse.

Production support - With Spring, you typically need to get support from two vendors: VMWare and the container provider. With Java EE 6, all of this is typically provided by one vendor. For example, Oracle offers commercial support for the systems, operating system, JDK, application server, and the applications on top of them. VMWare certainly offers complete production support, but do you really want to put all your eggs in one basket? Do you really use tcServer? ;-)

Maintainability - With Spring, you are likely building your own distribution with multiple JAR files - integrating, patching, and versioning all those components yourself. Spring's claim is that multiple JAR files allow you to go à la carte and pick the latest versions of the different components. But who is responsible for testing whether all those versions work together? Yep, you got it: you! If something does not work, who patches and maintains the JARs? Again, you. Commercial support for such a configuration? You're on your own. The Java EE application servers manage all of this for you and provide a well-tested and commercially supported bundle.

While it is always good to realize that there is something new and improved that updates and replaces older frameworks like Spring, the good news is that a Java EE 6 container not only offers what is described here - most will also let you deploy and run your existing Spring applications while you go through an upgrade to a more modern architecture. End result: you get the best of both worlds, keeping your legacy investment while moving to the more agile, lightweight world of Java EE 6.

A message to the Spring lovers ... the complexity in J2EE 1.2, 1.3, and 1.4 led to the genesis of Spring, but that was in 2004. This is 2012, and the name has changed to "Java EE 6" :-) There are tons of improvements in the Java EE platform to make it easy to use and powerful. Some examples (a short sketch of a couple of these annotations follows below):

Adding @Stateless on a POJO makes it an EJB
EJBs can be packaged in a WAR with no special packaging or deployment descriptors
"web.xml" and "faces-config.xml" are optional in most of the common cases
Typesafe dependency injection is now part of the Java EE platform
Adding @Path on a POJO allows you to publish it as a RESTful resource
EJBs can be used as backing beans for Facelets-driven JSF pages, providing full MVC
Java EE 6 WARs are known to be kilobytes in size and deployed in milliseconds
Tons of other simplifications in the platform and application servers

So if you moved away from J2EE to Spring many years ago and have not looked at Java EE 6 (which has been out since Dec 2009), you should definitely try it out.
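To make that concrete, here is a minimal sketch of what a couple of those annotations look like in practice. This is illustrative only: the class name, path, and query are hypothetical and not taken from the sample applications above, and it assumes a Person JPA entity exists (and, under JAX-RS 1.1, is JAXB-annotated so it can be marshalled to JSON), plus a JAX-RS Application subclass or web.xml entry to activate the REST endpoints.

import java.util.List;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.ws.rs.Consumes;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;

// @Stateless turns this POJO into a transactional, pooled EJB - no ejb-jar.xml,
// no interfaces, and it can be packaged directly inside the WAR.
// @Path additionally publishes the same bean as a RESTful resource.
@Stateless
@Path("people")
public class PersonResource {

    // The container injects a managed EntityManager; no manual DataSource lookup.
    @PersistenceContext
    private EntityManager em;

    @GET
    @Produces("application/json")
    public List<Person> list() {
        return em.createQuery("select p from Person p", Person.class).getResultList();
    }

    @POST
    @Consumes("application/json")
    public void add(Person person) {
        // Runs in a container-managed transaction by default (EJB semantics).
        em.persist(person);
    }
}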
Just be aware of what other alternatives are available instead of restricting yourself to one stack. Here are some workshops and screencasts worth trying:

screencast #37 shows how to build an end-to-end application using NetBeans
screencast #36 builds the same application using Eclipse
javaee-lab-feb2012.pdf is a 3-4 hour self-paced, hands-on workshop that guides you through building a comprehensive Java EE 6 application using NetBeans

Each city generally has a "spring cleanup" program every year that lets you clean the mess out of your house. For your software projects, you don't need to wait for an annual event - just get started and reduce the technical debt now! Move away from your legacy Spring-based applications to a lighter and more modern approach to building enterprise Java applications using Java EE 6. Watch this beautiful presentation that explains how to migrate from Spring to Java EE 6.

List of files in the Java EE 6 project:

./index.xhtml
./META-INF
./person
./person/Create.xhtml
./person/Edit.xhtml
./person/List.xhtml
./person/View.xhtml
./resources
./resources/css
./resources/css/jsfcrud.css
./template.xhtml
./WEB-INF
./WEB-INF/classes
./WEB-INF/classes/Bundle.properties
./WEB-INF/classes/META-INF
./WEB-INF/classes/META-INF/persistence.xml
./WEB-INF/classes/org
./WEB-INF/classes/org/javaee
./WEB-INF/classes/org/javaee/javaeemysql
./WEB-INF/classes/org/javaee/javaeemysql/AbstractFacade.class
./WEB-INF/classes/org/javaee/javaeemysql/Person.class
./WEB-INF/classes/org/javaee/javaeemysql/Person_.class
./WEB-INF/classes/org/javaee/javaeemysql/PersonController$1.class
./WEB-INF/classes/org/javaee/javaeemysql/PersonController$PersonControllerConverter.class
./WEB-INF/classes/org/javaee/javaeemysql/PersonController.class
./WEB-INF/classes/org/javaee/javaeemysql/PersonFacade.class
./WEB-INF/classes/org/javaee/javaeemysql/util
./WEB-INF/classes/org/javaee/javaeemysql/util/JsfUtil.class
./WEB-INF/classes/org/javaee/javaeemysql/util/PaginationHelper.class
./WEB-INF/faces-config.xml
./WEB-INF/web.xml

List of files in the Spring 3.x project:

./META-INF
./META-INF/MANIFEST.MF
./WEB-INF
./WEB-INF/applicationContext.xml
./WEB-INF/classes
./WEB-INF/classes/log4j.properties
./WEB-INF/classes/org
./WEB-INF/classes/org/krams
./WEB-INF/classes/org/krams/tutorial
./WEB-INF/classes/org/krams/tutorial/controller
./WEB-INF/classes/org/krams/tutorial/controller/MainController.class
./WEB-INF/classes/org/krams/tutorial/domain
./WEB-INF/classes/org/krams/tutorial/domain/Person.class
./WEB-INF/classes/org/krams/tutorial/service
./WEB-INF/classes/org/krams/tutorial/service/PersonService.class
./WEB-INF/hibernate-context.xml
./WEB-INF/hibernate.cfg.xml
./WEB-INF/jsp
./WEB-INF/jsp/addedpage.jsp
./WEB-INF/jsp/addpage.jsp
./WEB-INF/jsp/deletedpage.jsp
./WEB-INF/jsp/editedpage.jsp
./WEB-INF/jsp/editpage.jsp
./WEB-INF/jsp/personspage.jsp
./WEB-INF/lib
./WEB-INF/lib/antlr-2.7.6.jar
./WEB-INF/lib/aopalliance-1.0.jar
./WEB-INF/lib/c3p0-0.9.1.2.jar
./WEB-INF/lib/cglib-nodep-2.2.jar
./WEB-INF/lib/commons-beanutils-1.8.3.jar
./WEB-INF/lib/commons-collections-3.2.1.jar
./WEB-INF/lib/commons-digester-2.1.jar
./WEB-INF/lib/commons-logging-1.1.1.jar
./WEB-INF/lib/dom4j-1.6.1.jar
./WEB-INF/lib/ejb3-persistence-1.0.2.GA.jar
./WEB-INF/lib/hibernate-annotations-3.4.0.GA.jar
./WEB-INF/lib/hibernate-commons-annotations-3.1.0.GA.jar
./WEB-INF/lib/hibernate-core-3.3.2.GA.jar
./WEB-INF/lib/javassist-3.7.ga.jar
./WEB-INF/lib/jstl-1.1.2.jar
./WEB-INF/lib/jta-1.1.jar
./WEB-INF/lib/junit-4.8.1.jar
./WEB-INF/lib/log4j-1.2.14.jar
./WEB-INF/lib/mysql-connector-java-5.1.14.jar
./WEB-INF/lib/persistence-api-1.0.jar
./WEB-INF/lib/slf4j-api-1.6.1.jar
./WEB-INF/lib/slf4j-log4j12-1.6.1.jar
./WEB-INF/lib/spring-aop-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-asm-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-beans-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-context-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-context-support-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-core-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-expression-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-jdbc-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-orm-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-tx-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-web-3.0.5.RELEASE.jar
./WEB-INF/lib/spring-webmvc-3.0.5.RELEASE.jar
./WEB-INF/lib/standard-1.1.2.jar
./WEB-INF/lib/xml-apis-1.0.b2.jar
./WEB-INF/spring-servlet.xml
./WEB-INF/spring.properties
./WEB-INF/web.xml

So, are you excited about Java EE 6? Want to get started now? Here are some resources:

Java EE 6 SDK (including runtime, samples, tutorials, etc.)
GlassFish Server Open Source Edition 3.1.2 (Community)
Oracle GlassFish Server 3.1.2 (Commercial)
Java EE 6 using WebLogic 12c and NetBeans (Video)
Java EE 6 with NetBeans and GlassFish (Video)
Java EE with Eclipse and GlassFish (Video)
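One closing aside on the Java EE 6 file listing above: Person.class is the JPA entity, Person_.class is its compile-time static metamodel (generated for type-safe Criteria queries), and AbstractFacade/PersonFacade are thin @Stateless wrappers around the EntityManager. A simplified sketch of roughly what such an entity boils down to is shown below; the field names are assumptions for illustration, and the real NetBeans-generated class carries additional fields, Bean Validation annotations, and named queries.

import java.io.Serializable;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;

// Maps to the single "person" table used by the CRUD sample; no hibernate.cfg.xml
// or mapping files are needed - only persistence.xml in the WAR.
@Entity
@Table(name = "person")
public class Person implements Serializable {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}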

    Read the article

  • Cannot update, apt-get cannot fetch index files

    - by Evan
    I have a fresh install of Ubuntu 11.10 from the iso 'ubuntu-11.10-desktop-amd64.iso'. I installed this in VMWare Fusion 4.1.1 running on OSX 10.7.3. When setting up the VM, I allowed easy install to take care of creating my user and installing VMWare tools. No problems during installation, everything seems to be working great. The problem is that apt-get will NOT update, so I can't do software updates or install any software with apt-get install. I have been searching high and low, and have found several threads covering similar issues. How to fix a ruined package catalog? is one, Update manager generates 404 error while attempting update. Will not update is another, Ubuntu 11.10 Update issue (failed to fetch...) is a third I have tried changing my software source download location to "Main Server" rather than "Server for United States", to no avail. Same errors. Tried sudo apt-get clean, sudo apt-get autoclean, Have done a sudo rm /var/lib/apt/lists/*, still having the exact same problem. As I said, this is a brand new installation as of yesterday evening. Since I know it will be needed, here is my output from a sudo apt-get update: evan@ubuntu:~$ sudo apt-get update Ign http://archive.ubuntu.com oneiric InRelease Ign http://archive.ubuntu.com oneiric-updates InRelease Ign http://archive.ubuntu.com oneiric-backports InRelease Ign http://archive.ubuntu.com oneiric-security InRelease Ign http://archive.ubuntu.com oneiric Release.gpg Ign http://archive.ubuntu.com oneiric-updates Release.gpg Ign http://archive.ubuntu.com oneiric-backports Release.gpg Ign http://archive.ubuntu.com oneiric-security Release.gpg Ign http://archive.ubuntu.com oneiric Release Ign http://archive.ubuntu.com oneiric-updates Release Ign http://archive.ubuntu.com oneiric-backports Release Ign http://archive.ubuntu.com oneiric-security Release Ign http://archive.ubuntu.com oneiric/main TranslationIndex Ign http://archive.ubuntu.com oneiric/multiverse TranslationIndex Ign http://archive.ubuntu.com oneiric/restricted TranslationIndex Ign http://archive.ubuntu.com oneiric/universe TranslationIndex Ign http://archive.ubuntu.com oneiric-updates/main TranslationIndex Ign http://archive.ubuntu.com oneiric-updates/multiverse TranslationIndex Ign http://archive.ubuntu.com oneiric-updates/restricted TranslationIndex Ign http://archive.ubuntu.com oneiric-updates/universe TranslationIndex Ign http://archive.ubuntu.com oneiric-backports/main TranslationIndex Ign http://archive.ubuntu.com oneiric-backports/multiverse TranslationIndex Ign http://archive.ubuntu.com oneiric-backports/restricted TranslationIndex Ign http://archive.ubuntu.com oneiric-backports/universe TranslationIndex Ign http://archive.ubuntu.com oneiric-security/main TranslationIndex Ign http://archive.ubuntu.com oneiric-security/multiverse TranslationIndex Ign http://archive.ubuntu.com oneiric-security/restricted TranslationIndex Ign http://archive.ubuntu.com oneiric-security/universe TranslationIndex Err http://archive.ubuntu.com oneiric/main Sources 404 Not Found Err http://archive.ubuntu.com oneiric/restricted Sources 404 Not Found Err http://archive.ubuntu.com oneiric/universe Sources 404 Not Found Err http://archive.ubuntu.com oneiric/multiverse Sources 404 Not Found Err http://archive.ubuntu.com oneiric/main amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric/restricted amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric/universe amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric/multiverse amd64 Packages 404 Not 
Found Err http://archive.ubuntu.com oneiric/main i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric/restricted i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric/universe i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric/multiverse i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-updates/main Sources 404 Not Found Err http://archive.ubuntu.com oneiric-updates/restricted Sources 404 Not Found Err http://archive.ubuntu.com oneiric-updates/universe Sources 404 Not Found Err http://archive.ubuntu.com oneiric-updates/multiverse Sources 404 Not Found Err http://archive.ubuntu.com oneiric-updates/main amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-updates/restricted amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-updates/universe amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-updates/multiverse amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-updates/main i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-updates/restricted i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-updates/universe i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-updates/multiverse i386 Packages 404 Not Found Ign http://extras.ubuntu.com oneiric InRelease Err http://archive.ubuntu.com oneiric-backports/main Sources 404 Not Found Err http://archive.ubuntu.com oneiric-backports/restricted Sources 404 Not Found Err http://archive.ubuntu.com oneiric-backports/universe Sources 404 Not Found Err http://archive.ubuntu.com oneiric-backports/multiverse Sources 404 Not Found Err http://archive.ubuntu.com oneiric-backports/main amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-backports/restricted amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-backports/universe amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-backports/multiverse amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-backports/main i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-backports/restricted i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-backports/universe i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-backports/multiverse i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-security/main Sources 404 Not Found Err http://archive.ubuntu.com oneiric-security/restricted Sources 404 Not Found Err http://archive.ubuntu.com oneiric-security/universe Sources 404 Not Found Err http://archive.ubuntu.com oneiric-security/multiverse Sources 404 Not Found Err http://archive.ubuntu.com oneiric-security/main amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-security/restricted amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-security/universe amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-security/multiverse amd64 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-security/main i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-security/restricted i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-security/universe i386 Packages 404 Not Found Err http://archive.ubuntu.com oneiric-security/multiverse i386 Packages 404 Not Found Ign http://archive.ubuntu.com oneiric/main Translation-en_US Ign http://archive.ubuntu.com oneiric/main Translation-en Ign http://archive.ubuntu.com oneiric/multiverse Translation-en_US Ign http://archive.ubuntu.com oneiric/multiverse 
Translation-en Ign http://archive.ubuntu.com oneiric/restricted Translation-en_US Ign http://archive.ubuntu.com oneiric/restricted Translation-en Ign http://archive.ubuntu.com oneiric/universe Translation-en_US Ign http://archive.ubuntu.com oneiric/universe Translation-en Ign http://archive.ubuntu.com oneiric-updates/main Translation-en_US Ign http://archive.ubuntu.com oneiric-updates/main Translation-en Ign http://archive.ubuntu.com oneiric-updates/multiverse Translation-en_US Ign http://archive.ubuntu.com oneiric-updates/multiverse Translation-en Ign http://archive.ubuntu.com oneiric-updates/restricted Translation-en_US Ign http://archive.ubuntu.com oneiric-updates/restricted Translation-en Ign http://archive.ubuntu.com oneiric-updates/universe Translation-en_US Ign http://archive.ubuntu.com oneiric-updates/universe Translation-en Ign http://archive.ubuntu.com oneiric-backports/main Translation-en_US Ign http://archive.ubuntu.com oneiric-backports/main Translation-en Ign http://archive.ubuntu.com oneiric-backports/multiverse Translation-en_US Ign http://archive.ubuntu.com oneiric-backports/multiverse Translation-en Ign http://archive.ubuntu.com oneiric-backports/restricted Translation-en_US Ign http://archive.ubuntu.com oneiric-backports/restricted Translation-en Ign http://archive.ubuntu.com oneiric-backports/universe Translation-en_US Ign http://archive.ubuntu.com oneiric-backports/universe Translation-en Ign http://archive.ubuntu.com oneiric-security/main Translation-en_US Ign http://archive.ubuntu.com oneiric-security/main Translation-en Ign http://archive.ubuntu.com oneiric-security/multiverse Translation-en_US Ign http://archive.ubuntu.com oneiric-security/multiverse Translation-en Ign http://archive.ubuntu.com oneiric-security/restricted Translation-en_US Ign http://archive.ubuntu.com oneiric-security/restricted Translation-en Ign http://archive.ubuntu.com oneiric-security/universe Translation-en_US Ign http://archive.ubuntu.com oneiric-security/universe Translation-en Hit http://extras.ubuntu.com oneiric Release.gpg Hit http://extras.ubuntu.com oneiric Release Hit http://extras.ubuntu.com oneiric/main Sources Hit http://extras.ubuntu.com oneiric/main amd64 Packages Hit http://extras.ubuntu.com oneiric/main i386 Packages Ign http://extras.ubuntu.com oneiric/main TranslationIndex Ign http://extras.ubuntu.com oneiric/main Translation-en_US Ign http://extras.ubuntu.com oneiric/main Translation-en W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/main/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/restricted/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/universe/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/multiverse/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/main/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/restricted/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/universe/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/multiverse/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/main/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/restricted/binary-i386/Packages 404 Not Found W: Failed to fetch 
http://archive.ubuntu.com/ubuntu/dists/oneiric/universe/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric/multiverse/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/main/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/restricted/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/universe/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/multiverse/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/main/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/restricted/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/universe/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/multiverse/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/main/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/restricted/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/universe/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-updates/multiverse/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/main/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/restricted/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/universe/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/multiverse/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/main/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/restricted/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/universe/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/multiverse/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/main/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/restricted/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/universe/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-backports/multiverse/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/main/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/restricted/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/universe/source/Sources 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/multiverse/source/Sources 404 Not Found W: Failed to fetch 
http://archive.ubuntu.com/ubuntu/dists/oneiric-security/main/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/restricted/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/universe/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/multiverse/binary-amd64/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/main/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/restricted/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/universe/binary-i386/Packages 404 Not Found W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/oneiric-security/multiverse/binary-i386/Packages 404 Not Found E: Some index files failed to download. They have been ignored, or old ones used instead. Here is my /etc/apt/source.list: evan@ubuntu:~$ cat /etc/apt/sources.list # deb cdrom:[Ubuntu 11.10 _Oneiric Ocelot_ - Release amd64 (20111012)]/ dists/oneiric/main/binary-i386/ # deb cdrom:[Ubuntu 11.10 _Oneiric Ocelot_ - Release amd64 (20111012)]/ oneiric main restricted # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to # newer versions of the distribution. deb http://archive.ubuntu.com/ubuntu oneiric main restricted deb-src http://archive.ubuntu.com/ubuntu oneiric main restricted ## Major bug fix updates produced after the final release of the ## distribution. deb http://archive.ubuntu.com/ubuntu oneiric-updates main restricted deb-src http://archive.ubuntu.com/ubuntu oneiric-updates main restricted ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team. Also, please note that software in universe WILL NOT receive any ## review or updates from the Ubuntu security team. deb http://archive.ubuntu.com/ubuntu oneiric universe deb-src http://archive.ubuntu.com/ubuntu oneiric universe deb http://archive.ubuntu.com/ubuntu oneiric-updates universe deb-src http://archive.ubuntu.com/ubuntu oneiric-updates universe ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team, and may not be under a free licence. Please satisfy yourself as to ## your rights to use the software. Also, please note that software in ## multiverse WILL NOT receive any review or updates from the Ubuntu ## security team. deb http://archive.ubuntu.com/ubuntu oneiric multiverse deb-src http://archive.ubuntu.com/ubuntu oneiric multiverse deb http://archive.ubuntu.com/ubuntu oneiric-updates multiverse deb-src http://archive.ubuntu.com/ubuntu oneiric-updates multiverse ## N.B. software from this repository may not have been tested as ## extensively as that contained in the main release, although it includes ## newer versions of some applications which may provide useful features. ## Also, please note that software in backports WILL NOT receive any review ## or updates from the Ubuntu security team. 
deb http://archive.ubuntu.com/ubuntu oneiric-backports main restricted universe multiverse deb-src http://archive.ubuntu.com/ubuntu oneiric-backports main restricted universe multiverse deb http://archive.ubuntu.com/ubuntu oneiric-security main restricted deb-src http://archive.ubuntu.com/ubuntu oneiric-security main restricted deb http://archive.ubuntu.com/ubuntu oneiric-security universe deb-src http://archive.ubuntu.com/ubuntu oneiric-security universe deb http://archive.ubuntu.com/ubuntu oneiric-security multiverse deb-src http://archive.ubuntu.com/ubuntu oneiric-security multiverse ## Uncomment the following two lines to add software from Canonical's ## 'partner' repository. ## This software is not part of Ubuntu, but is offered by Canonical and the ## respective vendors as a service to Ubuntu users. # deb http://archive.canonical.com/ubuntu oneiric partner # deb-src http://archive.canonical.com/ubuntu oneiric partner ## This software is not part of Ubuntu, but is offered by third-party ## developers who want to ship their latest software. deb http://extras.ubuntu.com/ubuntu oneiric main deb-src http://extras.ubuntu.com/ubuntu oneiric main And here is my output from lsb_release -a: evan@ubuntu:~$ lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 11.10 Release: 11.10 Codename: oneiric If anyone could help me out here, that would be wonderful!

    Read the article

  • Using an alternate JSON Serializer in ASP.NET Web API

    - by Rick Strahl
The new ASP.NET Web API that Microsoft released alongside MVC 4.0 Beta last week is a great framework for building REST and AJAX APIs. I've been working with it for quite a while now and I really like the way it works and the complete set of features it provides 'in the box'. It's about time that Microsoft got a decent API for building generic HTTP endpoints into the framework.

DataContractJsonSerializer sucks

As nice as Web API's overall design is, one thing still sucks: the built-in JSON serialization uses the DataContractJsonSerializer, which is just too limiting for many scenarios. The biggest issues I have with it are:

No support for untyped values (object, dynamic, anonymous types)
MS AJAX style date formatting
Ugly serialization formats for types like Dictionaries

To me the most serious issue is dealing with serialization of untyped objects. I have a number of applications with AJAX front ends that dynamically reformat data from business objects to fit a specific message format that certain UI components require. The most common scenario there is IEnumerable query results from a database, with fields from the result set rearranged to fit the sometimes unconventional formats required by the UI components (like jqGrid, for example). Creating custom types to fit these messages seems like overkill, and projections using LINQ make this much easier to code up. Alas, DataContractJsonSerializer doesn't support it. Neither does DataContractSerializer for XML output, for that matter. What this means is that you can't do stuff like this in Web API out of the box:

public object GetAnonymousType()
{
    return new { name = "Rick", company = "West Wind", entered = DateTime.Now };
}

Basically, anything that doesn't have an explicit type DataContractJsonSerializer will not let you return. FWIW, the same is true for XmlSerializer, which also doesn't work with non-typed values for serialization. The example above is obviously contrived with a hardcoded object graph, but it's not uncommon to get dynamic values returned from queries that use anonymous types for their result projections.

Apparently there's a good possibility that Microsoft will ship Json.NET as part of the Web API RTM release. Scott Hanselman confirmed this as a footnote in his JSON Dates post a few days ago. I've heard several other people from Microsoft confirm that Json.NET will be included and be the default JSON serializer, but there are no details yet on what capacity it will show up in. Let's hope it ends up as the default in the box. Meanwhile this post will show you how you can use it today with the beta and get JSON that matches what you should see in the RTM version.

What about JsonValue?

To be fair, Web API DOES include the new JsonValue/JsonObject/JsonArray types that allow you to address some of these scenarios. JsonValue is a new type in the System.Json assembly that can be used to build up an object graph based on a dictionary. It's actually a really cool implementation of a dynamic type that allows you to create an object graph and spit it out to JSON without having to create a .NET type first. JsonValue can also receive a JSON string and parse it without having to actually load it into a .NET type (which is something that's been missing in the core framework). This is really useful if you get a JSON result from an arbitrary service and you don't want to explicitly create a mapping type for the data returned.
For serialization you can create an object structure on the fly and pass it back as part of a Web API action method like this:

public JsonValue GetJsonValue()
{
    dynamic json = new JsonObject();
    json.name = "Rick";
    json.company = "West Wind";
    json.entered = DateTime.Now;

    dynamic address = new JsonObject();
    address.street = "32 Kaiea";
    address.zip = "96779";
    json.address = address;

    dynamic phones = new JsonArray();
    json.phoneNumbers = phones;

    dynamic phone = new JsonObject();
    phone.type = "Home";
    phone.number = "808 123-1233";
    phones.Add(phone);

    phone = new JsonObject();
    phone.type = "Mobile";
    phone.number = "808 123-1234";
    phones.Add(phone);

    //var jsonString = json.ToString();
    return json;
}

which produces the following output (formatted here for easier reading):

{
    name: "Rick",
    company: "West Wind",
    entered: "2012-03-08T15:33:19.673-10:00",
    address: { street: "32 Kaiea", zip: "96779" },
    phoneNumbers: [
        { type: "Home", number: "808 123-1233" },
        { type: "Mobile", number: "808 123-1234" }
    ]
}

If you need to build a simple JSON type on the fly, these types work great. But if you have an existing type - or worse, a query result/list that's already formatted - JsonValue et al. become a pain to work with. As far as I can see, there's no way to just throw an object instance at JsonValue and have it convert into a JsonValue dictionary. It's a manual process.

Using alternate Serializers in Web API

So, currently the default serializer in Web API is DataContractJsonSerializer, and I don't like it. You may not either, but luckily you can swap the serializer fairly easily. If you'd rather use the JavaScriptSerializer built into System.Web.Extensions or Json.NET today, it's not too difficult to create a custom MediaTypeFormatter that uses these serializers and can replace or partially replace the native serializer.
Here's a MediaTypeFormatter implementation using the ASP.NET JavaScriptSerializer:

using System;
using System.Net.Http.Formatting;
using System.Threading.Tasks;
using System.Web.Script.Serialization;
using System.Json;
using System.IO;

namespace Westwind.Web.WebApi
{
    public class JavaScriptSerializerFormatter : MediaTypeFormatter
    {
        public JavaScriptSerializerFormatter()
        {
            SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("application/json"));
        }

        protected override bool CanWriteType(Type type)
        {
            // don't serialize JsonValue structures - use the default formatter for those
            if (type == typeof(JsonValue) || type == typeof(JsonObject) || type == typeof(JsonArray))
                return false;
            return true;
        }

        protected override bool CanReadType(Type type)
        {
            if (type == typeof(IKeyValueModel))
                return false;
            return true;
        }

        protected override System.Threading.Tasks.Task<object> OnReadFromStreamAsync(Type type,
            System.IO.Stream stream, System.Net.Http.Headers.HttpContentHeaders contentHeaders,
            FormatterContext formatterContext)
        {
            var task = Task<object>.Factory.StartNew(() =>
            {
                var ser = new JavaScriptSerializer();
                string json;
                using (var sr = new StreamReader(stream))
                {
                    json = sr.ReadToEnd();
                    sr.Close();
                }
                object val = ser.Deserialize(json, type);
                return val;
            });
            return task;
        }

        protected override System.Threading.Tasks.Task OnWriteToStreamAsync(Type type, object value,
            System.IO.Stream stream, System.Net.Http.Headers.HttpContentHeaders contentHeaders,
            FormatterContext formatterContext, System.Net.TransportContext transportContext)
        {
            var task = Task.Factory.StartNew(() =>
            {
                var ser = new JavaScriptSerializer();
                var json = ser.Serialize(value);
                byte[] buf = System.Text.Encoding.Default.GetBytes(json);
                stream.Write(buf, 0, buf.Length);
                stream.Flush();
            });
            return task;
        }
    }
}

The formatter implementation is pretty simple: you override four methods to say which types you can handle, and then handle the input or output streams to create/parse the JSON data. Note that when creating output you want to take care to still let the JsonValue/JsonObject/JsonArray types be handled by the default serializer so those objects serialize properly - if you let either JavaScriptSerializer or JSON.NET handle them, they'd try to render the underlying dictionaries, which is very undesirable.
If you'd rather use Json.NET, here's the JSON.NET version of the formatter:

// this code requires a reference to JSON.NET in your project
#if true
using System;
using System.Net.Http.Formatting;
using System.Threading.Tasks;
using System.Web.Script.Serialization;
using System.Json;
using Newtonsoft.Json;
using System.IO;
using Newtonsoft.Json.Converters;

namespace Westwind.Web.WebApi
{
    public class JsonNetFormatter : MediaTypeFormatter
    {
        public JsonNetFormatter()
        {
            SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("application/json"));
        }

        protected override bool CanWriteType(Type type)
        {
            // don't serialize JsonValue structures - use the default formatter for those
            if (type == typeof(JsonValue) || type == typeof(JsonObject) || type == typeof(JsonArray))
                return false;
            return true;
        }

        protected override bool CanReadType(Type type)
        {
            if (type == typeof(IKeyValueModel))
                return false;
            return true;
        }

        protected override System.Threading.Tasks.Task<object> OnReadFromStreamAsync(Type type,
            System.IO.Stream stream, System.Net.Http.Headers.HttpContentHeaders contentHeaders,
            FormatterContext formatterContext)
        {
            var task = Task<object>.Factory.StartNew(() =>
            {
                var settings = new JsonSerializerSettings()
                {
                    NullValueHandling = NullValueHandling.Ignore,
                };
                var sr = new StreamReader(stream);
                var jreader = new JsonTextReader(sr);
                var ser = new JsonSerializer();
                ser.Converters.Add(new IsoDateTimeConverter());
                object val = ser.Deserialize(jreader, type);
                return val;
            });
            return task;
        }

        protected override System.Threading.Tasks.Task OnWriteToStreamAsync(Type type, object value,
            System.IO.Stream stream, System.Net.Http.Headers.HttpContentHeaders contentHeaders,
            FormatterContext formatterContext, System.Net.TransportContext transportContext)
        {
            var task = Task.Factory.StartNew(() =>
            {
                var settings = new JsonSerializerSettings()
                {
                    NullValueHandling = NullValueHandling.Ignore,
                };
                string json = JsonConvert.SerializeObject(value, Formatting.Indented,
                                                          new JsonConverter[1] { new IsoDateTimeConverter() });
                byte[] buf = System.Text.Encoding.Default.GetBytes(json);
                stream.Write(buf, 0, buf.Length);
                stream.Flush();
            });
            return task;
        }
    }
}
#endif

One advantage of the Json.NET serializer is that you can specify a few options on how things are formatted and handled. You get null value handling, and you can plug in the IsoDateTimeConverter, which is nice for producing the proper ISO dates I would expect any JSON serializer to output these days.

Hooking up the Formatters

Once you've created the custom formatters, you need to enable them for your Web API application. To do this, use the GlobalConfiguration.Configuration object and add the formatter to the Formatters collection. Here's what this looks like hooked up from Application_Start in a Web project:

protected void Application_Start(object sender, EventArgs e)
{
    // Action based routing (used for RPC calls)
    RouteTable.Routes.MapHttpRoute(
        name: "StockApi",
        routeTemplate: "stocks/{action}/{symbol}",
        defaults: new { symbol = RouteParameter.Optional, controller = "StockApi" }
    );

    // WebApi Configuration to hook up formatters and message handlers (optional)
    RegisterApis(GlobalConfiguration.Configuration);
}

public static void RegisterApis(HttpConfiguration config)
{
    // Add JavaScriptSerializer formatter instead - add at top to make default
    //config.Formatters.Insert(0, new JavaScriptSerializerFormatter());

    // Add Json.net formatter - add at the top so it fires first!
    // This leaves the old formatter in place so JsonValue/JsonObject/JsonArray are still handled by it
    config.Formatters.Insert(0, new JsonNetFormatter());
}

One thing to remember here is the GlobalConfiguration object, which is Web API's static configuration instance. I think this thing is seriously misnamed, given that GlobalConfiguration could stand for anything and so is hard to discover if you don't know what you're looking for. How about WebApiConfiguration or something more descriptive? Anyway, once you know what it is, you can use the Formatters collection to insert your custom formatter. Note that I insert my formatter at the top of the list so it takes precedence over the default formatter. I also am not removing the old formatter, because I still want JsonValue/JsonObject/JsonArray to be handled by the default serialization mechanism. Since the formatters process in sequence and I exclude processing for those types, JsonValue et al. still get properly serialized/deserialized.

Summary

Currently DataContractJsonSerializer in Web API is a pain, but at least we have the ability, with relatively limited effort, to replace the MediaTypeFormatter and plug in our own JSON serializer. This is useful for many scenarios - if you have existing client applications that used MVC JsonResult or ASP.NET AJAX results from ASMX AJAX services, you can plug in the JavaScriptSerializer and get exactly the same serializer you used in the past, so your results will be the same and you don't potentially break clients. JSON serializers do vary a bit in how they serialize some of the more complex types (like Dictionaries and dates, for example), so if you're migrating it might be helpful to ensure your client code doesn't break when you switch to ASP.NET Web API.

Going forward it looks like Microsoft is planning on plugging Json.NET into Web API and making that the default. I think that's an awesome choice, since Json.NET has been around forever, is fast and easy to use, and provides a ton of functionality as part of this great library. I just wish Microsoft had figured this out sooner instead of integrating it now at the last minute, especially given that Json.NET has a similar set of lower-level JSON objects (JsonValue/JsonObject etc.) which will now end up being duplicated by the native System.Json stuff. It's not like we don't already have enough confusion regarding which JSON serializer to use (JavaScriptSerializer, DataContractJsonSerializer, JsonValue/JsonObject/JsonArray, and now Json.NET). For years I've been using my own JSON serializer because the built-in choices are both limited. However, with an official endorsement of Json.NET I'm happily moving on to use that in my applications. Let's see and hope Microsoft gets this right before ASP.NET Web API goes gold.

© Rick Strahl, West Wind Technologies, 2005-2012
Posted in Web Api  AJAX  ASP.NET

    Read the article

  • WCF WS-Security and WSE Nonce Authentication

    - by Rick Strahl
WCF makes it fairly easy to access WS-* web services, except when you run into a service format that it doesn't support. Even then WCF provides a huge amount of flexibility to make the service clients work; however, the proper interfaces to make that happen are not easy to discover and are for the most part undocumented, unless you're lucky enough to run into a blog, forum or StackOverflow post on the matter. This is definitely true for the Password Nonce as part of the WS-Security/WSE protocol, which is not natively supported in WCF. Specifically, I had a need to create a WCF message on the client that includes a WS-Security header that looks like this (from the service's spec document):

<soapenv:Header>
  <wsse:Security soapenv:mustUnderstand="1" xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
    <wsse:UsernameToken wsu:Id="UsernameToken-8" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
      <wsse:Username>TeStUsErNaMe1</wsse:Username>
      <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">TeStPaSsWoRd1</wsse:Password>
      <wsse:Nonce EncodingType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary">f8nUe3YupTU5ISdCy3X9Gg==</wsse:Nonce>
      <wsu:Created>2011-05-04T19:01:40.981Z</wsu:Created>
    </wsse:UsernameToken>
  </wsse:Security>
</soapenv:Header>

Specifically, the Nonce and Created keys are what WCF doesn't create or have built-in formatting for.

Why is there a nonce?

My first thought here was WTF? The username and password are there in clear text, so what does the Nonce accomplish? The Nonce and Created keys are part of the WSE Security specification and are meant to allow the server to detect and prevent replay attacks. The hashed nonce should be unique per request; the server can store it and check for it before running another request, thus ensuring that a request is not replayed with exactly the same values.

Basic Service Reference Import - not much luck

The first thing I did was to simply import the service as a Service Reference. The Add Service Reference import automatically detects that WS-Security is required and appropriately adds the WS-Security settings to the basicHttpBinding in the config file:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="RealTimeOnlineSoapBinding">
          <security mode="Transport" />
        </binding>
        <binding name="RealTimeOnlineSoapBinding1" />
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint address="https://notarealurl.com:443/services/RealTimeOnline"
                binding="basicHttpBinding"
                bindingConfiguration="RealTimeOnlineSoapBinding"
                contract="RealTimeOnline.RealTimeOnline"
                name="RealTimeOnline" />
    </client>
  </system.serviceModel>
</configuration>

If I run this as-is, using code like this:

var client = new RealTimeOnlineClient();
client.ClientCredentials.UserName.UserName = "TheUsername";
client.ClientCredentials.UserName.Password = "ThePassword";

… I get nothing in terms of WS-Security headers. The request is sent, but the binding expects transport level security to be applied, rather than message level security.
To fix this so that a WS-Security message header is sent the security mode can be changed to: <security mode="TransportWithMessageCredential" /> Now if I re-run I at least get a WS-Security header which looks like this:<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" xmlns:u="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd"> <s:Header> <o:Security s:mustUnderstand="1" xmlns:o="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd"> <u:Timestamp u:Id="_0"> <u:Created>2012-11-24T02:55:18.011Z</u:Created> <u:Expires>2012-11-24T03:00:18.011Z</u:Expires> </u:Timestamp> <o:UsernameToken u:Id="uuid-18c215d4-1106-40a5-8dd1-c81fdddf19d3-1"> <o:Username>TheUserName</o:Username> <o:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText" >ThePassword</o:Password> </o:UsernameToken> </o:Security> </s:Header> Closer! Now the WS-Security header is there along with a timestamp field (which might not be accepted by some WS-Security expecting services), but there's no Nonce or created timestamp as required by my original service. Using a CustomBinding instead My next try was to go with a CustomBinding instead of basicHttpBinding as it allows a bit more control over the protocol and transport configurations for the binding. Specifically I can explicitly specify the message protocol(s) used. Using configuration file settings here's what the config file looks like:<?xml version="1.0"?> <configuration> <system.serviceModel> <bindings> <customBinding> <binding name="CustomSoapBinding"> <security includeTimestamp="false" authenticationMode="UserNameOverTransport" defaultAlgorithmSuite="Basic256" requireDerivedKeys="false" messageSecurityVersion="WSSecurity10WSTrustFebruary2005WSSecureConversationFebruary2005WSSecurityPolicy11BasicSecurityProfile10"> </security> <textMessageEncoding messageVersion="Soap11"></textMessageEncoding> <httpsTransport maxReceivedMessageSize="2000000000"/> </binding> </customBinding> </bindings> <client> <endpoint address="https://notrealurl.com:443/services/RealTimeOnline" binding="customBinding" bindingConfiguration="CustomSoapBinding" contract="RealTimeOnline.RealTimeOnline" name="RealTimeOnline" /> </client> </system.serviceModel> <startup> <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0"/> </startup> </configuration> This ends up creating a cleaner header that's missing the timestamp field which can cause some services problems. The WS-Security header output generated with the above looks like this:<s:Header> <o:Security s:mustUnderstand="1" xmlns:o="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd"> <o:UsernameToken u:Id="uuid-291622ca-4c11-460f-9886-ac1c78813b24-1"> <o:Username>TheUsername</o:Username> <o:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText" >ThePassword</o:Password> </o:UsernameToken> </o:Security> </s:Header> This is closer as it includes only the username and password. The key here is the protocol for WS-Security:messageSecurityVersion="WSSecurity10WSTrustFebruary2005WSSecureConversationFebruary2005WSSecurityPolicy11BasicSecurityProfile10" which explicitly specifies the protocol version. There are several variants of this specification but none of them seem to support the nonce unfortunately. 
This protocol does allow the Nonce and Created timestamp to be omitted (which effectively makes those keys optional). Some services I tried that requested a Nonce actually worked with just this protocol where the default basicHttpBinding failed to connect, so this is a possible solution for accessing some services. Unfortunately, for my target service that was not an option - the nonce has to be there.

Creating Custom ClientCredentials

As it turns out, WCF doesn't have support for the digest Nonce as part of WS-Security, and as far as I can tell there's no way to do it just with configuration settings. I did a bunch of research trying to find workarounds, and I did find a couple of entries on StackOverflow as well as on the MSDN forums. However, none of these are particularly clear, and I ended up using bits and pieces of several of them to arrive at a working solution:

http://stackoverflow.com/questions/896901/wcf-adding-nonce-to-usernametoken
http://social.msdn.microsoft.com/Forums/en-US/wcf/thread/4df3354f-0627-42d9-b5fb-6e880b60f8ee

The latter forum message is the more useful of the two (the last message on the thread in particular), and it has most of the information required to make this work. But it took some experimentation for me to get this right, so I'll recount the process here a bit more comprehensively. In order for this to work, a number of classes have to be subclassed:

ClientCredentials
ClientCredentialsSecurityTokenManager
WSSecurityTokenSerializer

The idea is that we need to create a custom ClientCredentials class to hold the custom properties so they can be set from the UI or via configuration settings. The token manager and token serializer are mainly required to allow the custom credentials class to flow through the WCF pipeline and eventually provide the custom serialization.
Here are the three classes required and their full implementations:

// namespaces used by the credential classes below
using System;
using System.IdentityModel.Tokens;
using System.Security.Cryptography;
using System.ServiceModel;
using System.ServiceModel.Description;
using System.ServiceModel.Security;
using System.Text;

public class CustomCredentials : ClientCredentials
{
    public CustomCredentials()
    { }

    protected CustomCredentials(CustomCredentials cc)
        : base(cc)
    { }

    public override System.IdentityModel.Selectors.SecurityTokenManager CreateSecurityTokenManager()
    {
        return new CustomSecurityTokenManager(this);
    }

    protected override ClientCredentials CloneCore()
    {
        return new CustomCredentials(this);
    }
}

public class CustomSecurityTokenManager : ClientCredentialsSecurityTokenManager
{
    public CustomSecurityTokenManager(CustomCredentials cred)
        : base(cred)
    { }

    public override System.IdentityModel.Selectors.SecurityTokenSerializer CreateSecurityTokenSerializer(System.IdentityModel.Selectors.SecurityTokenVersion version)
    {
        return new CustomTokenSerializer(System.ServiceModel.Security.SecurityVersion.WSSecurity11);
    }
}

public class CustomTokenSerializer : WSSecurityTokenSerializer
{
    public CustomTokenSerializer(SecurityVersion sv)
        : base(sv)
    { }

    protected override void WriteTokenCore(System.Xml.XmlWriter writer,
                                            System.IdentityModel.Tokens.SecurityToken token)
    {
        UserNameSecurityToken userToken = token as UserNameSecurityToken;

        string tokennamespace = "o";

        // the trailing Z in the format means UTC, so use UtcNow and a 24-hour (HH) format
        DateTime created = DateTime.UtcNow;
        string createdStr = created.ToString("yyyy-MM-ddTHH:mm:ss.fffZ");

        // unique Nonce value - encode with SHA-1 for 'randomness'
        // in theory the nonce could just be the GUID by itself
        string phrase = Guid.NewGuid().ToString();
        var nonce = GetSHA1String(phrase);

        // in this case password is plain text
        // for digest mode the password needs to be encoded as:
        //   PasswordAsDigest = Base64(SHA-1(Nonce + Created + Password))
        // and the Password Type needs to change to the #Digest profile URI (shown later in this post)
        //string password = GetSHA1String(nonce + createdStr + userToken.Password);
        string password = userToken.Password;

        writer.WriteRaw(string.Format(
            "<{0}:UsernameToken u:Id=\"" + token.Id + "\" xmlns:u=\"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd\">" +
            "<{0}:Username>" + userToken.UserName + "</{0}:Username>" +
            "<{0}:Password Type=\"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText\">" +
            password + "</{0}:Password>" +
            "<{0}:Nonce EncodingType=\"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary\">" +
            nonce + "</{0}:Nonce>" +
            "<u:Created>" + createdStr + "</u:Created></{0}:UsernameToken>", tokennamespace));
    }

    protected string GetSHA1String(string phrase)
    {
        SHA1CryptoServiceProvider sha1Hasher = new SHA1CryptoServiceProvider();
        byte[] hashedDataBytes = sha1Hasher.ComputeHash(Encoding.UTF8.GetBytes(phrase));
        return Convert.ToBase64String(hashedDataBytes);
    }
}

Realistically, only the CustomTokenSerializer has any significant code in it. The code there deals with actually serializing the custom credentials using low-level XML semantics, writing output into an XML writer. I can't take credit for this code - most of it comes from the MSDN forum post mentioned earlier - I made a few adjustments to simplify the nonce generation and also added some notes to allow for PasswordDigest generation.

Per spec, the nonce is nothing more than a unique value that's supposed to be 'random'. I'm thinking that this value can be any string that's unique, and a GUID on its own probably would have sufficed. Comments on other posts that GUIDs can potentially be guessed are highly exaggerated, to say the least, IMHO.
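If you'd rather not base the nonce on a GUID at all, a cryptographic random number generator works just as well. This is only a sketch of an alternative, not what the code above uses:

// alternative nonce generation (sketch): random bytes from a crypto RNG,
// Base64-encoded so the value can go straight into the Nonce element.
// This would live inside CustomTokenSerializer alongside GetSHA1String.
protected string GetRandomNonce()
{
    var rng = new RNGCryptoServiceProvider();
    var bytes = new byte[16];
    rng.GetBytes(bytes);
    return Convert.ToBase64String(bytes);
}

Either way, the only hard requirement is that the value is unique per request.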
To satisfy even that aspect though, I added the SHA-1 hash and Base64 encoding to give a more random value that would be effectively impossible to 'guess'. The original example from the forum post used another level of encoding and decoding to string in between, but that really didn't accomplish anything except extra overhead.

The header output generated from this looks like this:

<s:Header>
  <o:Security s:mustUnderstand="1" xmlns:o="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
    <o:UsernameToken u:Id="uuid-f43d8b0d-0ebb-482e-998d-f544401a3c91-1" xmlns:u="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
      <o:Username>TheUsername</o:Username>
      <o:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">ThePassword</o:Password>
      <o:Nonce EncodingType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary">PjVE24TC6HtdAnsf3U9c5WMsECY=</o:Nonce>
      <u:Created>2012-11-23T07:10:04.670Z</u:Created>
    </o:UsernameToken>
  </o:Security>
</s:Header>

which is exactly as it should be.

Password Digest?

In my case the password is passed in plain text over an SSL connection, so there's no digest required and I was done with the code above. Since I don't have a service handy that requires a password digest, I had no way of testing the digest implementation, but here is how it is likely to work.

If you need to pass a digest-encoded password, things are a little bit trickier. The password Type attribute needs to change to:

http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#Digest

and the password value needs to be encoded. The format for password digest encoding is:

Base64(SHA-1(Nonce + Created + Password))

and it can be handled in the code above with this line (it's commented out in the snippet above):

string password = GetSHA1String(nonce + createdStr + userToken.Password);

(One caveat: the UsernameToken profile defines the digest over the raw nonce bytes rather than their Base64 text, so if a service rejects the digest, that's the first thing I'd check.) The entire WriteTokenCore method for the digest version looks like this:

protected override void WriteTokenCore(System.Xml.XmlWriter writer,
                                        System.IdentityModel.Tokens.SecurityToken token)
{
    UserNameSecurityToken userToken = token as UserNameSecurityToken;

    string tokennamespace = "o";

    // UTC timestamp in 24-hour format
    DateTime created = DateTime.UtcNow;
    string createdStr = created.ToString("yyyy-MM-ddTHH:mm:ss.fffZ");

    // unique Nonce value - encode with SHA-1 for 'randomness'
    // in theory the nonce could just be the GUID by itself
    string phrase = Guid.NewGuid().ToString();
    var nonce = GetSHA1String(phrase);

    string password = GetSHA1String(nonce + createdStr + userToken.Password);

    writer.WriteRaw(string.Format(
        "<{0}:UsernameToken u:Id=\"" + token.Id + "\" xmlns:u=\"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd\">" +
        "<{0}:Username>" + userToken.UserName + "</{0}:Username>" +
        "<{0}:Password Type=\"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#Digest\">" +
        password + "</{0}:Password>" +
        "<{0}:Nonce EncodingType=\"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary\">" +
        nonce + "</{0}:Nonce>" +
        "<u:Created>" + createdStr + "</u:Created></{0}:UsernameToken>", tokennamespace));
}

I had no service to connect to in order to try out Digest auth - if you end up needing it and get it to work, please drop a comment…

How to use the custom Credentials

The easiest way to use the custom credentials is to create the client in code.
Here's a factory method I use to create an instance of my service client:

public static RealTimeOnlineClient CreateRealTimeOnlineProxy(string url,
                                                             string username,
                                                             string password)
{
    if (string.IsNullOrEmpty(url))
        url = "https://notrealurl.com:443/cows/services/RealTimeOnline";

    CustomBinding binding = new CustomBinding();

    var security = TransportSecurityBindingElement.CreateUserNameOverTransportBindingElement();
    security.IncludeTimestamp = false;
    security.DefaultAlgorithmSuite = SecurityAlgorithmSuite.Basic256;
    security.MessageSecurityVersion = MessageSecurityVersion.WSSecurity10WSTrustFebruary2005WSSecureConversationFebruary2005WSSecurityPolicy11BasicSecurityProfile10;

    var encoding = new TextMessageEncodingBindingElement();
    encoding.MessageVersion = MessageVersion.Soap11;

    var transport = new HttpsTransportBindingElement();
    transport.MaxReceivedMessageSize = 20000000; // 20 megs

    binding.Elements.Add(security);
    binding.Elements.Add(encoding);
    binding.Elements.Add(transport);

    RealTimeOnlineClient client = new RealTimeOnlineClient(binding,
        new EndpointAddress(url));

    // swap in the custom credentials so the Nonce gets generated
    // (it looks like this might not be required - the service seems to work without it)
    client.ChannelFactory.Endpoint.Behaviors.Remove<System.ServiceModel.Description.ClientCredentials>();
    client.ChannelFactory.Endpoint.Behaviors.Add(new CustomCredentials());

    client.ClientCredentials.UserName.UserName = username;
    client.ClientCredentials.UserName.Password = password;

    return client;
}

This returns a service client that's ready to call other service methods. The key item in this code is the ChannelFactory endpoint behavior modification that first removes the original ClientCredentials and then adds the new one. The ClientCredentials property on the client is read-only, and this is the way it has to be swapped out.

Summary

It's a bummer that WCF doesn't support WS-Security UsernameToken authentication with nonce values out of the box. From reading the comments in posts/articles while I was trying to find a solution, I found that this feature was omitted by design, as this protocol is considered insecure. While I agree that plain-text passwords are rarely a good idea even if they go over a secured SSL connection, as is the case here, there are unfortunately quite a few services (mostly Java services, I suspect) that use this protocol. I've run into this twice now, and from trying to find a solution online I can see that this is not an isolated problem - many others seem to have struggled with it. It seems there are about a dozen questions about this on StackOverflow, all with varying, incomplete answers. Hopefully this post provides a little more coherent content in one place.

Again I marvel at WCF and the breadth of protocol features it supports in a single tool. And even when it can't handle something, there are ways to get it working via extensibility. But at the same time I marvel at how freaking difficult it is to arrive at these solutions. I mean, there's no way I could have ever figured this out on my own. It takes somebody working on the WCF team, or at least somebody very, very intricately involved in the innards of WCF, to figure out the interconnection of the various objects to do this from scratch. Luckily this is an older problem that has been discussed extensively online, and I was able to cobble together a solution from the online content.
I'm glad it worked out that way, but it feels dirty and incomplete in that there's a whole learning path that was omitted to get here… Man, am I glad I'm not dealing with SOAP services much anymore. REST service security - even when using some sort of federation - is a piece of cake by comparison :-) I'm sure once standards bodies get involved we'll be right back in security standard hell…

© Rick Strahl, West Wind Technologies, 2005-2012
Posted in WCF, Web Services

    Read the article

  • 12.04: Apt-Get Update: failure to fetch; can't connect to any sources

    - by weberc2
    I realize there are dozens of "apt-get update: failure to fetch" questions (I read through all I could find), but my present circumstance is unique to 12.04 and it affects all sources; not just launchpad. Additionally, I've tried several different servers in Europe and the U.S. as well as the "main server" (wherever that is) and they all yield the same result: I can't connect to any software sources. Additionally, I'm fairly certain the problem stems from the upgrade from 11.10-12.04 I performed this morning, as updates worked immediately before. Updates from the Update Manager worked fine and I could download some things (mutter) from the Software Center without incident, which makes me think I can connect to some subset of the Ubuntu servers (however, several other Ubuntu servers--like extras--and some canonical servers are listed as 'unable to connect'). Here is the output from sudo apt-get update: sudo apt-get update Ign http://ftp.u-picardie.fr precise InRelease Ign http://archive.canonical.com precise InRelease Ign http://ftp.u-picardie.fr precise-updates InRelease Ign http://ftp.u-picardie.fr precise-backports InRelease Err http://ftp.u-picardie.fr precise-security InRelease Err http://ftp.u-picardie.fr precise Release.gpg Unable to connect to ftp.u-picardie.fr:http: Err http://ftp.u-picardie.fr precise-updates Release.gpg Unable to connect to ftp.u-picardie.fr:http: Err http://ftp.u-picardie.fr precise-backports Release.gpg Unable to connect to ftp.u-picardie.fr:http: Err http://ftp.u-picardie.fr precise-security Release.gpg Unable to connect to ftp.u-picardie.fr:http: Hit http://archive.canonical.com precise Release.gpg Hit http://archive.canonical.com precise Release Hit http://archive.canonical.com precise/partner i386 Packages Ign http://archive.canonical.com precise/partner TranslationIndex Ign http://dl.google.com stable InRelease Ign http://dl.google.com stable InRelease Err http://archive.canonical.com precise/partner Translation-en_US Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] Err http://archive.canonical.com precise/partner Translation-en Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] Ign http://extras.ubuntu.com precise InRelease Get:1 http://dl.google.com stable Release.gpg [198 B] Err http://extras.ubuntu.com precise Release.gpg Could not connect to extras.ubuntu.com:80 (91.189.88.33). 
- connect (111: Connection refused) Ign http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Err http://ppa.launchpad.net precise InRelease Get:2 http://dl.google.com stable Release.gpg [198 B] Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Err http://ppa.launchpad.net precise Release.gpg Unable to connect to ppa.launchpad.net:http: Get:3 http://dl.google.com stable Release [1,347 B] Get:4 http://dl.google.com stable Release [1,347 B] Get:5 http://dl.google.com stable/main i386 Packages [1,268 B] Ign http://dl.google.com stable/main TranslationIndex Get:6 http://dl.google.com stable/main i386 Packages [769 B] Ign http://dl.google.com stable/main TranslationIndex Ign http://dl.google.com stable/main Translation-en_US Ign http://dl.google.com stable/main Translation-en Ign http://dl.google.com stable/main Translation-en_US Ign http://dl.google.com stable/main Translation-en Fetched 5,127 B in 7s (673 B/s) Reading package lists... 
Done W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise-security/InRelease W: Failed to fetch http://ppa.launchpad.net/elementary-os/stable/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/elementaryart/elementary-dev/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/midori/ppa/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/nemequ/sqlheavy/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/ricotz/docky/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/sgringwe/beatbox/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/webupd8team/y-ppa-manager/ubuntu/dists/precise/InRelease W: Failed to fetch http://ppa.launchpad.net/yorba/ppa/ubuntu/dists/precise/InRelease W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise/Release.gpg Unable to connect to ftp.u-picardie.fr:http: W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise-updates/Release.gpg Unable to connect to ftp.u-picardie.fr:http: W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise-backports/Release.gpg Unable to connect to ftp.u-picardie.fr:http: W: Failed to fetch http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/dists/precise-security/Release.gpg Unable to connect to ftp.u-picardie.fr:http: W: Failed to fetch http://archive.canonical.com/ubuntu/dists/precise/partner/i18n/Translation-en_US Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] W: Failed to fetch http://archive.canonical.com/ubuntu/dists/precise/partner/i18n/Translation-en Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] W: Failed to fetch http://extras.ubuntu.com/ubuntu/dists/precise/Release.gpg Could not connect to extras.ubuntu.com:80 (91.189.88.33). - connect (111: Connection refused) W: Failed to fetch http://ppa.launchpad.net/caffeine-developers/ppa/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/elementary-os/stable/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/elementaryart/elementary-dev/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/midori/ppa/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/nemequ/sqlheavy/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/ricotz/docky/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/sgringwe/beatbox/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/webupd8team/y-ppa-manager/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Failed to fetch http://ppa.launchpad.net/yorba/ppa/ubuntu/dists/precise/Release.gpg Unable to connect to ppa.launchpad.net:http: W: Some index files failed to download. They have been ignored, or old ones used instead. 
W: Duplicate sources.list entry http://ppa.launchpad.net/nemequ/sqlheavy/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/ppa.launchpad.net_nemequ_sqlheavy_ubuntu_dists_precise_main_binary-i386_Packages) W: Duplicate sources.list entry http://ppa.launchpad.net/sgringwe/beatbox/ubuntu/ precise/main i386 Packages (/var/lib/apt/lists/ppa.launchpad.net_sgringwe_beatbox_ubuntu_dists_precise_main_binary-i386_Packages) Contents of /etc/apt/sources.list: # deb cdrom:[Ubuntu 11.10 _Oneiric Ocelot_ - Release i386 (20111012)]/ oneiric main restricted deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise main restricted #Added by software-properties # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to # newer versions of the distribution. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise main restricted deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise multiverse universe #Added by software-properties ## Major bug fix updates produced after the final release of the ## distribution. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-updates main restricted deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-updates restricted main multiverse universe #Added by software-properties ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team. Also, please note that software in universe WILL NOT receive any ## review or updates from the Ubuntu security team. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise universe deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-updates universe ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team, and may not be under a free licence. Please satisfy yourself as to ## your rights to use the software. Also, please note that software in ## multiverse WILL NOT receive any review or updates from the Ubuntu ## security team. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise multiverse deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-updates multiverse ## N.B. software from this repository may not have been tested as ## extensively as that contained in the main release, although it includes ## newer versions of some applications which may provide useful features. ## Also, please note that software in backports WILL NOT receive any review ## or updates from the Ubuntu security team. deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-backports main restricted universe multiverse deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-backports main restricted universe multiverse #Added by software-properties deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-security main restricted deb-src http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-security restricted main multiverse universe #Added by software-properties deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-security universe deb http://ftp.u-picardie.fr/mirror/ubuntu/ubuntu/ precise-security multiverse ## Uncomment the following two lines to add software from Canonical's ## 'partner' repository. ## This software is not part of Ubuntu, but is offered by Canonical and the ## respective vendors as a service to Ubuntu users. # deb http://archive.canonical.com/ubuntu oneiric partner # deb-src http://archive.canonical.com/ubuntu oneiric partner ## This software is not part of Ubuntu, but is offered by third-party ## developers who want to ship their latest software. 
deb http://extras.ubuntu.com/ubuntu precise main deb-src http://extras.ubuntu.com/ubuntu precise main Testing Alternate sources.list file These are the steps I followed to produce the following output: Please backup your sources.list: sudo cp /etc/apt/sources.list /etc/apt/sources.list.backup and then replace the contents of /etc/apt/sources.list with the below lines and run apt-get update: deb http://archive.ubuntu.com/ubuntu/ precise main restricted universe multiverse deb http://archive.ubuntu.com/ubuntu/ precise-updates main restricted universe multiverse deb http://archive.ubuntu.com/ubuntu/ precise-backports main restricted universe multiverse deb http://security.ubuntu.com/ubuntu precise-security main restricted universe multiverse deb http://archive.canonical.com/ubuntu precise partner deb http://extras.ubuntu.com/ubuntu precise main Output: someone@someone-UBook:~$ sudo apt-get update Ign http://archive.canonical.com precise InRelease Hit http://archive.canonical.com precise Release.gpg Hit http://archive.canonical.com precise Release Ign http://archive.ubuntu.com precise InRelease Ign http://extras.ubuntu.com precise InRelease Ign http://archive.ubuntu.com precise-updates InRelease Hit http://archive.canonical.com precise/partner i386 Packages Hit http://extras.ubuntu.com precise Release.gpg Ign http://archive.ubuntu.com precise-backports InRelease Ign http://archive.canonical.com precise/partner TranslationIndex Err http://archive.canonical.com precise/partner Translation-en_US Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] Err http://archive.canonical.com precise/partner Translation-en Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] Hit http://extras.ubuntu.com precise Release Get:1 http://archive.ubuntu.com precise Release.gpg [198 B] Ign http://dl.google.com stable InRelease Err http://dl.google.com stable InRelease Err http://dl.google.com stable Release.gpg Unable to connect to dl.google.com:http: [IP: 173.194.34.38 80] Err http://dl.google.com stable Release.gpg Unable to connect to dl.google.com:http: [IP: 173.194.34.38 80] Get:2 http://archive.ubuntu.com precise-updates Release.gpg [198 B] Hit http://extras.ubuntu.com precise/main i386 Packages Get:3 http://archive.ubuntu.com precise-backports Release.gpg [198 B] Ign http://security.ubuntu.com precise-security InRelease Ign http://extras.ubuntu.com precise/main TranslationIndex Err http://extras.ubuntu.com precise/main Translation-en_US Unable to connect to extras.ubuntu.com:http: Err http://extras.ubuntu.com precise/main Translation-en Unable to connect to extras.ubuntu.com:http: Get:4 http://security.ubuntu.com precise-security Release.gpg [198 B] Get:5 http://archive.ubuntu.com precise Release [49.6 kB] Get:6 http://security.ubuntu.com precise-security Release [49.6 kB] Get:7 http://archive.ubuntu.com precise-updates Release [49.6 kB] Get:8 http://archive.ubuntu.com precise-backports Release [49.6 kB] Get:9 http://security.ubuntu.com precise-security/main i386 Packages [32.9 kB] Get:10 http://archive.ubuntu.com precise/main i386 Packages [1,274 kB] Get:11 http://security.ubuntu.com precise-security/restricted i386 Packages [14 B] Get:12 http://security.ubuntu.com precise-security/universe i386 Packages [8,594 B] Get:13 http://security.ubuntu.com precise-security/multiverse i386 Packages [1,393 B] Get:14 http://security.ubuntu.com precise-security/main TranslationIndex [73 B] Get:15 http://security.ubuntu.com precise-security/multiverse TranslationIndex [71 B] Get:16 
http://security.ubuntu.com precise-security/restricted TranslationIndex [70 B] Get:17 http://security.ubuntu.com precise-security/universe TranslationIndex [72 B] Get:18 http://security.ubuntu.com precise-security/main Translation-en [13.6 kB] Get:19 http://security.ubuntu.com precise-security/multiverse Translation-en [587 B] Get:20 http://security.ubuntu.com precise-security/restricted Translation-en [14 B] Get:21 http://security.ubuntu.com precise-security/universe Translation-en [6,261 B] Get:22 http://archive.ubuntu.com precise/restricted i386 Packages [8,431 B] Get:23 http://archive.ubuntu.com precise/universe i386 Packages [4,796 kB] Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Ign http://ppa.launchpad.net precise InRelease Get:24 http://ppa.launchpad.net precise Release.gpg [316 B] Get:25 http://ppa.launchpad.net precise Release.gpg [316 B] Get:26 http://ppa.launchpad.net precise Release.gpg [316 B] Ign http://ppa.launchpad.net precise Release.gpg Get:27 http://ppa.launchpad.net precise Release.gpg [316 B] Hit http://ppa.launchpad.net precise Release.gpg Get:28 http://ppa.launchpad.net precise Release.gpg [316 B] Get:29 http://ppa.launchpad.net precise Release.gpg [316 B] Hit http://ppa.launchpad.net precise Release.gpg Get:30 http://ppa.launchpad.net precise Release.gpg [316 B] Hit http://ppa.launchpad.net precise Release.gpg Get:31 http://ppa.launchpad.net precise Release [11.9 kB] Get:32 http://ppa.launchpad.net precise Release [11.9 kB] Get:33 http://archive.ubuntu.com precise/multiverse i386 Packages [121 kB] Get:34 http://ppa.launchpad.net precise Release [11.9 kB] Ign http://ppa.launchpad.net precise Release Get:35 http://ppa.launchpad.net precise Release [11.9 kB] Hit http://archive.ubuntu.com precise/main TranslationIndex Hit http://archive.ubuntu.com precise/multiverse TranslationIndex Hit http://ppa.launchpad.net precise Release Hit http://archive.ubuntu.com precise/restricted TranslationIndex Get:36 http://ppa.launchpad.net precise Release [11.9 kB] Hit http://archive.ubuntu.com precise/universe TranslationIndex Get:37 http://ppa.launchpad.net precise Release [11.9 kB] Get:38 http://archive.ubuntu.com precise-updates/main i386 Packages [96.5 kB] Hit http://ppa.launchpad.net precise Release Get:39 http://ppa.launchpad.net precise Release [11.9 kB] Get:40 http://archive.ubuntu.com precise-updates/restricted i386 Packages [770 B] Hit http://ppa.launchpad.net precise Release Get:41 http://archive.ubuntu.com precise-updates/universe i386 Packages [27.7 kB] Get:42 http://ppa.launchpad.net precise/main Sources [524 B] Get:43 http://archive.ubuntu.com precise-updates/multiverse i386 Packages [1,393 B] Get:44 http://ppa.launchpad.net precise/main i386 Packages [507 B] Hit http://archive.ubuntu.com precise-updates/main TranslationIndex Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-updates/multiverse TranslationIndex Hit http://archive.ubuntu.com precise-updates/restricted TranslationIndex Get:45 http://ppa.launchpad.net precise/main Sources [932 B] Hit http://archive.ubuntu.com precise-updates/universe 
TranslationIndex Get:46 http://ppa.launchpad.net precise/main i386 Packages [1,017 B] Get:47 http://archive.ubuntu.com precise-backports/main i386 Packages [559 B] Ign http://ppa.launchpad.net precise/main TranslationIndex Get:48 http://archive.ubuntu.com precise-backports/restricted i386 Packages [14 B] Get:49 http://archive.ubuntu.com precise-backports/universe i386 Packages [1,391 B] Get:50 http://ppa.launchpad.net precise/main Sources [1,402 B] Get:51 http://archive.ubuntu.com precise-backports/multiverse i386 Packages [14 B] Hit http://archive.ubuntu.com precise-backports/main TranslationIndex Get:52 http://ppa.launchpad.net precise/main i386 Packages [1,605 B] Hit http://archive.ubuntu.com precise-backports/multiverse TranslationIndex Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-backports/restricted TranslationIndex Hit http://archive.ubuntu.com precise-backports/universe TranslationIndex Hit http://archive.ubuntu.com precise/main Translation-en Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise/multiverse Translation-en Get:53 http://ppa.launchpad.net precise/main Sources [931 B] Hit http://archive.ubuntu.com precise/restricted Translation-en Get:54 http://ppa.launchpad.net precise/main i386 Packages [1,079 B] Hit http://archive.ubuntu.com precise/universe Translation-en Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-updates/main Translation-en Hit http://ppa.launchpad.net precise/main Sources Hit http://archive.ubuntu.com precise-updates/multiverse Translation-en Hit http://ppa.launchpad.net precise/main i386 Packages Hit http://archive.ubuntu.com precise-updates/restricted Translation-en Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-updates/universe Translation-en Get:55 http://ppa.launchpad.net precise/main Sources [3,611 B] Hit http://archive.ubuntu.com precise-backports/main Translation-en Get:56 http://ppa.launchpad.net precise/main i386 Packages [2,468 B] Hit http://archive.ubuntu.com precise-backports/multiverse Translation-en Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://archive.ubuntu.com precise-backports/restricted Translation-en Hit http://archive.ubuntu.com precise-backports/universe Translation-en Get:57 http://ppa.launchpad.net precise/main Sources [1,524 B] Get:58 http://ppa.launchpad.net precise/main i386 Packages [2,719 B] Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://ppa.launchpad.net precise/main Sources Hit http://ppa.launchpad.net precise/main i386 Packages Ign http://ppa.launchpad.net precise/main TranslationIndex Get:59 http://ppa.launchpad.net precise/main Sources [1,052 B] Get:60 http://ppa.launchpad.net precise/main i386 Packages [1,388 B] Ign http://ppa.launchpad.net precise/main TranslationIndex Get:61 http://ppa.launchpad.net precise/main Sources [1,185 B] Get:62 http://ppa.launchpad.net precise/main i386 Packages [1,698 B] Ign http://ppa.launchpad.net precise/main TranslationIndex Err http://ppa.launchpad.net precise/main Sources 404 Not Found Err http://ppa.launchpad.net precise/main i386 Packages 404 Not Found Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign 
http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Fetched 6,699 kB in 15s (445 kB/s) Reading package lists... Done W: Failed to fetch http://dl.google.com/linux/talkplugin/deb/dists/stable/InRelease W: Failed to fetch http://archive.canonical.com/ubuntu/dists/precise/partner/i18n/Translation-en_US Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] W: Failed to fetch http://archive.canonical.com/ubuntu/dists/precise/partner/i18n/Translation-en Unable to connect to archive.canonical.com:http: [IP: 91.189.92.150 80] W: Failed to fetch http://dl.google.com/linux/chrome/deb/dists/sta

    Read the article

  • Quick guide to Oracle IRM 11g: Server configuration

    - by Simon Thorpe
Quick guide to Oracle IRM 11g index

Welcome to the second article in this quick guide to Oracle IRM 11g. Hopefully you've just finished the first article, which takes you through deploying the software onto a Linux server. This article walks you through the configuration of this new service. It contains a subset of information from the official documentation and is focused on installing the server on Oracle Enterprise Linux. If you are planning to deploy on a non-Linux platform, you will need to reference the documentation for platform-specific information.

Contents
Introduction
Create IRM WebLogic Domain
Starting the Admin Server and initial configuration

Introduction

In the previous article the database was prepared, the WebLogic Application Server installed and the files required for an IRM server installed. But we don't actually have a configured system yet. We now need to create a WebLogic domain in which the IRM server will run, then configure some of the settings and cryptography so that we can create a context and be ready to seal some content and test it all works. This article doesn't cover the configuration of SSL communication from client to server. That is quite a big topic and a separate article has been dedicated to it. In these articles I also use the hostname irm.company.internal to reference the IRM server, and later on use the hostname irm.company.com in reference to the public-facing service.

Create IRM WebLogic Domain

First step is creating the WebLogic domain. In a console, switch to the newly created IRM installation folder as shown below and run the domain configuration wizard.

[oracle@irm /]$ cd /oracle/middleware/Oracle_IRM/common/bin
[oracle@irm bin]$ ./config.sh

First thing the wizard will ask is if you wish to create a new or extend an existing domain. This guide is creating a standalone system, so you should select to create a new domain. Next step is to choose which technologies from the Oracle ECM Suite you wish this domain to host. You are only interested in selecting the option "Oracle Information Rights Management". When you select this check box you will notice that it also selects "Oracle Enterprise Manager" and "Oracle JRF", as these are dependencies of the IRM server. You then need to specify where you wish to place the domain files. I usually just change the domain name from base_domain to irm_domain and leave the others with their defaults. Now, the domain will have a single user initially, and by default this user is called "weblogic". I usually change this account name to "sysadmin" or "administrator", but in this guide let's just accept the default. With respect to the next dialog, again for eval or dev reasons, leave the server startup mode as development. The JDK should also be automatically detected. We now need to provide details of the database. This guide is using the Oracle 11gR2 database and the settings I used can be seen in the image to the right. There is a lot of configuration that can now be done for the admin server, any managed servers and where the deployments reside. In this guide I am leaving all of these at their defaults, so do not check any of the boxes. However, I will be detailing later on this blog how you can go back and set up things such as automated startup of an IRM server, which requires changes to these default settings. But for now, let's leave it all alone and just click next. Now we are ready to install.
Note that from this dialog you can scroll the left window and see that there are going to be two servers created from the defaults: the AdminServer, which is where you modify settings for the WebLogic Server and which also hosts the Oracle Enterprise Manager for IRM (this allows you to monitor the IRM service's performance and also make service-related settings, which we do shortly below), and IRM_server1, which hosts the actual IRM services themselves. So go right ahead and hit create; the process is pretty quick and usually under 10 minutes. When the domain creation ends, it will give you the URL to the admin server. It's worth noting this down; the URL is usually:

http://irm.company.internal:7001

Starting the Admin Server and initial configuration

First thing to do is to start the WebLogic Admin server and review the initial IRM server settings. In this guide we are going to run the Admin server and IRM server in console windows; in another article I will discuss running these as background services. So for now, start a console and run the Admin server by doing the following.

cd /oracle/middleware/user_projects/domains/irm_domain/
./startWebLogic.sh

Wait for the server to start; you are looking for the following line to be reported in the console window.

<BEA-000360> <Server started in RUNNING mode>

First step is configuring the IRM service via Enterprise Manager. Now that the Admin server is running, you can point a browser at http://irm.company.internal:7001/em. Log in with the username and password you supplied when you created the domain. In Enterprise Manager the IRM service administrator is able to make server-wide configuration changes. However, finding where to access the pages with these settings can be a bit of a challenge. After logging in, on the left you'll see a tree containing elements of the Enterprise Manager farm, Farm_irm_domain. Open up Content Management, then Information Rights Management and finally select the IRM node. On the right, select the IRM menu item and navigate to the Administration section. We now have four options; for now, we are just going to look at General Settings. The image on the right proves that a picture is worth a thousand words (or 113 in this case).

The General Settings page allows you to set the cryptographic algorithms used for protecting sealed content. Unless you have a burning need to increase the key lengths or you need to comply with a regulation or government mandate, AES192 is a good start. You can change this later on without worry. The most important setting we need to make here is the Server URL. In this blog article I go over why this URL is so important: basically, every single piece of content you protect with Oracle IRM is going to have this URL embedded in it, so if it's wrong or unresolvable, then nobody can open the secured documents. Note that in our environment we have yet to do any SSL configuration of the service. If you intend to build a server without SSL, then use http as the protocol instead of https. But I would recommend using SSL, and setting this up is described in the next article. I would also probably up the device count from 1 to 3. This means that any user can retrieve rights to access content on 3 computers at any one time. The default of 1 doesn't really make sense in development, evaluation or even production environments, and my experience is that 3 is a better number. Next step is to create the keystore for the IRM server.
When a classification (called a context) is created, Oracle IRM generates a unique set of symmetric keys which are used to secure the content itself. These keys are then encrypted with a set of "wrapper" asymmetric cryptography keys which are stored externally to the server, either in a Java Key Store or an HSM. These keys need to be generated, and the following shows my commands and the resulting output. I have greyed out the responses from the commands so you can see the input a little more easily.

[oracle@irmsrv ~]$ cd /oracle/middleware/wlserver_10.3/server/bin/
[oracle@irmsrv bin]$ ./setWLSEnv.sh
CLASSPATH=/oracle/middleware/patch_wls1033/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/oracle/middleware/patch_ocp353/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/usr/java/jdk1.6.0_18/lib/tools.jar:/oracle/middleware/wlserver_10.3/server/lib/weblogic_sp.jar:/oracle/middleware/wlserver_10.3/server/lib/weblogic.jar:/oracle/middleware/modules/features/weblogic.server.modules_10.3.3.0.jar:/oracle/middleware/wlserver_10.3/server/lib/webservices.jar:/oracle/middleware/modules/org.apache.ant_1.7.1/lib/ant-all.jar:/oracle/middleware/modules/net.sf.antcontrib_1.1.0.0_1-0b2/lib/ant-contrib.jar:
PATH=/oracle/middleware/wlserver_10.3/server/bin:/oracle/middleware/modules/org.apache.ant_1.7.1/bin:/usr/java/jdk1.6.0_18/jre/bin:/usr/java/jdk1.6.0_18/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/home/oracle/bin
Your environment has been set.
[oracle@irmsrv bin]$ cd /oracle/middleware/user_projects/domains/irm_domain/config/fmwconfig/
[oracle@irmsrv fmwconfig]$ keytool -genkeypair -alias oracle.irm.wrap -keyalg RSA -keysize 2048 -keystore irm.jks
Enter keystore password:
Re-enter new password:
What is your first and last name?
  [Unknown]: Simon Thorpe
What is the name of your organizational unit?
  [Unknown]: Oracle
What is the name of your organization?
  [Unknown]: Oracle
What is the name of your City or Locality?
  [Unknown]: San Francisco
What is the name of your State or Province?
  [Unknown]: CA
What is the two-letter country code for this unit?
  [Unknown]: US
Is CN=Simon Thorpe, OU=Oracle, O=Oracle, L=San Francisco, ST=CA, C=US correct?
  [no]: yes
Enter key password for <oracle.irm.wrap> (RETURN if same as keystore password):

At this point we now have an irm.jks in the directory /oracle/middleware/user_projects/domains/irm_domain/config/fmwconfig. The reason we store it here is that this folder would be backed up as part of a domain backup. As with any cryptographic technology, DO NOT LOSE THESE KEYS OR THIS KEY STORE. Once you've sealed content against a context, the content keys will be wrapped with these keys; lose them, and you can't get access to any secured content - pretty important. Now that we've got the keys created, we need to go back to the IRM Enterprise Manager and set the location of the key store. Going back to the General Settings page in Enterprise Manager, scroll down to Keystore Settings. Leave the type as JKS but change the location to:

/oracle/middleware/user_projects/domains/irm_domain/config/fmwconfig/irm.jks

and hit Apply. The final step with regard to the key store is that we need to tell the server the password for the Java Key Store so that it can be opened and the keys accessed. Once more fire up a console window and run these commands (again I've greyed out the clutter so you can see the commands more easily). You will see dummy passed into the commands; this is because the command asks for a username, but in this instance we don't use one, so the value dummy is passed and it isn't used.
[oracle@irmsrv fmwconfig]$ cd /oracle/middleware/Oracle_IRM/common/bin/
[oracle@irmsrv bin]$ ./wlst.sh
... lots of settings fly by...
Welcome to WebLogic Server Administration Scripting Shell
Type help() for help on available commands
wls:/offline>connect('weblogic','password','t3://irmsrv.us.oracle.com:7001')
Connecting to t3://irmsrv.us.oracle.com:7001 with userid weblogic ...
Successfully connected to Admin Server 'AdminServer' that belongs to domain 'irm_domain'.
Warning: An insecure protocol was used to connect to the server. To ensure on-the-wire security, the SSL port or Admin port should be used instead.
wls:/irm_domain/serverConfig>createCred("IRM","keystore:irm.jks","dummy","password")
Location changed to domainRuntime tree. This is a read-only tree with DomainMBean as the root. For more help, use help(domainRuntime)
wls:/irm_domain/serverConfig>createCred("IRM","key:irm.jks:oracle.irm.wrap","dummy","password")
Already in Domain Runtime Tree
wls:/irm_domain/serverConfig>

At last we are now ready to fire up the IRM server itself. The domain creation created a managed server called IRM_server1, and we need to start it; use the following commands in a new console window.

cd /oracle/middleware/user_projects/domains/irm_domain/bin/
./startManagedWebLogic.sh IRM_server1

This will start up the server in the console; unlike the Admin server, you need to provide the username and password for the service to start. Enter your weblogic username and password when prompted. You can change this behavior by putting the password into a boot.properties file; read more about this in the WebLogic Server documentation. Once running, wait until you see the line:

<Notice><WebLogicServer><BEA-000360><Server started in RUNNING mode>

At this point we can now log in to the Oracle IRM Management Website at the URL:

http://irm.company.internal:1600/irm_rights/

The server is just configured for HTTP at the moment, no SSL involved; we just want to ensure we can get a working system up and running. You should now see a login page like the image on the right, and you can now log in using your weblogic username and password. The next article in this guide goes over adding SSL and then testing your server by actually adding a few users, sealing some content and opening this content as a user.

    Read the article

  • Metro: Declarative Data Binding

    - by Stephen.Walther
    The goal of this blog post is to describe how declarative data binding works in the WinJS library. In particular, you learn how to use both the data-win-bind and data-win-bindsource attributes. You also learn how to use calculated properties and converters to format the value of a property automatically when performing data binding. By taking advantage of WinJS data binding, you can use the Model-View-ViewModel (MVVM) pattern when building Metro style applications with JavaScript. By using the MVVM pattern, you can prevent your JavaScript code from spinning into chaos. The MVVM pattern provides you with a standard pattern for organizing your JavaScript code which results in a more maintainable application. Using Declarative Bindings You can use the data-win-bind attribute with any HTML element in a page. The data-win-bind attribute enables you to bind (associate) an attribute of an HTML element to the value of a property. Imagine, for example, that you want to create a product details page. You want to show a product object in a page. In that case, you can create the following HTML page to display the product details: <!DOCTYPE html> <html> <head> <meta charset="utf-8"> <title>Application1</title> <!-- WinJS references --> <link href="//Microsoft.WinJS.0.6/css/ui-dark.css" rel="stylesheet"> <script src="//Microsoft.WinJS.0.6/js/base.js"></script> <script src="//Microsoft.WinJS.0.6/js/ui.js"></script> <!-- Application1 references --> <link href="/css/default.css" rel="stylesheet"> <script src="/js/default.js"></script> </head> <body> <h1>Product Details</h1> <div class="field"> Product Name: <span data-win-bind="innerText:name"></span> </div> <div class="field"> Product Price: <span data-win-bind="innerText:price"></span> </div> <div class="field"> Product Picture: <br /> <img data-win-bind="src:photo;alt:name" /> </div> </body> </html> The HTML page above contains three data-win-bind attributes – one attribute for each product property displayed. You use the data-win-bind attribute to set properties of the HTML element associated with the data-win-attribute. The data-win-bind attribute takes a semicolon delimited list of element property names and data source property names: data-win-bind=”elementPropertyName:datasourcePropertyName; elementPropertyName:datasourcePropertyName;…” In the HTML page above, the first two data-win-bind attributes are used to set the values of the innerText property of the SPAN elements. The last data-win-bind attribute is used to set the values of the IMG element’s src and alt attributes. By the way, using data-win-bind attributes is perfectly valid HTML5. The HTML5 standard enables you to add custom attributes to an HTML document just as long as the custom attributes start with the prefix data-. So you can add custom attributes to an HTML5 document with names like data-stephen, data-funky, or data-rover-dog-is-hungry and your document will validate. The product object displayed in the page above with the data-win-bind attributes is created in the default.js file: (function () { "use strict"; var app = WinJS.Application; app.onactivated = function (eventObject) { if (eventObject.detail.kind === Windows.ApplicationModel.Activation.ActivationKind.launch) { var product = { name: "Tesla", price: 80000, photo: "/images/TeslaPhoto.png" }; WinJS.Binding.processAll(null, product); } }; app.start(); })(); In the code above, a product object is created with a name, price, and photo property. 
The WinJS.Binding.processAll() method is called to perform the actual binding (Don’t confuse WinJS.Binding.processAll() and WinJS.UI.processAll() – these are different methods). The first parameter passed to the processAll() method represents the root element for the binding. In other words, binding happens on this element and its child elements. If you provide the value null, then binding happens on the entire body of the document (document.body). The second parameter represents the data context. This is the object that has the properties which are displayed with the data-win-bind attributes. In the code above, the product object is passed as the data context parameter. Another word for data context is view model.  Creating Complex View Models In the previous section, we used the data-win-bind attribute to display the properties of a simple object: a single product. However, you can use binding with more complex view models including view models which represent multiple objects. For example, the view model in the following default.js file represents both a customer and a product object. Furthermore, the customer object has a nested address object: (function () { "use strict"; var app = WinJS.Application; app.onactivated = function (eventObject) { if (eventObject.detail.kind === Windows.ApplicationModel.Activation.ActivationKind.launch) { var viewModel = { customer: { firstName: "Fred", lastName: "Flintstone", address: { street: "1 Rocky Way", city: "Bedrock", country: "USA" } }, product: { name: "Bowling Ball", price: 34.55 } }; WinJS.Binding.processAll(null, viewModel); } }; app.start(); })(); The following page displays the customer (including the customer address) and the product. Notice that you can use dot notation to refer to child objects in a view model such as customer.address.street. <!DOCTYPE html> <html> <head> <meta charset="utf-8"> <title>Application1</title> <!-- WinJS references --> <link href="//Microsoft.WinJS.0.6/css/ui-dark.css" rel="stylesheet"> <script src="//Microsoft.WinJS.0.6/js/base.js"></script> <script src="//Microsoft.WinJS.0.6/js/ui.js"></script> <!-- Application1 references --> <link href="/css/default.css" rel="stylesheet"> <script src="/js/default.js"></script> </head> <body> <h1>Customer Details</h1> <div class="field"> First Name: <span data-win-bind="innerText:customer.firstName"></span> </div> <div class="field"> Last Name: <span data-win-bind="innerText:customer.lastName"></span> </div> <div class="field"> Address: <address> <span data-win-bind="innerText:customer.address.street"></span> <br /> <span data-win-bind="innerText:customer.address.city"></span> <br /> <span data-win-bind="innerText:customer.address.country"></span> </address> </div> <h1>Product</h1> <div class="field"> Name: <span data-win-bind="innerText:product.name"></span> </div> <div class="field"> Price: <span data-win-bind="innerText:product.price"></span> </div> </body> </html> A view model can be as complicated as you need and you can bind the view model to a view (an HTML document) by using declarative bindings. Creating Calculated Properties You might want to modify a property before displaying the property. For example, you might want to format the product price property before displaying the property. You don’t want to display the raw product price “80000”. Instead, you want to display the formatted price “$80,000”. You also might need to combine multiple properties. 
For example, you might need to display the customer full name by combining the values of the customer first and last name properties. In these situations, it is tempting to call a function when performing binding. For example, you could create a function named fullName() which concatenates the customer first and last name. Unfortunately, the WinJS library does not support the following syntax: <span data-win-bind=”innerText:fullName()”></span> Instead, in these situations, you should create a new property in your view model that has a getter. For example, the customer object in the following default.js file includes a property named fullName which combines the values of the firstName and lastName properties: (function () { "use strict"; var app = WinJS.Application; app.onactivated = function (eventObject) { if (eventObject.detail.kind === Windows.ApplicationModel.Activation.ActivationKind.launch) { var customer = { firstName: "Fred", lastName: "Flintstone", get fullName() { return this.firstName + " " + this.lastName; } }; WinJS.Binding.processAll(null, customer); } }; app.start(); })(); The customer object has a firstName, lastName, and fullName property. Notice that the fullName property is defined with a getter function. When you read the fullName property, the values of the firstName and lastName properties are concatenated and returned. The following HTML page displays the fullName property in an H1 element. You can use the fullName property in a data-win-bind attribute in exactly the same way as any other property. <!DOCTYPE html> <html> <head> <meta charset="utf-8"> <title>Application1</title> <!-- WinJS references --> <link href="//Microsoft.WinJS.0.6/css/ui-dark.css" rel="stylesheet"> <script src="//Microsoft.WinJS.0.6/js/base.js"></script> <script src="//Microsoft.WinJS.0.6/js/ui.js"></script> <!-- Application1 references --> <link href="/css/default.css" rel="stylesheet"> <script src="/js/default.js"></script> </head> <body> <h1 data-win-bind="innerText:fullName"></h1> <div class="field"> First Name: <span data-win-bind="innerText:firstName"></span> </div> <div class="field"> Last Name: <span data-win-bind="innerText:lastName"></span> </div> </body> </html> Creating a Converter In the previous section, you learned how to format the value of a property by creating a property with a getter. This approach makes sense when the formatting logic is specific to a particular view model. If, on the other hand, you need to perform the same type of formatting for multiple view models then it makes more sense to create a converter function. A converter function is a function which you can apply whenever you are using the data-win-bind attribute. Imagine, for example, that you want to create a general function for displaying dates. You always want to display dates using a short format such as 12/25/1988. The following JavaScript file – named converters.js – contains a shortDate() converter: (function (WinJS) { var shortDate = WinJS.Binding.converter(function (date) { return date.getMonth() + 1 + "/" + date.getDate() + "/" + date.getFullYear(); }); // Export shortDate WinJS.Namespace.define("MyApp.Converters", { shortDate: shortDate }); })(WinJS); The file above uses the Module Pattern, a pattern which is used through the WinJS library. 
Creating a Converter

In the previous section, you learned how to format the value of a property by creating a property with a getter. This approach makes sense when the formatting logic is specific to a particular view model. If, on the other hand, you need to perform the same type of formatting for multiple view models, then it makes more sense to create a converter function. A converter function is a function which you can apply whenever you are using the data-win-bind attribute.

Imagine, for example, that you want to create a general function for displaying dates. You always want to display dates using a short format such as 12/25/1988. The following JavaScript file, named converters.js, contains a shortDate() converter:

    (function (WinJS) {

        var shortDate = WinJS.Binding.converter(function (date) {
            return date.getMonth() + 1 + "/" + date.getDate() + "/" + date.getFullYear();
        });

        // Export shortDate
        WinJS.Namespace.define("MyApp.Converters", {
            shortDate: shortDate
        });

    })(WinJS);

The file above uses the Module Pattern, a pattern which is used throughout the WinJS library. To learn more about the Module Pattern, see my blog entry on namespaces and modules: http://stephenwalther.com/blog/archive/2012/02/22/windows-web-applications-namespaces-and-modules.aspx

The file contains the definition for a converter function named shortDate(). This function converts a JavaScript date object into a short date string such as 12/1/1988. The converter function is created with the help of the WinJS.Binding.converter() method. This method takes a normal function and converts it into a converter function. Finally, the shortDate() converter is added to the MyApp.Converters namespace. You can call the shortDate() function by calling MyApp.Converters.shortDate().

The default.js file contains the customer object that we want to bind. Notice that the customer object has a firstName, lastName, and birthday property. We will use our new shortDate() converter when displaying the customer birthday property:

    (function () {
        "use strict";

        var app = WinJS.Application;

        app.onactivated = function (eventObject) {
            if (eventObject.detail.kind === Windows.ApplicationModel.Activation.ActivationKind.launch) {
                var customer = {
                    firstName: "Fred",
                    lastName: "Flintstone",
                    birthday: new Date("12/1/1988")
                };

                WinJS.Binding.processAll(null, customer);
            }
        };

        app.start();
    })();

We actually apply the shortDate converter in the HTML document. The following HTML document displays all of the customer properties:

    <!DOCTYPE html>
    <html>
    <head>
        <meta charset="utf-8">
        <title>Application1</title>

        <!-- WinJS references -->
        <link href="//Microsoft.WinJS.0.6/css/ui-dark.css" rel="stylesheet">
        <script src="//Microsoft.WinJS.0.6/js/base.js"></script>
        <script src="//Microsoft.WinJS.0.6/js/ui.js"></script>

        <!-- Application1 references -->
        <link href="/css/default.css" rel="stylesheet">
        <script src="/js/default.js"></script>
        <script type="text/javascript" src="js/converters.js"></script>
    </head>
    <body>
        <h1>Customer Details</h1>
        <div class="field">
            First Name: <span data-win-bind="innerText:firstName"></span>
        </div>
        <div class="field">
            Last Name: <span data-win-bind="innerText:lastName"></span>
        </div>
        <div class="field">
            Birthday: <span data-win-bind="innerText:birthday MyApp.Converters.shortDate"></span>
        </div>
    </body>
    </html>

Notice the data-win-bind attribute used to display the birthday property. It looks like this:

    <span data-win-bind="innerText:birthday MyApp.Converters.shortDate"></span>

The shortDate converter is applied to the birthday property when the birthday property is bound to the SPAN element’s innerText property.
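The same pattern extends to other formatting needs. As a purely illustrative sketch (the currency converter below is not part of the original converters.js file; its name and formatting rule are assumptions), a price formatter could be exported from the same namespace and reused by any view model that exposes a numeric price:

    // A minimal sketch following the same Module Pattern as converters.js.
    // The "currency" converter name is invented for this example.
    (function (WinJS) {

        var currency = WinJS.Binding.converter(function (amount) {
            // Formats 80000 as "$80,000.00".
            return "$" + amount.toFixed(2).replace(/\B(?=(\d{3})+(?!\d))/g, ",");
        });

        WinJS.Namespace.define("MyApp.Converters", {
            currency: currency
        });

    })(WinJS);

    // Applied in markup just like shortDate:
    // <span data-win-bind="innerText:price MyApp.Converters.currency"></span>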
Using data-win-bindsource

Normally, you pass the view model (the data context) that you want to use with the data-win-bind attributes in a page by passing the view model to the WinJS.Binding.processAll() method like this:

    WinJS.Binding.processAll(null, viewModel);

As an alternative, you can specify the view model declaratively in your markup by using the data-win-bindsource attribute. For example, the following default.js script exposes a view model with the fully-qualified name of MyWinWebApp.viewModel:

    (function () {
        "use strict";

        var app = WinJS.Application;

        app.onactivated = function (eventObject) {
            if (eventObject.detail.kind === Windows.ApplicationModel.Activation.ActivationKind.launch) {

                // Create view model
                var viewModel = {
                    customer: {
                        firstName: "Fred",
                        lastName: "Flintstone"
                    },
                    product: {
                        name: "Bowling Ball",
                        price: 12.99
                    }
                };

                // Export view model to be seen by universe
                WinJS.Namespace.define("MyWinWebApp", {
                    viewModel: viewModel
                });

                // Process data-win-bind attributes
                WinJS.Binding.processAll();
            }
        };

        app.start();
    })();

In the code above, a view model which represents a customer and a product is exposed as MyWinWebApp.viewModel. The following HTML page illustrates how you can use the data-win-bindsource attribute to bind to this view model:

    <!DOCTYPE html>
    <html>
    <head>
        <meta charset="utf-8">
        <title>Application1</title>

        <!-- WinJS references -->
        <link href="//Microsoft.WinJS.0.6/css/ui-dark.css" rel="stylesheet">
        <script src="//Microsoft.WinJS.0.6/js/base.js"></script>
        <script src="//Microsoft.WinJS.0.6/js/ui.js"></script>

        <!-- Application1 references -->
        <link href="/css/default.css" rel="stylesheet">
        <script src="/js/default.js"></script>
    </head>
    <body>
        <h1>Customer Details</h1>
        <div data-win-bindsource="MyWinWebApp.viewModel.customer">
            <div class="field">
                First Name: <span data-win-bind="innerText:firstName"></span>
            </div>
            <div class="field">
                Last Name: <span data-win-bind="innerText:lastName"></span>
            </div>
        </div>

        <h1>Product</h1>
        <div data-win-bindsource="MyWinWebApp.viewModel.product">
            <div class="field">
                Name: <span data-win-bind="innerText:name"></span>
            </div>
            <div class="field">
                Price: <span data-win-bind="innerText:price"></span>
            </div>
        </div>
    </body>
    </html>

The data-win-bindsource attribute is used twice in the page above: once on the DIV element which contains the customer details and once on the DIV element which contains the product details. If an element has a data-win-bindsource attribute, then all of the child elements of that element are affected: their data-win-bind attributes are bound to the data source represented by the data-win-bindsource attribute.

Summary

The focus of this blog entry was data binding using the WinJS library. You learned how to use the data-win-bind attribute to bind the properties of an HTML element to a view model. We also discussed several advanced features of data binding. We examined how to create calculated properties by including a property with a getter in your view model. We also discussed how you can create a converter function to format the value of a view model property when binding the property. Finally, you learned how to use the data-win-bindsource attribute to specify a view model declaratively.


  • Q1 2010 New Feature: Paging with RadGridView for Silverlight and WPF

We are glad to announce that the Q1 2010 Release has added another weapon to RadGridView's growing arsenal of features: the brand new RadDataPager control, which provides the user interface for paging through a collection of data. The good news is that RadDataPager can be used to page any collection. It does not depend on RadGridView in any way, so you are free to use it with the rest of your ItemsControls if you choose to do so. Before you read on, you might want to download the samples solution that I have attached. It contains a sample project for every scenario that I will discuss later on, and looking at the code while reading will make things much easier for you. There is something for everyone among the 10 Visual Studio projects included in the solution, so go and grab it.

I. Paging essentials

The single most important piece of software concerning paging in Silverlight is the System.ComponentModel.IPagedCollectionView interface. Those of you on the WPF front need not worry, though. As you might already know, Telerik's Silverlight and WPF controls share the same code base. Since WPF does not contain a similar interface, Telerik has provided its own Telerik.Windows.Data.IPagedCollectionView. The IPagedCollectionView interface contains several important members which are used by RadGridView to perform the actual paging. Silverlight provides a default implementation of this interface which, naturally, is called PagedCollectionView. You should definitely take a look at its source code if you are interested in what is going on under the hood, but this is not a prerequisite for our discussion. The WPF default implementation of the interface is Telerik's QueryableCollectionView which, among many other interfaces, implements IPagedCollectionView.
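To give a feel for those members, here is a short sketch of programmatic paging against Silverlight's default implementation. This snippet is not taken from the attached samples; it simply exercises the paging members that IPagedCollectionView exposes (it would run inside, say, a page constructor, with using directives for System.Linq, System.ComponentModel, and System.Windows.Data):

    // A minimal sketch of the IPagedCollectionView members, using the
    // default Silverlight implementation (System.Windows.Data.PagedCollectionView).
    IPagedCollectionView pagedSource = new PagedCollectionView(Enumerable.Range(0, 1000));
    pagedSource.PageSize = 10;                  // number of items per page

    pagedSource.MoveToPage(3);                  // jump to a zero-based page index
    pagedSource.MoveToNextPage();               // relative navigation also works
    bool canPage = pagedSource.CanChangePage;   // whether a page change is currently allowed
    int currentPage = pagedSource.PageIndex;    // zero-based index of the current page

These are essentially the members that RadDataPager drives for you through its user interface.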
II. No Paging

In order to gradually build up my case, I will start with a very simple example that has no paging at all. It might sound trivial, but it gives us something to build on. Let us imagine the simplest possible scenario: a plain IEnumerable and an ItemsControl that shows its contents. It looks like this:

No Paging

    IEnumerable itemsSource = Enumerable.Range(0, 1000);
    this.itemsControl.ItemsSource = itemsSource;

XAML

    <Border Grid.Row="0" BorderBrush="Black" BorderThickness="1" Margin="5">
        <ListBox Name="itemsControl"/>
    </Border>
    <Border Grid.Row="1" BorderBrush="Black" BorderThickness="1" Margin="5">
        <TextBlock Text="No Paging"/>
    </Border>

Nothing special for now, just some data displayed in a ListBox. The two sample projects in the attached solution are NoPaging_WPF and NoPaging_SL3. With each subsequent sample, these two projects will evolve in one way or another.

III. Paging simple collections

The single most important property of RadDataPager is its Source property. This is where you pass in your collection of data for paging. More often than not your collection will not be an IPagedCollectionView. It will be a simple List<T>, an ObservableCollection<T>, or anything else that is simply IEnumerable. Unless you had paging in mind when you designed your project, it is almost certain that your data source will not be pageable out of the box. So what are the options?

III.1. Wrapping the simple collection in an IPagedCollectionView

If you look at the constructors of PagedCollectionView and QueryableCollectionView, you will notice that you can pass in a simple IEnumerable as a parameter. Those two classes will wrap it and provide paging capabilities over your original data. In fact, this is what RadGridView does internally: it wraps your original collection in a QueryableCollectionView in order to easily perform many useful tasks such as filtering, sorting, and others, but in our case the most important one is paging. So let us start our series of examples with the most simplistic one. Imagine that you have a simple IEnumerable which is the source for an ItemsControl. Here is how to wrap it in order to enable paging:

Silverlight

    IEnumerable itemsSource = Enumerable.Range(0, 1000);
    var pagedSource = new PagedCollectionView(itemsSource);
    this.radDataPager.Source = pagedSource;
    this.itemsControl.ItemsSource = pagedSource;

WPF

    IEnumerable itemsSource = Enumerable.Range(0, 1000);
    var pagedSource = new QueryableCollectionView(itemsSource);
    this.radDataPager.Source = pagedSource;
    this.itemsControl.ItemsSource = pagedSource;

XAML

    <Border Grid.Row="0"
            BorderBrush="Black"
            BorderThickness="1"
            Margin="5">
        <ListBox Name="itemsControl"/>
    </Border>
    <Border Grid.Row="1"
            BorderBrush="Black"
            BorderThickness="1"
            Margin="5">
        <telerikGrid:RadDataPager Name="radDataPager"
                                  PageSize="10"
                                  IsTotalItemCountFixed="True"
                                  DisplayMode="All"/>
    </Border>

This will do the trick. It is quite simple, isn't it? The two sample projects in the attached solution are PagingSimpleCollectionWithWrapping_WPF and PagingSimpleCollectionWithWrapping_SL3.

III.2. Binding to RadDataPager.PagedSource

In case you do not like this approach, there is a better one. When you assign an IEnumerable to the Source property of a RadDataPager, it will automatically wrap it in a QueryableCollectionView and expose it through its PagedSource property. From then on, you can attach any number of ItemsControls to the PagedSource and they will be automatically paged. Here is how to do this entirely in XAML:

Using RadDataPager.PagedSource

    <Border Grid.Row="0"
            BorderBrush="Black"
            BorderThickness="1" Margin="5">
        <ListBox Name="itemsControl"
                 ItemsSource="{Binding PagedSource, ElementName=radDataPager}"/>
    </Border>
    <Border Grid.Row="1"
            BorderBrush="Black"
            BorderThickness="1"
            Margin="5">
        <telerikGrid:RadDataPager Name="radDataPager"
                                  Source="{Binding ItemsSource}"
                                  PageSize="10"
                                  IsTotalItemCountFixed="True"
                                  DisplayMode="All"/>
    </Border>

The two sample projects in the attached solution are PagingSimpleCollectionWithPagedSource_WPF and PagingSimpleCollectionWithPagedSource_SL3.

IV. Paging collections implementing IPagedCollectionView

Those of you who are using WCF RIA Services should feel very lucky. A quick look with Reflector or the debugger shows that the DomainDataSource.Data property is in fact an instance of the DomainDataSourceView class. This class implements a handful of useful interfaces: ICollectionView, IEnumerable, INotifyCollectionChanged, IEditableCollectionView, IPagedCollectionView, and INotifyPropertyChanged. Luckily, IPagedCollectionView is among them, which lets you do the whole paging on the server. So let's do this. We will add a DomainDataSource control to our page and connect the items control and the pager to it.
Here is how to do this:

MainPage

    <riaControls:DomainDataSource x:Name="invoicesDataSource"
                                  AutoLoad="True"
                                  QueryName="GetInvoicesQuery">
        <riaControls:DomainDataSource.DomainContext>
            <services:ChinookDomainContext/>
        </riaControls:DomainDataSource.DomainContext>
    </riaControls:DomainDataSource>
    <Border Grid.Row="0"
            BorderBrush="Black"
            BorderThickness="1"
            Margin="5">
        <ListBox Name="itemsControl"
                 ItemsSource="{Binding Data, ElementName=invoicesDataSource}"/>
    </Border>
    <Border Grid.Row="1"
            BorderBrush="Black"
            BorderThickness="1"
            Margin="5">
        <telerikGrid:RadDataPager Name="radDataPager"
                                  Source="{Binding Data, ElementName=invoicesDataSource}"
                                  PageSize="10"
                                  IsTotalItemCountFixed="True"
                                  DisplayMode="All"/>
    </Border>

By the way, you can replace the ListBox from the code snippet above with any other ItemsControl. It can be RadGridView, it can be the MS DataGrid, you name it. Essentially, RadDataPager sends paging commands to DomainDataSource.Data. It does not care who, what, or how many different controls are bound to this same Data property of the DomainDataSource control. So if you would like to experiment, you can throw in any number of other ItemsControls next to the ListBox, bind them in the same manner, and all of them will be paged by our single RadDataPager. Furthermore, you can throw in any number of RadDataPagers and bind them to the same property; paging with any one of them will automatically update all of the rest. The whole picture is simply beautiful, and we can do all of this thanks to WCF RIA Services. The two sample projects (Silverlight only) in the attached solution are PagingIPagedCollectionView and PagingIPagedCollectionView.Web.

V. Paging RadGridView

While you can replace the ListBox in any of the above examples with a RadGridView, RadGridView offers something extra. Similar to the DomainDataSource.Data property, the RadGridView.Items collection implements the IPagedCollectionView interface. So you are already thinking: "Then why not bind the Source property of RadDataPager to RadGridView.Items?" Well, that's exactly what you can do, and you will start paging RadGridView out of the box. It is as simple as that; no code-behind is involved:

MainPage

    <Border Grid.Row="0"
            BorderBrush="Black"
            BorderThickness="1" Margin="5">
        <telerikGrid:RadGridView Name="radGridView"
                                 ItemsSource="{Binding ItemsSource}"/>
    </Border>
    <Border Grid.Row="1"
            BorderBrush="Black"
            BorderThickness="1"
            Margin="5">
        <telerikGrid:RadDataPager Name="radDataPager"
                                  Source="{Binding Items, ElementName=radGridView}"
                                  PageSize="10"
                                  IsTotalItemCountFixed="True"
                                  DisplayMode="All"/>
    </Border>

The two sample projects in the attached solution are PagingRadGridView_SL3 and PagingRadGridView_WPF.
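Since RadGridView.Items implements IPagedCollectionView (as noted above), you can also drive paging from code-behind instead of, or alongside, a RadDataPager. The following is a hedged sketch based on that statement rather than on the attached samples:

    // A speculative sketch: the article states that RadGridView.Items implements
    // IPagedCollectionView, so the paging members can be called directly.
    // In Silverlight this is System.ComponentModel.IPagedCollectionView;
    // in WPF it is Telerik.Windows.Data.IPagedCollectionView.
    var pagedItems = (IPagedCollectionView)radGridView.Items;
    pagedItems.PageSize = 10;

    // Navigate from code; a RadDataPager bound to the same collection
    // should stay in sync, since both talk to the same IPagedCollectionView.
    pagedItems.MoveToFirstPage();
    pagedItems.MoveToNextPage();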
With this last example I think I have covered every possible paging combination. In case you would like to see an example of something that I have not covered, please let me know. Also, make sure you check out these great online examples:

- WCF RIA Services with DomainDataSource
- Paging Configurator
- Endless Paging
- Paging Any Collection
- Paging RadGridView

Happy Paging!

Download Full Source Code

