Search Results

Search found 7294 results on 292 pages for 'parameters'.


  • Sinatra: rendering snippets (partials)

    - by Michael
     I'm following along with an O'Reilly book that's building a Twitter clone with Sinatra. As Sinatra doesn't have 'partials' (in the way that Rails does), the author creates his own 'snippets' that work like partials. I understand that this is fairly common in Sinatra. Anyway, inside one of his snippets (the first one below) he calls another snippet, text_limiter_js (copied below it), which is basically a JavaScript function. If you look at that function, you'll notice that it takes two parameters. I don't understand where these parameters are coming from, because they're not getting passed in when text_limiter_js is rendered inside the other snippet. I'm not sure if I've given enough information/code for someone to help me understand this, but if you can, please explain.

         = snippet :'/snippets/text_limiter_js'
         %h2.comic What are you doing?
         %form{:method => 'post', :action => '/update'}
           %textarea.update.span-15#update{:name => 'status', :rows => 2, :onKeyDown => "text_limiter($('#update'), $('#counter'))"}
           .span-6
             %span#counter 140 characters left
           .prepend-12
             %input#button{:type => 'submit', :value => 'update'}

     text_limiter_js.haml:

         :javascript
           function text_limiter(field, counter_field) {
             limit = 139;
             if (field.val().length > limit)
               field.val(field.val().substring(0, limit));
             else
               counter_field.text(limit - field.val().length);
           }

  • (Java) Is there a type of object that can handle anything from primitives to arrays?

    - by Michael
     I'm pretty new to Java, so I'm hoping one of you guys knows how to do this. I'm having the user specify both the type and value of arguments, in an XML-like way, to be passed to methods that are external to my application. Example:

         java myAppsName externalJavaClass methodOfExternalClass [parameters]

     Of course, to find the proper method we have to have the proper parameter types, as the method may be overloaded and that's the only way to tell the difference between the different versions. Parameters are currently formatted in this manner: (type)value(/type), e.g.

         (int)71(/int)
         (string)This is my string that I'm passing as a parameter!(/string)

     I parse them, get the constructor for whatever type is indicated, then execute that constructor by calling its newInstance(<String value>) method, loading the new instance into an Object. This works fine and dandy but, as we all know, some methods take arrays, or even multi-dimensional arrays. I could handle the argument formatting like so:

         (array)(array)(int)0(/int)(int)1(/int)(/array)(array)(int)2(/int)(int)3(/int)(/array)(/array)

     ...or perhaps even better...

         {{(int)0(/int)(int)1(/int)}{(int)2(/int)(int)3(/int)}}

     The question is, how can this be implemented? Do I have to start wrapping everything in an Object[] array so I can handle primitives, etc. as argObj[0], but load an array as I normally would? (Unfortunately, I would have to make it an Object[][] array if I wanted to support two-dimensional arrays. That implementation wouldn't be very pretty.)

  • Camera crashes in android 4.1(API level 16)

    - by Lincy
     My application has camera functionality. It works fine on all Android versions, but when I tested it on a Samsung Galaxy S3 (Android 4.1, API level 16) it crashes. The error points to this line:

         Parameters parameters = mCamera.getParameters();

     Could someone provide a solution for this? The log is below:

         java.lang.NullPointerException
             at com.stpl.snapshun.camera.CameraActivity.surfaceChanged(CameraActivity.java:313)
             at android.view.SurfaceView.updateWindow(SurfaceView.java:554)
             at android.view.SurfaceView.access$000(SurfaceView.java:81)
             at android.view.SurfaceView$3.onPreDraw(SurfaceView.java:169)
             at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:671)
             at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:1818)
             at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:998)
             at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:4212)
             at android.view.Choreographer$CallbackRecord.run(Choreographer.java:725)
             at android.view.Choreographer.doCallbacks(Choreographer.java:555)
             at android.view.Choreographer.doFrame(Choreographer.java:525)
             at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:711)
             at android.os.Handler.handleCallback(Handler.java:615)
             at android.os.Handler.dispatchMessage(Handler.java:92)
             at android.os.Looper.loop(Looper.java:137)
             at android.app.ActivityThread.main(ActivityThread.java:4745)
             at java.lang.reflect.Method.invokeNative(Native Method)
             at java.lang.reflect.Method.invoke(Method.java:511)
             at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:786)
             at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:553)
             at dalvik.system.NativeStart.main(Native Method)

     Thanks in advance.

  • Trying to overload + operator

    - by FrostyStraw
     I cannot for the life of me understand why this is not working. I am so confused. I have a class Person which has a data member age, and I just want to add two people so that it adds the ages. I don't know why this is so hard, but when I look for examples I feel like everyone does something different, and for some reason NONE of them work. Sometimes the examples have two parameters, sometimes only one; sometimes the parameters are references to the object, sometimes they're not; sometimes they return an int, sometimes a Person object. What is the most normal way to do it?

         class Person {
         public:
             int age;
             //std::string haircolor = "brown";
             //std::string ID = "23432598";

             Person() : age(19) {}

             Person operator+(Person&) {
             }
         };

         Person operator+(Person &obj1, Person &obj2) {
             Person sum = obj1;
             sum += obj2;
             return sum;
         }

     I really feel like overloading the + operator should be the easiest thing in the world, except I DON'T KNOW WHAT I AM DOING. I don't know if I'm supposed to define the overload inside the class or outside, whether that makes a difference, or why it only allows one parameter when defined inside. I honestly just don't get it.

  • Dynamic Image Caching with Java

    - by zteater
     I have a servlet with an API that delivers images from GET requests. The servlet creates a data file of CAD commands based on the parameters of the GET request. This data file is then handed to an image parser, which creates an image on the file system. The servlet reads the image and returns the bytes in the response. All of the I/O and the calls to the image parser program can be very taxing, and images of around 80 KB render in 3,000-4,000 ms on a local system. There are roughly 20 parameters that make up the GET request, each correlating to a different portion of the image, so the number of possible image combinations is extremely large. To alleviate the loading time, I plan to store BLOBs of rendered images in a database: if a GET request matches one previously executed, I will pull from cache; otherwise I will render a new one. This does not fix the "first-time" run, but will help runs n+1. Any other ideas on how I can improve performance?
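
     A minimal sketch of that lookup-or-render flow (shown in C#; the key scheme and the in-memory store are stand-ins for the servlet's database BLOB cache):

         using System;
         using System.Collections.Concurrent;
         using System.Collections.Generic;
         using System.Linq;

         public class RenderedImageCache
         {
             // In-memory store; the database BLOB table described above would replace this.
             private readonly ConcurrentDictionary<string, byte[]> cache =
                 new ConcurrentDictionary<string, byte[]>();

             // Sort the ~20 GET parameters so equivalent requests map to one key.
             public static string KeyFor(IDictionary<string, string> query)
             {
                 return string.Join("&", query.OrderBy(p => p.Key)
                                              .Select(p => p.Key + "=" + p.Value));
             }

             // Return the cached image, or render once and cache it for runs n+1.
             public byte[] GetOrRender(IDictionary<string, string> query, Func<byte[]> render)
             {
                 return cache.GetOrAdd(KeyFor(query), key => render());
             }
         }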

  • How can I call these URLs in jQuery to display content on one page?

    - by Thorbis Website Design
     OK, I figured out the jQuery part, but not the parameters for all of them. Can anyone help me figure out the parameters for each URL string? This is the jQuery I figured out (also, would this work better than the answer below?):

         $.get('adminajax.php', {'action': 'getUsers'}, function(data) {
             $('#users .users').html(data);
         });

     He sent me this in an email:

         You can specify a page by adding: p=[page #]
         You can specify a file and it will add a checkbox next to the user, which will be checked if the user has permission to download: file=[file location]

         adminajax.php?action=createDirectory&directory=[new directory location]
         adminajax.php?action=setAvailability&user=[username]&file=[filelocation]&available=[true or false]

     I'm trying to get it to display in these HTML tags:

         <div id="files">
             <b>Files:</b>
             <ul class="files"></ul>
         </div>
         <div id="file_options">
             <b>Options:</b>
         </div>
         <div id="users">
             <b>Users:</b>
             <ul class="users"></ul>
         </div>

  • Using an alternate search platform in Commerce Server 2009

    - by Lewis Benge
     Although Microsoft Commerce Server 2009's architecture is built upon Microsoft SQL Server, and has the full power of the SQL Full Text Indexing search platform, there are times when you may require a richer or alternate search platform. One of these scenarios is when you want to implement a faceted (refinement) search on your site, which provides dynamic refinements based on the search results dataset. Faceted search is becoming popular in most online retail environments as a way of providing an enhanced user experience when browsing a larger catalogue.

     This is powerful for two reasons. Firstly, with a traditional search it is down to the user to think of a search term suitable for the product they are trying to find, which typically will not return similar products or help in any way to refine a larger dataset. Faceted searches, on the other hand, provide a comprehensive list of product properties, grouped together by similarity, to help the user narrow down the results returned; as the user progressively restricts the search criteria by selecting additional criteria to search against, these facets continually refresh. The whole experience allows users to explore alternate brands and price ranges, or find products they hadn't initially thought of or weren't looking for, in a bid to enhance cross-sell in the retail environment.

     The second advantage of this type of search, from a business perspective, is the ability to harvest the search results to start profiling your user. Even though anonymous users may routinely visit your site, and will not necessarily register or complete a transaction to build up marketing profiling data, you can still achieve the same result by recording the search facets used within the search sequence. Below is a faceted search scenario generated from eBay using the search term "server". By creating a search profile from clicking through Computers & Networking -> Servers -> Dell -> New, and recording this information against my user profile, you can start to predict with a lot more certainty what types of products I am interested in. This will allow you to apply shopping-cart analysis against your search data and provide great cross-sale or advertising opportunities, or personalise the user experience based on your prediction of what the user may be interested in.

     This type of search is extremely beneficial in e-commerce environments, but achieving it out of the box with Commerce Server and SQL Full Text Indexing can be challenging. In many deployments it is often easier to use an alternate search platform such as Microsoft's FAST, Apache Solr, or Endeca; however, you still want these products to integrate natively into Commerce Server, to ensure that up-to-date inventory information is presented, profile information is generated, and you provide a consistent API. To do so we make the most of the Commerce Server extensibility points called operation sequence components. In this example I will be talking about Apache Solr hosted on Apache Tomcat, and I have used the SolrNet C# library to interface to the Java platform. I am not going to cover configuring Solr indexing, but in a production environment this would typically happen by using PowerShell to call the Commerce Server management web service to export your catalog as XML, applying an XSLT transform to make the file conform to the Solr schema, and using a simple HTTP POST to send it to the search engine for indexing.
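
     That last indexing step can be as small as the following sketch (the Solr URL and feed path are placeholder assumptions):

         using System.IO;
         using System.Net;

         // Push the XSLT-transformed catalog to Solr's XML update handler and commit.
         using (var client = new WebClient())
         {
             string catalogXml = File.ReadAllText(@"C:\feeds\catalog-solr.xml");
             client.Headers[HttpRequestHeader.ContentType] = "text/xml";
             client.UploadString("http://localhost:8983/solr/update?commit=true", catalogXml);
         }
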
     Essentially, a sequence component is a step in a serial workflow used to call a data repository (which in most cases is the Commerce Server pipelines or databases) and map to and from a Commerce Entity object whilst enforcing any business rules. So the first step in the process is to add a new class library to your existing Commerce Server site. You will need to use a new library, as sequence components must be strongly named to be deployed. Once you are inside your new project, add a new class file and add references to the Microsoft.Commerce.Providers, Microsoft.Commerce.Contracts and Microsoft.Commerce.Broker assemblies. Now make your new class derive from the base class Microsoft.Commerce.Providers.Components.OperationSequenceComponent and override the ExecuteQuery method. Your screen will then look something similar to this:

     As all we are doing in this component is conducting a search, we are only interested in the ExecuteQuery method. This method accepts three arguments: queryOperation, operationCache, and response. The queryOperation is the object in which we receive our search parameters, the cache allows access to the Commerce Server cache (letting us store regularly accessed information), and the response object is the object on which we return the result of our search. Inside this method is simply where we inject our logic for the third-party search platform. As I am not going to explain the inner workings of actually making a Solr call, I'll simply provide the sample code here; I would highly recommend looking at the SolrNet wiki, as it has some great explanations of how the API works.

     What you will find, however, is that some further extensions are required when integrating a custom search provider. Firstly, out of the box the CommerceQueryOperation you receive into the method when conducting a search against a catalog is specifically geared towards a SQL Full Text search, with properties such as a Where clause. To make the operation you receive more relevant, you will need to create another class, this time derived from Microsoft.Commerce.Contracts.Messages.CommerceSearchCriteria, and within it detail the properties you require to submit as parameters to the Solr search API. My example looks like this:

         [DataContract(Namespace = "http://schemas.microsoft.com/microsoft-multi-channel-commerce-foundation/types/2008/03")]
         public class CommerceCatalogSolrSearch : CommerceSearchCriteria
         {
             private Dictionary<string, string> _facetQueries;

             public CommerceCatalogSolrSearch()
             {
                 _facetQueries = new Dictionary<String, String>();
             }

             public Dictionary<String, String> FacetQueries
             {
                 get { return _facetQueries; }
                 set { _facetQueries = value; }
             }

             public String SearchPhrase { get; set; }
             public int PageIndex { get; set; }
             public int PageSize { get; set; }
             public IEnumerable<String> Facets { get; set; }
             public string Sort { get; set; }

             public new int FirstItemIndex
             {
                 get { return (PageIndex - 1) * PageSize; }
             }

             public int LastItemIndex
             {
                 get { return FirstItemIndex + PageSize; }
             }
         }

     To allow you to construct a CommerceQueryOperation call within the API, you will also need another class, derived from Microsoft.Commerce.Common.MessageBuilders.CommerceSearchCriteriaBuilder, which is simply used to construct an instance of the search criteria you have just created and expose the properties you want set.
     My message builder looks like this:

         public class CommerceCatalogSolrSearchBuilder : CommerceSearchCriteriaBuilder
         {
             private CommerceCatalogSolrSearch _solrSearch;

             public CommerceCatalogSolrSearchBuilder()
             {
                 _solrSearch = new CommerceCatalogSolrSearch();
             }

             public String SearchPhrase
             {
                 get { return _solrSearch.SearchPhrase; }
                 set { _solrSearch.SearchPhrase = value; }
             }

             public int PageIndex
             {
                 get { return _solrSearch.PageIndex; }
                 set { _solrSearch.PageIndex = value; }
             }

             public int PageSize
             {
                 get { return _solrSearch.PageSize; }
                 set { _solrSearch.PageSize = value; }
             }

             public Dictionary<String, String> FacetQueries
             {
                 get { return _solrSearch.FacetQueries; }
                 set { _solrSearch.FacetQueries = value; }
             }

             public String[] Facets
             {
                 get { return _solrSearch.Facets.ToArray(); }
                 set { _solrSearch.Facets = value; }
             }

             public override CommerceSearchCriteria ToSearchCriteria()
             {
                 return _solrSearch;
             }
         }

     Once you have these two classes in place, you can safely cast the CommerceOperation you receive as an argument of the overridden ExecuteQuery method to the CommerceCatalogSolrSearch criteria you have just created, e.g.:

         public CommerceCatalogSolrSearch TryGetSearchCriteria(CommerceOperation operation)
         {
             var searchCriteria = operation as CommerceQueryOperation;
             if (searchCriteria == null)
                 throw new Exception("No search criteria present");

             var local = (CommerceCatalogSolrSearch)searchCriteria.SearchCriteria;
             if (local == null)
                 throw new Exception("Unexpected search criteria in operation");

             return local;
         }

     Now that you have all of your search parameters, you can go off and call the external search platform API. You will of course get proprietary objects returned, so the next step in the process is to convert the results back into Commerce Entities. You do this via another extensibility point within the Commerce Server API called translators. A translator is another separate class, this time implementing the interface Microsoft.Commerce.Providers.Translators.IToCommerceEntityTranslator. As you can imagine, this interface is specific to conversion TO a CommerceEntity; you will need to implement a separate interface if you also need to go in the opposite direction. Implementing the required method of the interface gives you a single Translate method, which takes a source object, a destination CommerceEntity, and a collection of properties as arguments. For simplicity's sake, in this example I have hard-coded the mappings; however, best practice would dictate that you map the objects using your MetadataDefinitions.xml file.
     Once complete, your translator will look something like the following:

         public class SolrEntityTranslator : IToCommerceEntityTranslator
         {
             #region IToCommerceEntityTranslator Members

             public void Translate(object source, CommerceEntity destinationCommerceEntity,
                                   CommercePropertyCollection propertiesToReturn)
             {
                 if (source.GetType().Equals(typeof(SearchProduct)))
                 {
                     var searchResult = (SearchProduct)source;

                     destinationCommerceEntity.Id = searchResult.ProductId;
                     destinationCommerceEntity.SetPropertyValue("DisplayName", searchResult.Title);
                     destinationCommerceEntity.ModelName = "Product";
                 }
             }

             #endregion
         }

     Once you have a translator in place, you can safely map the results of your search platform into Commerce Entities and attach them to the CommerceResponse object, in a fashion similar to this:

         foreach (SearchProduct result in matchingProducts)
         {
             var destinationEntity = new CommerceEntity(_returnModelName);

             Translator.ToCommerceEntity(result, destinationEntity, _queryOperation.Model.Properties);
             response.CommerceEntities.Add(destinationEntity);
         }

     In Solr I actually have two objects being returned - a product, and a collection of facets - so I have an additional translator for facets (which maps to a custom facet CommerceEntity), and my facet response from Solr is passed into the translator helper class separately.

     When all of this is pieced together you have successfully completed the extensibility-point coding: you will have created a new OperationSequenceComponent, a custom SearchCriteria class and message builder, and translators to convert the proprietary objects into Commerce Entities. Now you simply need to configure them, and you can start calling them in your code. Make sure you sign your assembly, compile it and identify its signature. Next, put a reference to your new assembly into the Channel.Config configuration file, replacing that of the existing SQL Full Text component. You will also need to add your translators to the Translators node of your Channel.Config, and lastly add any custom CommerceEntities you have developed to your MetadataDefinitions.xml file.

     Your configuration is now complete, and you should be able to happily make a call to the Commerce Foundation API, which will act as a proxy to your third-party search platform and return CommerceEntities of your search results. If you require data to be enriched, logged, or any other logic applied, simply add further sequence components into the OperationSequence node of your Channel.Config file (obviously keeping the search response first). Now, to call your code you simply request it as per any other CommerceQuery operation, but taking into account that you may receive multiple types of CommerceEntity:

         public KeyValuePair<FacetCollection, List<Product>> DoFacetedProductQuerySearch(
             string searchPhrase, string orderKey, string sortOrder, int recordIndex,
             int recordsPerPage, Dictionary<string, string> facetQueries, out int totalItemCount)
         {
             var products = new List<Product>();
             var query = new CommerceQuery<CatalogEntity, CommerceCatalogSolrSearchBuilder>();

             query.SearchCriteria.PageIndex = recordIndex;
             query.SearchCriteria.PageSize = recordsPerPage;
             query.SearchCriteria.SearchPhrase = searchPhrase;
             query.SearchCriteria.FacetQueries = facetQueries;

             totalItemCount = 0;
             CommerceResponse response = SiteContext.ProcessRequest(query.ToRequest());
             var queryResponse = response.OperationResponses[0] as CommerceQueryOperationResponse;
             // No results: return the empty pair.
             if (queryResponse != null && queryResponse.CommerceEntities.Count == 0)
                 return new KeyValuePair<FacetCollection, List<Product>>();

             totalItemCount = (int)queryResponse.TotalItemCount;

             // Prepare a multi-operation to retrieve the product variants.
             var multiOperation = new CommerceMultiOperation();

             // Add products to results.
             foreach (Product product in queryResponse.CommerceEntities.Where(x => x.ModelName == "Product"))
             {
                 var productQuery = new CommerceQuery<Product>(Product.ModelNameDefinition);
                 productQuery.SearchCriteria.Model.Id = product.Id;
                 productQuery.SearchCriteria.Model.CatalogId = product.CatalogId;

                 var variantQuery = new CommerceQueryRelatedItem<Variant>(Product.RelationshipName.Variants);
                 productQuery.RelatedOperations.Add(variantQuery);

                 multiOperation.Add(productQuery);
             }

             CommerceResponse variantsResponse = SiteContext.ProcessRequest(multiOperation.ToRequest());
             foreach (CommerceQueryOperationResponse queryOpResponse in variantsResponse.OperationResponses)
             {
                 if (queryOpResponse.CommerceEntities.Count() > 0)
                     products.Add(queryOpResponse.CommerceEntities[0]);
             }

             // Get the facet collection.
             FacetCollection facetCollection = queryResponse.CommerceEntities
                 .Where(x => x.ModelName == "FacetCollection").FirstOrDefault();

             return new KeyValuePair<FacetCollection, List<Product>>(facetCollection, products);
         }

     ...and that is it - simply a few classes and some configuration allow you to extend the Commerce Server query operations to call a third-party search platform, whilst still maintaining a unified API in the remainder of your code. The same approach stands for any extensibility within Commerce Server which requires execution in a serial fashion, such as calls to LOB systems or web services to validate or enrich data. Feel free to use this example in other applications, and if you have any questions please feel free to e-mail and I'll help out where I can!
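
     Calling the helper from site code would then look something like this (the search phrase and facet key are illustrative):

         int totalItemCount;
         var facetQueries = new Dictionary<string, string> { { "Brand", "Dell" } };

         var result = DoFacetedProductQuerySearch("server", null, null, 1, 20,
                                                  facetQueries, out totalItemCount);

         FacetCollection facets = result.Key;
         List<Product> products = result.Value;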

  • Simple MSBuild Configuration: Updating Assemblies With A Version Number

    - by srkirkland
     When distributing a library you often run up against versioning problems, one facet of which is simply determining which version of that library your client is running. Of course, each project in your solution has an AssemblyInfo.cs file which provides, among other things, the ability to set the assembly name and version number. Unfortunately, setting the assembly version here would require not only changing the version manually for each build (depending on your schedule), but keeping it in sync across all projects. There are many ways to solve this versioning problem, and in this blog post I'm going to try to explain what I think is the easiest and most flexible solution. I will walk you through using MSBuild to create a simple build script, and I'll even show how to (optionally) integrate with a TeamCity build server. All of the code from this post can be found at https://github.com/srkirkland/BuildVersion.

     Create CommonAssemblyInfo.cs

     The first step is to create a common location for the repeated assembly info that is spread across all of your projects. Create a new solution-level file (I usually create a Build/ folder in the solution root, but anywhere reachable by all your projects will do) called CommonAssemblyInfo.cs. In here you can put any information common to all your assemblies, including the version number. An example CommonAssemblyInfo.cs is as follows:

         using System.Reflection;
         using System.Resources;
         using System.Runtime.InteropServices;

         [assembly: AssemblyCompany("University of California, Davis")]
         [assembly: AssemblyProduct("BuildVersionTest")]
         [assembly: AssemblyCopyright("Scott Kirkland & UC Regents")]
         [assembly: AssemblyConfiguration("")]
         [assembly: AssemblyTrademark("")]

         [assembly: ComVisible(false)]

         [assembly: AssemblyVersion("1.2.3.4")] // Will be replaced

         [assembly: NeutralResourcesLanguage("en-US")]

     Cleanup AssemblyInfo.cs & Link CommonAssemblyInfo.cs

     For each of your projects, you'll want to clean up your assembly info to contain only information that is unique to that assembly - everything else will go in the CommonAssemblyInfo.cs file. For most of my projects, that just means setting the AssemblyTitle, though you may feel AssemblyDescription is warranted.
     An example AssemblyInfo.cs file is as follows:

         using System.Reflection;

         [assembly: AssemblyTitle("BuildVersionTest")]

     Next, you need to "link" the CommonAssemblyInfo.cs file into your projects right beside your newly lean AssemblyInfo.cs file. To do this, right click on your project and choose Add | Existing Item from the context menu. Navigate to your CommonAssemblyInfo.cs file, but instead of clicking Add, click the little down-arrow next to Add and choose "Add as Link." You should see a little link graphic similar to this:

     We've actually reduced complexity a lot already, because if you build, all of your assemblies will have the same common info, including the product name and our static (fake) assembly version. Let's take this one step further and introduce a build script.

     Create an MSBuild file

     What we want from the build script (for now) is basically just to have the common assembly version number changed via a parameter (eventually to be passed in by the build server) and then for the project to build. We'd also like the flexibility to define which build configuration to use (Debug, Release, etc.).

     In order to find/replace the version number, we are going to use a regular expression to find and replace the text within your CommonAssemblyInfo.cs file. There are many other ways to do this using community build-task add-ins, but since we want to keep it simple, let's just define the regular-expression task manually in a new file, Build.tasks (this example is taken from the NuGet build.tasks file).
         <?xml version="1.0" encoding="utf-8"?>
         <Project ToolsVersion="4.0" DefaultTargets="Go" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
           <UsingTask TaskName="RegexTransform" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
             <ParameterGroup>
               <Items ParameterType="Microsoft.Build.Framework.ITaskItem[]" />
             </ParameterGroup>
             <Task>
               <Using Namespace="System.IO" />
               <Using Namespace="System.Text.RegularExpressions" />
               <Using Namespace="Microsoft.Build.Framework" />
               <Code Type="Fragment" Language="cs">
                 <![CDATA[
                 foreach(ITaskItem item in Items)
                 {
                     string fileName = item.GetMetadata("FullPath");
                     string find = item.GetMetadata("Find");
                     string replaceWith = item.GetMetadata("ReplaceWith");

                     if(!File.Exists(fileName))
                     {
                         Log.LogError(null, null, null, null, 0, 0, 0, 0,
                             String.Format("Could not find version file: {0}", fileName), new object[0]);
                     }

                     string content = File.ReadAllText(fileName);
                     File.WriteAllText(fileName, Regex.Replace(content, find, replaceWith));
                 }
                 ]]>
               </Code>
             </Task>
           </UsingTask>
         </Project>

     If you glance at the code, you'll see it's really just doing a Regex.Replace() on a given file, which is exactly what we need. Now we are ready to write our build file, called (by convention) Build.proj:

         <?xml version="1.0" encoding="utf-8"?>
         <Project ToolsVersion="4.0" DefaultTargets="Go" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
           <Import Project="$(MSBuildProjectDirectory)\Build.tasks" />

           <PropertyGroup>
             <Configuration Condition="'$(Configuration)' == ''">Debug</Configuration>
             <SolutionRoot>$(MSBuildProjectDirectory)</SolutionRoot>
           </PropertyGroup>

           <ItemGroup>
             <RegexTransform Include="$(SolutionRoot)\CommonAssemblyInfo.cs">
               <Find>(?&lt;major&gt;\d+)\.(?&lt;minor&gt;\d+)\.\d+\.(?&lt;revision&gt;\d+)</Find>
               <ReplaceWith>$(BUILD_NUMBER)</ReplaceWith>
             </RegexTransform>
           </ItemGroup>

           <Target Name="Go" DependsOnTargets="UpdateAssemblyVersion; Build">
           </Target>

           <Target Name="UpdateAssemblyVersion" Condition="'$(BUILD_NUMBER)' != ''">
             <RegexTransform Items="@(RegexTransform)" />
           </Target>

           <Target Name="Build">
             <MSBuild Projects="$(SolutionRoot)\BuildVersionTest.sln" Targets="Build" />
           </Target>
         </Project>

     Reviewing this MSBuild file, we see that by default the "Go" target will be called, which in turn depends on "UpdateAssemblyVersion" and then "Build." We import the Build.tasks file and then set up some handy properties for the build configuration and solution root (in this case, my build files are in the solution root, but we might want to create a Build/ directory later). The rest of the file flows logically: we set up the RegexTransform to match version numbers such as <major>.<minor>.1.<revision> (1.2.3.4 in our example) and replace them with a $(BUILD_NUMBER) parameter which will be supplied externally.
The first target, “UpdateAssemblyVersion” just runs the RegexTransform, and the second target, “Build” just runs the default MSBuild on our solution. Testing the MSBuild file locally Now we have a build file which can replace assembly version numbers and build, so let’s setup a quick batch file to be able to build locally.  To do this you simply create a file called Build.cmd and have it call MSBuild on your Build.proj file.  I’ve added a bit more flexibility so you can specify build configuration and version number, which makes your Build.cmd look as follows: set config=%1 if "%config%" == "" ( set config=debug ) set version=%2 if "%version%" == "" ( set version=2.3.4.5 ) %WINDIR%\Microsoft.NET\Framework\v4.0.30319\msbuild Build.proj /p:Configuration="%config%" /p:build_number="%version%" .csharpcode, .csharpcode pre { font-size: small; color: black; font-family: consolas, "Courier New", courier, monospace; background-color: #ffffff; /*white-space: pre;*/ } .csharpcode pre { margin: 0em; } .csharpcode .rem { color: #008000; } .csharpcode .kwrd { color: #0000ff; } .csharpcode .str { color: #006080; } .csharpcode .op { color: #0000c0; } .csharpcode .preproc { color: #cc6633; } .csharpcode .asp { background-color: #ffff00; } .csharpcode .html { color: #800000; } .csharpcode .attr { color: #ff0000; } .csharpcode .alt { background-color: #f4f4f4; width: 100%; margin: 0em; } .csharpcode .lnum { color: #606060; } Now if you click on the Build.cmd file, you will get a default debug build using the version 2.3.4.5.  Let’s run it in a command window with the parameters set for a release build version 2.0.1.453.   Excellent!  We can now run one simple command and govern the build configuration and version number of our entire solution.  Each DLL produced will have the same version number, making determining which version of a library you are running very simple and accurate. Configure the build server (TeamCity) Of course you are not really going to want to run a build command manually every time, and typing in incrementing version numbers will also not be ideal.  A good solution is to have a computer (or set of computers) act as a build server and build your code for you, providing you a consistent environment, excellent reporting, and much more.  One of the most popular Build Servers is JetBrains’ TeamCity, and this last section will show you the few configuration parameters to use when setting up a build using your MSBuild file created earlier.  If you are using a different build server, the same principals should apply. First, when setting up the project you want to specify the “Build Number Format,” often given in the form <major>.<minor>.<revision>.<build>.  In this case you will set major/minor manually, and optionally revision (or you can use your VCS revision number with %build.vcs.number%), and then build using the {0} wildcard.  Thus your build number format might look like this: 2.0.1.{0}.  During each build, this value will be created and passed into the $BUILD_NUMBER variable of our Build.proj file, which then uses it to decorate your assemblies with the proper version. After setting up the build number, you must choose MSBuild as the Build Runner, then provide a path to your build file (Build.proj).  After specifying your MSBuild Version (equivalent to your .NET Framework Version), you have the option to specify targets (the default being “Go”) and additional MSBuild parameters.  
     The one parameter that is often useful is manually setting the configuration property (/p:Configuration="Release") if you want something other than the default (which is Debug in our example). Your resulting configuration will look something like this:

     [Under General Settings]
     [Build Runner Settings]

     Now every time your build is run, a newly incremented build version number will be generated and passed to MSBuild, which will then version your assemblies and build your solution.

     A Quick Review

     Our goal was to version our output assemblies in an automated way, and we accomplished it by performing a few quick steps:

     · Move the common assembly information, including version, into a linked CommonAssemblyInfo.cs file
     · Create a simple MSBuild script to replace the common assembly version number and build your solution
     · Direct your build server to use the created MSBuild script

     That's really all there is to it. You can find all of the code from this post at https://github.com/srkirkland/BuildVersion. Enjoy!
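
     As a quick sanity check on the result, an assembly can report the version that was stamped into it at runtime; a minimal sketch:

         using System;
         using System.Reflection;

         class VersionCheck
         {
             static void Main()
             {
                 // Prints the AssemblyVersion stamped into CommonAssemblyInfo.cs,
                 // e.g. "2.0.1.453" after the release build above.
                 Version version = Assembly.GetExecutingAssembly().GetName().Version;
                 Console.WriteLine(version);
             }
         }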

  • ASPNET WebAPI REST Guidance

    - by JoshReuben
     ASP.NET Web API is an ideal platform for building RESTful applications on the .NET Framework. While I may be more partial to NodeJS these days, there is no denying that Web API is a well engineered framework. What follows is my investigation of how to leverage Web API to construct a RESTful frontend API.

     The Advantages of REST Methodology over SOAP

     · Simpler API for CRUD ops
     · Standardized development methodology - consistent and intuitive
     · Standards-based → client interop
     · Wide industry adoption and ease of use → easy to add new devs
     · Avoids service method signature blowout
     · Smaller payloads than SOAP
     · Stateless → no session data means multi-tenant scalability
     · Cache-ability
     · Testability

     General RESTful API Design Overview

     · Utilize the HTTP protocol - usage of HTTP methods for CRUD, standard HTTP response codes, common HTTP headers and MIME types.
     · Resources are mapped to URLs, actions are mapped to verbs, and the rest goes in the headers.
     · Keep the API semantic and resource-centric - a RESTful, resource-oriented service exposes a URI for every piece of data the client might want to operate on. A REST-RPC hybrid exposes a URI for every operation the client might perform: one URI to fetch a piece of data, a different URI to delete that same data.
     · Utilize the URI to specify CRUD op, version, language, and output format: http://api.MyApp.com/{ver}/{lang}/{resource_type}/{resource_id}.{output_format}?{key&filters}
     · Entity CRUD operations are matched to HTTP methods:
       Create - POST / PUT
       Read - GET (cacheable)
       Update - PUT
       Delete - DELETE
     · Use URIs to represent hierarchies - resources in RESTful URLs are often chained.
     · Statelessness allows for idempotency - applying an op multiple times doesn't change the result. POST is non-idempotent; the rest are idempotent (if DELETE flags records instead of deleting them).
     · Cache indication - leverage HTTP headers to label cacheable content and indicate the permitted duration of the cache.
     · PUT vs POST - the client uses PUT when it determines which URI (id key) the new resource should have; it uses POST when the server determines the key. PUT takes a second param - the id. POST creates a new resource: the server assigns the URI for the new object and returns it as part of the response message. Note that PUT replaces the entire entity - the client is expected to send a complete representation of the updated resource. If you want to support partial updates, the PATCH method is preferred. DELETE deletes a resource at a specified URI and typically takes an id param.
     · Leverage common HTTP response codes in response headers:
       200 OK: success
       201 Created: used on POST requests when creating a new resource
       304 Not Modified: no new data to return
       400 Bad Request: invalid request
       401 Unauthorized: authentication
       403 Forbidden: authorization
       404 Not Found: entity does not exist
       406 Not Acceptable: bad params
       409 Conflict: for POST / PUT requests if the resource already exists
       500 Internal Server Error
       503 Service Unavailable
     · Leverage uncommon HTTP verbs to reduce payload sizes:
       HEAD - retrieves just the resource meta-information
       OPTIONS - returns the actions supported for the specified resource
       PATCH - partial modification of a resource
     · When using PUT, POST or PATCH, send the data as a document in the body of the request. Don't use query parameters to alter state.
     · Utilize headers for content negotiation, caching, authorization, and throttling:
       o Content negotiation - choose representation (e.g. JSON or XML and version), language, and compression. Signal via the request's Accept headers and the response's Content-Type header:
             Accept: application/json;version=1.0
             Accept-Language: en-US
             Accept-Charset: UTF-8
             Accept-Encoding: gzip
       o Caching - response headers: Expires (absolute expiry time) or Cache-Control (relative expiry time).
       o Authorization - basic HTTP authentication uses the Authorization request header to specify a base64-encoded "username:password" string. It can be used in combination with SSL/TLS (HTTPS), and can leverage OAuth2 third-party token-claims authorization.
             Authorization: Basic sQJlaTp5ZWFslylnaNZ=
       o Rate limiting - not currently part of HTTP, so specify non-standard headers prefixed with X- in the response:
             X-RateLimit-Limit: 10000
             X-RateLimit-Remaining: 9990
     · HATEOAS methodology - Hypermedia As The Engine Of Application State: leverage the API as a state machine where resources are states, and the transitions between states are links between resources included in their representation (hypermedia) - get API metadata signatures from the response Link header. In a truly REST-based architecture any URL, except the initial URL, can be changed, even to other servers, without worrying about the client.
     · Error responses - do not just send back a 200 OK with every response. A response should consist of an HTTP error status code (jQuery has automated support for this), a human-readable message, a link to a meaningful state transition, and the original data payload that was problematic.
     · URIs will typically map to a server-side controller and a method name specified by the type of request method. Stuffing all your calls into just four methods is not as crazy as it sounds.
     · Scoping - path variables look like you're traversing a hierarchy; query variables look like you're passing arguments into an algorithm.
     · Mapping URIs to controllers - having one controller for each resource is not a rule; you can consolidate, routing requests to the appropriate controller and action method.
     · Keep URIs consistent - sometimes it's tempting to just shorten URIs; I do not recommend this, as it can cause confusion.
     · Join naming - for many-to-many entity relations there may be multiple hierarchy traversal paths.
     · Routing - a useful level of indirection for versioning and for mocking the server backend during development.

     ASP.NET Web API Considerations

     ASP.NET Web API implements a lot (but not all) of these RESTful design considerations as part of its infrastructure and via its coding conventions.

     Overview

     When developing an API there are basically three main steps:
     1. Plan out your URIs.
     2. Set up return values and response codes for your URIs.
     3. Implement a framework for your API.

     Design

     · Leverage the MVC Models folder.
     · Repositories - support IoC for tests and abstraction.
     · Create DTO classes - a level of indirection decouples and allows swap-out.
     · Self links can be generated using the UrlHelper.
     · Use IQueryable to support projections across the wire.
     · Models can support RESTful navigation properties - ICollection<T>.
     · Use an async mechanism for long-running ops - return a response with a ticket; the client can then poll, or be pushed the final result later.
     · Design for testability - test using HttpClient, jQuery ($.getJSON, $.each), Fiddler, and the browser debugger. Leverage IDependencyResolver - an IoC wrapper for mocking.
     · Easy debugging - IE F12 developer tools: Network tab, Request Headers tab.

     Routing

     · The HTTP request method is matched to the method name. (This rule applies only to GET, POST, PUT, and DELETE requests.)
     · {id}, if present, is matched to a method parameter named id.
     · Query parameters are matched to parameter names when possible.
     · Routing is configured via Routes.MapHttpRoute - similar to MVC routing.
     · Alternatively you can:
       o decorate controller action methods with HttpDelete, HttpGet, HttpHead, HttpOptions, HttpPatch, HttpPost, or HttpPut, plus the ActionAttribute;
       o use AcceptVerbsAttribute to support other HTTP verbs, e.g. PATCH, HEAD;
       o use NonActionAttribute to prevent a method from getting invoked as an action.
     · Route-table URIs can support placeholders (via curly braces {}) - these can have default values, constraints, and optional values.
     · The framework selects the first route in the route table that matches the URI.

     Response customization

     · Response code: by default, the Web API framework sets the response status code to 200 (OK). But according to the HTTP/1.1 protocol, when a POST request results in the creation of a resource, the server should reply with status 201 (Created). Non-GET methods should return HttpResponseMessage.
     · Location: when the server creates a resource, it should include the URI of the new resource in the Location header of the response.

         public HttpResponseMessage PostProduct(Product item)
         {
             item = repository.Add(item);
             var response = Request.CreateResponse<Product>(HttpStatusCode.Created, item);
             string uri = Url.Link("DefaultApi", new { id = item.Id });
             response.Headers.Location = new Uri(uri);
             return response;
         }

     Validation

     · Decorate models / DTOs with System.ComponentModel.DataAnnotations attributes such as RequiredAttribute and RangeAttribute.
     · Check payloads using ModelState.IsValid.
     · Under-posting - values left out of the JSON payload are assigned a default value by the JSON formatter; use this with RequiredAttribute.
     · Over-posting - if the model has read-only properties, use a DTO instead of the model.
     · You can hook into the pipeline by deriving from ActionFilterAttribute and overriding OnActionExecuting.

     Config

     · Done in the App_Start folder > WebApiConfig.cs, in the static Register method taking an HttpConfiguration param. The HttpConfiguration object contains the following members:
       DependencyResolver - enables dependency injection for controllers.
       Filters - action filters, e.g. exception filters.
       Formatters - media-type formatters; by default contains JsonFormatter and XmlFormatter.
       IncludeErrorDetailPolicy - specifies whether the server should include error details, such as exception messages and stack traces, in HTTP response messages.
       Initializer - a function that performs final initialization of the HttpConfiguration.
       MessageHandlers - HTTP message handlers that plug into the pipeline.
       ParameterBindingRules - a collection of rules for binding parameters on controller actions.
       Properties - a generic property bag.
       Routes - the collection of routes.
       Services - the collection of services.
     · Configure the JsonFormatter for circular references to support links: PreserveReferencesHandling.Objects.

     Documentation generation

     · Create a help page for a web API by using the ApiExplorer class.
     · The ApiExplorer class provides descriptive information about the APIs exposed by a web API as an ApiDescription collection.
     · Create the help page as an MVC view:

         public ILookup<string, ApiDescription> GetApis()
         {
             return _explorer.ApiDescriptions.ToLookup(
                 api => api.ActionDescriptor.ControllerDescriptor.ControllerName);
         }

     · Provide documentation for your APIs by implementing the IDocumentationProvider interface. Documentation strings can come from any source that you like - e.g. extract XML comments, or define custom attributes to apply to the controller:

         [ApiDoc("Gets a product by ID.")]
         [ApiParameterDoc("id", "The ID of the product.")]
         public HttpResponseMessage Get(int id)

     · Add the documentation provider via GlobalConfiguration.Configuration.Services.
     · To hide an API from the ApiExplorer, add the ApiExplorerSettingsAttribute.

     Plugging into the Message Handler pipeline

     · Plug into the request / response pipeline by deriving from DelegatingHandler and overriding the SendAsync method - e.g. for logging error codes or adding a custom response header.
     · Handlers can be applied globally or to a specific route.

     Exception Handling

     · Throw HttpResponseException on method failures, specifying an HttpStatusCode enum value - examine this enum, as its values map well to typical op problems.
     · Exception filters - derive from ExceptionFilterAttribute and override OnException. Apply on a controller or its action methods, or add to the global HttpConfiguration.Filters collection.
     · The HttpError object provides a consistent way to return error information in the HttpResponseException response body.
     · For model validation, you can pass the model state to CreateErrorResponse to include the validation errors in the response:

         public HttpResponseMessage PostProduct(Product item)
         {
             if (!ModelState.IsValid)
             {
                 return Request.CreateErrorResponse(HttpStatusCode.BadRequest, ModelState);

     Cookie Management

     · The Cookie header in a request and the Set-Cookie headers in a response map to collections of CookieState objects.
     · Specify Expires and max-age respectively:

         resp.Headers.AddCookies(new CookieHeaderValue[] { cookie });

     Internet Media Types, formatters and serialization

     · Defaults to application/json.
     · The request Accept header and response Content-Type header determine how Web API serializes and deserializes the HTTP message body. There is built-in support for XML, JSON, and form-urlencoded data.
     · Customizable formatters can be inserted into the pipeline.
     · POCO serialization is opt-out via JsonIgnoreAttribute, or use DataMemberAttribute for opt-in.
     · The JSON serializer leverages Newtonsoft Json.NET.
     · Loosely structured JSON objects are serialized as JObject, which derives from Dynamic.
     · To handle circular references in JSON: json.SerializerSettings.PreserveReferencesHandling = PreserveReferencesHandling.All → {"$ref":"1"}.
     · To preserve object references in XML: [DataContract(IsReference=true)].
     · Content negotiation headers:
       Accept: which media types are acceptable for the response, such as "application/json", "application/xml", or a custom media type such as "application/vnd.example+xml"
       Accept-Charset: which character sets are acceptable, such as UTF-8 or ISO 8859-1
       Accept-Encoding: which content encodings are acceptable, such as gzip
       Accept-Language: the preferred natural language, such as "en-us"
       Web API uses the Accept and Accept-Charset headers. (At this time, there is no built-in support for Accept-Encoding or Accept-Language.)
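
     If the built-in JSON and XML formatters aren't enough, a custom formatter plugs into the same content-negotiation pipeline. A minimal sketch, assuming a string[] payload purely for illustration:

         using System;
         using System.IO;
         using System.Net.Http;
         using System.Net.Http.Formatting;
         using System.Net.Http.Headers;

         // Serializes string[] responses as text/csv when the client asks for it.
         public class CsvFormatter : BufferedMediaTypeFormatter
         {
             public CsvFormatter()
             {
                 SupportedMediaTypes.Add(new MediaTypeHeaderValue("text/csv"));
             }

             public override bool CanReadType(Type type) { return false; }
             public override bool CanWriteType(Type type) { return type == typeof(string[]); }

             public override void WriteToStream(Type type, object value,
                                                Stream writeStream, HttpContent content)
             {
                 using (var writer = new StreamWriter(writeStream))
                 {
                     writer.WriteLine(string.Join(",", (string[])value));
                 }
             }
         }

         // Registration in WebApiConfig.Register:
         // config.Formatters.Add(new CsvFormatter());
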
     · Controller methods can take JSON representations of DTOs as params - auto-deserialization.
     · A typical jQuery GET request:

         function find() {
             var id = $('#prodId').val();
             $.getJSON("api/products/" + id,
                 function (data) {
                     var str = data.Name + ': $' + data.Price;
                     $('#product').text(str);
                 })
             .fail(
                 function (jqXHR, textStatus, err) {
                     $('#product').text('Error: ' + err);
                 });
         }

     · A typical GET response:

         HTTP/1.1 200 OK
         Server: ASP.NET Development Server/10.0.0.0
         Date: Mon, 18 Jun 2012 04:30:33 GMT
         X-AspNet-Version: 4.0.30319
         Cache-Control: no-cache
         Pragma: no-cache
         Expires: -1
         Content-Type: application/json; charset=utf-8
         Content-Length: 175
         Connection: Close

         [{"Id":1,"Name":"TomatoSoup","Price":1.39,"ActualCost":0.99},{"Id":2,"Name":"Hammer","Price":16.99,"ActualCost":10.00},{"Id":3,"Name":"Yo yo","Price":6.99,"ActualCost":2.05}]

     OData support

     · Leverage the query options $filter, $orderby, $top and $skip to shape the results of controller actions annotated with the [Queryable] attribute:

         [Queryable]
         public IQueryable<Supplier> GetSuppliers()

     · Query: ~/Suppliers?$filter=Name eq 'Microsoft'
     · Applies the following selection filter on the server: GetSuppliers().Where(s => s.Name == "Microsoft")
     · Will pass the result to the formatter.
     · True support for the OData format is still limited - no support for creates, updates, deletes, $metadata, code generation, etc.
     · vNext: the ability to configure how EditLinks, SelfLinks and Ids are generated.

     Self Hosting

     There is no dependency on ASP.NET or IIS:

         using (var server = new HttpSelfHostServer(config))
         {
             server.OpenAsync().Wait();
         }

     Tracing

     · Traceability tools and metrics - e.g. send to Nagios.
     · Use your choice of tracing/logging library, whether that is ETW, NLog, log4net, or simply System.Diagnostics.Trace.
     · To collect traces, implement the ITraceWriter interface:

         public class SimpleTracer : ITraceWriter
         {
             public void Trace(HttpRequestMessage request, string category, TraceLevel level,
                 Action<TraceRecord> traceAction)
             {
                 TraceRecord rec = new TraceRecord(request, category, level);
                 traceAction(rec);
                 WriteTrace(rec);
             }
         }

     · Register the service with config.
     · You can also trace programmatically via helper extension methods: Configuration.Services.GetTraceWriter().Info(
     · Performance tracing - the pipeline writes traces at the beginning and end of an operation; the TraceRecord class includes a TimeStamp property and a Kind property set to TraceKind.Begin / End.

     Security

     · Roles class methods: RoleExists, AddUserToRole.
     · WebSecurity class methods: UserExists, CreateUserAndAccount.
     · Request.IsAuthenticated.
     · Leverage the HTTP 401 (Unauthorized) response.
     · [AuthorizeAttribute(Roles="Administrator")] - can be applied to a controller or its action methods.
     · See the section in the Web API documentation on "Claim-based security for ASP.NET Web APIs using DotNetOpenAuth", and adapt it to STS: the Web API host exposes secured Web APIs which can only be accessed by presenting a valid token issued by the trusted issuer. http://zamd.net/2012/05/04/claim-based-security-for-asp-net-web-apis-using-dotnetopenauth/
     · Use the MVC membership provider infrastructure and add a DelegatingHandler child class to the Web API pipeline to perform the login actions: http://stackoverflow.com/questions/11535075/asp-net-mvc-4-web-api-authentication-with-membership-provider
     · Then use AuthorizeAttribute on controllers and methods for role mapping: http://sixgun.wordpress.com/2012/02/29/asp-net-web-api-basic-authentication/
     · An alternate option is to rely on an MVC app: http://forums.asp.net/t/1831767.aspx/1
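
     Pulling the conventions above together, a minimal convention-routed controller might look like the following sketch (Product and IProductRepository are assumed types, not part of the framework):

         using System;
         using System.Collections.Generic;
         using System.Net;
         using System.Net.Http;
         using System.Web.Http;

         public class ProductsController : ApiController
         {
             private readonly IProductRepository repository; // assumed abstraction

             public ProductsController(IProductRepository repository)
             {
                 this.repository = repository;
             }

             // GET api/products - the method-name prefix maps to the HTTP verb.
             public IEnumerable<Product> GetAllProducts()
             {
                 return repository.GetAll();
             }

             // GET api/products/5 - {id} binds to the 'id' parameter.
             public Product GetProduct(int id)
             {
                 var item = repository.Find(id);
                 if (item == null)
                     throw new HttpResponseException(HttpStatusCode.NotFound);
                 return item;
             }

             // POST api/products - replies 201 with a Location header, per the guidance above.
             public HttpResponseMessage PostProduct(Product item)
             {
                 item = repository.Add(item);
                 var response = Request.CreateResponse(HttpStatusCode.Created, item);
                 response.Headers.Location = new Uri(Url.Link("DefaultApi", new { id = item.Id }));
                 return response;
             }
         }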

  • Creating Item Templates as Visual Studio 2010 Extensions

    - by maziar
     Technorati Tags: Visual Studio 2010 Extension, T4 Template, VSIX, Item Template Wizard

     This blog post briefly introduces creating an item template as a Visual Studio 2010 extension.

     Problem specification

     Assume you are writing a framework for data-oriented applications, and you decide to keep all of your application messages in a SQL Server database table. After creating the table, you create a class in your framework for getting messages by a string key:

         var message = FrameworkMessages.Get("ChangesSavedSuccess");

     Everyone would say this code is error prone, because message keys are not strongly typed, which might create errors in the application that are not caught in tests. So we think of a way to make it strongly typed, i.e. create a class to use it like this:

         var message = Messages.ChangesSavedSuccess;

     In the Messages class the code looks like this:

         public string ChangesSavedSuccess
         {
             get { return FrameworkMessages.Get("ChangesSavedSuccess"); }
         }

     And clearly, we are not going to create the Messages class manually; we need a class generator for it.

     Again, assume that the application(s) that intend to use our framework contain multiple sub-systems, so each sub-system needs its own strongly typed message class that calls FrameworkMessages.Get. We would therefore like to make our code generator an item template, so that each developer can easily add the item to his project with no further work necessary.

     Solution

     We create a T4 text template to generate our strongly typed class from the database, then create a Visual Studio item template with this generator and publish it.

     What Are T4 Templates

     You might already be familiar with T4 templates; if so, you can skip this section. A T4 text template is a Visual Studio file type (.tt) that generates output text. The file is a mixture of text blocks and code logic (in C# or VB). For example, you can generate HTML files, C# classes, resource files and so on with a T4 template.

     Syntax highlighting

     In Visual Studio 2010 a T4 template will not be syntax highlighted by default, and no auto-complete is supported for it. But there is a fine Visual Studio extension named 'Visual T4' which can be downloaded free from the Visual Studio Gallery. This tool offers IntelliSense, syntax coloring, validation, transformation preview and more for T4 templates.

     How Item Templates work in Visual Studio

     Visual Studio extensions allow us to add functionality to Visual Studio. In our case, we need to create a .vsix file that adds a template to the Visual Studio item templates. Item templates are zip files containing the template file and a metadata file with the .vstemplate extension; this .vstemplate file is an XML file that provides some information about the template. A .vsix file is also a zip file (renamed to .vsix) that is opened with the Visual Studio Extension Installer. (Re-installing a .vsix file requires the old one to be uninstalled from VS: Tools > Extension Manager.) Installing a .vsix requires Visual Studio to be closed and re-opened to take effect. The extension installer will find the item template's zip file and copy it to Visual Studio's item templates folder. You can find other Visual Studio templates in [<VS Install Path>\Common7\IDE\ItemTemplates] and you can edit them; but be very careful with the VS default templates.

     How Can I Create a VSIX file

     1. Visual Studio SDK

     Depending on your Visual Studio version, you need to download the Microsoft Visual Studio SDK.
Note that if you have VS 2010 SP1, you will need to download and install the VS 2010 SP1 SDK; the VS 2010 SDK will not install (unless you change the registry value that indicates your service pack number). Here is the link for the VS 2010 SP1 SDK. After downloading, run it and follow the wizard to complete the installation.

2. Create the file you want to make into an item template

Create a project (or use an existing one), add your file, and edit it until it works fine.

Back to our own problem: we need to create a T4 (.tt) template. In your VS project: Add > New Item > General > Text Template. Type a file name, e.g. Messages.tt, and press Add. Create the T4 template file (in this blog I do not intend to cover T4 syntax, so I just write down the code, which should be clear enough to understand):

<#@ template debug="false" hostspecific="true" language="C#" #>
<#@ output extension=".cs" #>
<#@ Assembly Name="System.Data" #>
<#@ Import Namespace="System.Data.SqlClient" #>
<#@ Import Namespace="System.Text" #>
<#@ Import Namespace="System.IO" #>
<#
    var connectionString = "server=Maziar-PC; Database=MyDatabase; Integrated Security=True";
    var systemName = "Sys1";
    var builder = new StringBuilder();
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        var command = connection.CreateCommand();
        command.CommandText = string.Format("SELECT [Key] FROM [Message] WHERE System = '{0}'", systemName);
        var reader = command.ExecuteReader();
        while (reader.Read())
        {
            builder.AppendFormat("        public static string {0} {{ get {{ return FrameworkMessages.Get(\"{0}\"); }} }}\r\n", reader[0]);
        }
    }
#>
namespace <#= System.Runtime.Remoting.Messaging.CallContext.LogicalGetData("NamespaceHint") #>
{
    public static class <#= Path.GetFileNameWithoutExtension(Host.TemplateFile) #>
    {
<#= builder.ToString() #>
    }
}

As you can see, the T4 template connects to a database, reads the message keys and generates a class. Here is the output:

namespace MyProject.MyFolder
{
    public static class Messages
    {
        public static string ChangesSavedSuccess { get { return FrameworkMessages.Get("ChangesSavedSuccess"); } }
        public static string ErrorSavingChanges { get { return FrameworkMessages.Get("ErrorSavingChanges"); } }
    }
}

The output looks fine, but there is one problem: the connectionString and systemName are hard-coded. So how can I create a flexible item template? One of the features of item templates in Visual Studio is that you can create a designer wizard for your item template, so I can collect the connection information and system name there. Now let's go on to creating the VSIX file.

3. Create the template

In Visual Studio click File > Export Template and a wizard will show up. In the first step click Item Template and, in the combo box, select the project that contains Messages.tt. Click Next. Select Messages.tt from the list in the second step. Click Next. In the third step you should choose references; for this template, System and System.Data are needed, so choose them. Click Next. Enter the template name and description; if you like, add a .ico file as the icon and also a preview image. Uncheck the option to automatically import the template into Visual Studio. Copy the output location to the clipboard. Click Finish.

4. Create the VSIX project

In VS, click File > New > Project > Extensibility > VSIX Project. Type a name, e.g. FrameworkMessages, a location, etc. The project will include a .vsixmanifest file.
Fill in fields like Author, Product Name, Description, etc. In the Content section, click Add Content. Choose Item Template as the content type and File as the source. Remember you have the template file's address in the clipboard? Paste it into the File field and click OK.

5. Build the VSIX project

That's it: build the project and go to the output directory. You will see a .vsix file, which you can run now. After restarting VS, if you click on a project > Add > New Item, you will see your item in the list and you can add it. When you add this item to a project, references to System and System.Data are added if the project does not already have them. But the problem mentioned in step 2 shows up now.

6. Create a design wizard for your item template

Create a project, e.g. a Windows Application named 'Framework.Messages.Design', but preferably change its output type to Class Library. Add references to Microsoft.VisualStudio.TemplateWizardInterface and EnvDTE. Add a class named MessageDesigner to the project and implement the IWizard interface in it. This is what you should write:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.VisualStudio.TemplateWizard;
using EnvDTE;

namespace Framework.Messages.Design
{
    class MessageDesigner : IWizard
    {
        private bool CanAddProjectItem;

        public void RunStarted(object automationObject, Dictionary<string, string> replacementsDictionary, WizardRunKind runKind, object[] customParams)
        {
            // Prompt the user for the connection string and system name in a
            // Windows form (ShowDialog) here; try to provide a good interface.
            // If the user clicks Cancel on your form, return here.
            string connectionString = "connection;string"; // set value from the form
            string systemName = "system;name";             // set value from the form
            CanAddProjectItem = true;
            replacementsDictionary.Add("$connectionString$", connectionString);
            replacementsDictionary.Add("$systemName$", systemName);
        }

        public bool ShouldAddProjectItem(string filePath)
        {
            return CanAddProjectItem;
        }

        public void BeforeOpeningFile(ProjectItem projectItem) { }
        public void ProjectFinishedGenerating(Project project) { }
        public void ProjectItemFinishedGenerating(ProjectItem projectItem) { }
        public void RunFinished() { }
    }
}

Before your code runs, replacementsDictionary contains the list of default template parameters; the code above adds two more. Now build this project and copy the output assembly to the [<VS Install Path>\Common7\IDE] folder.

Your designer is ready.

The template that you created earlier is already added to your VSIX project. In Windows Explorer open your template zip file (extract it somewhere) and open the .vstemplate file. First of all, remove

<ProjectItem SubType="Code" TargetFileName="$fileinputname$.cs" ReplaceParameters="true">Messages.cs</ProjectItem>

because the .cs file is not intended to be part of the template — it will be generated. Change the value of ReplaceParameters for your .tt file to true to enable parameter replacement in that file. Now, right after the </TemplateContent> end element, write this:

<WizardExtension>
  <Assembly>Framework.Messages.Design</Assembly>
  <FullClassName>Framework.Messages.Design.MessageDesigner</FullClassName>
</WizardExtension>

One other thing you should do is edit your .tt file and remove your .cs file.
Lines 8 and 9 of your .tt file should be:

    var connectionString = "$connectionString$";
    var systemName = "$systemName$";

These parameters will be replaced when the item is added to a project. Save the contents to a zip file with the same file name and replace the original file.

Now build your VSIX project again, uninstall your old extension, close VS and run the new .vsix file. Open VS and add an item of type Messages to your project — bingo, your wizard form shows up. Fill in the fields and click OK; the values are replaced in the .tt file that gets added.

That's it. I tried hard to keep this post brief; I hope it was not too long.

Cheers
Maziar
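The post leaves the wizard's input form as an exercise ("Prompt the user ... in a Windows form"). Below is a minimal sketch of what such a dialog could look like — the class name ConnectionForm and all of its members are my own illustrative choices, not part of the original framework:

using System.Windows.Forms;

namespace Framework.Messages.Design
{
    // Hypothetical input dialog for the wizard; every name here is illustrative.
    internal class ConnectionForm : Form
    {
        private readonly TextBox connectionBox = new TextBox();
        private readonly TextBox systemBox = new TextBox();

        public ConnectionForm()
        {
            Text = "Messages Designer";
            // Crude top-to-bottom layout; a real dialog would add labels and spacing.
            var ok = new Button { Text = "OK", DialogResult = DialogResult.OK, Dock = DockStyle.Bottom };
            var cancel = new Button { Text = "Cancel", DialogResult = DialogResult.Cancel, Dock = DockStyle.Bottom };
            connectionBox.Dock = DockStyle.Top;
            systemBox.Dock = DockStyle.Top;
            Controls.AddRange(new Control[] { systemBox, connectionBox, ok, cancel });
            AcceptButton = ok;     // Enter confirms
            CancelButton = cancel; // Esc cancels
        }

        public string ConnectionString { get { return connectionBox.Text; } }
        public string SystemName { get { return systemBox.Text; } }
    }
}

RunStarted would then show this dialog with ShowDialog(), copy ConnectionString and SystemName into replacementsDictionary on OK, and leave CanAddProjectItem false on Cancel so that ShouldAddProjectItem suppresses the item.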

    Read the article

  • OIM 11g : Multi-thread approach for writing custom scheduled job

    - by Saravanan V S
In this post I have shared my experience of designing and developing an OIM schedule job that uses a multi-threaded approach for updating data in OIM using APIs. I have used the thread-pool pattern (in particular a fixed thread pool) in developing the OIM schedule job. The thread-pool pattern has noted advantages compared to the thread-per-task approach; I have listed a few of them here:

· Threads are reused
· Creation and tear-down cost of threads is reduced
· Task execution latency is reduced
· Improved performance
· Controlled and efficient management of memory and resources used by threads

More about Java thread pools: http://docs.oracle.com/javase/tutorial/essential/concurrency/pools.html

The architecture is a schedule job that processes input from a flat file to update OIM process form data using the fixed-thread-pool approach.

The custom schedule job shared in this post was developed to meet the following requirements:

1) Process a CSV extract that contains an identity, an account identifying key and a list of data to be updated on an existing OIM resource account.
2) The CSV file can contain data for multiple resources configured in OIM.
3) The list of attributes to update, and the mapping between CSV columns and OIM fields, may vary between resources.

The following are the three Java classes developed for this requirement (I have given only a prototype of the code, which explains how to use thread pools in a schedule task).

CustomScheduler.java — implementation of the TaskSupport class that reads the parameters configured on the schedule job and passes them to the thread executor class.

package com.oracle.oim.scheduler;

import java.util.HashMap;
import com.oracle.oim.bo.MultiThreadDataRecon;
import oracle.iam.scheduler.vo.TaskSupport;

public class CustomScheduler extends TaskSupport {

    public void execute(HashMap options) throws Exception {
        /* Read schedule job parameters */
        String param1 = (String) options.get("Parameter1");
        ...
        int noOfThreads = (Integer) options.get("No of Threads");
        ...
        String paramN = (String) options.get("ParameterN");

        /* Provide all the required input configured on the schedule job to
           the thread pool executor implementation class, e.g.
           1) Name of the file       2) Delimiter
           3) Header row number      4) Line escape character
           5) Config and resource map lookup
           6) Number of threads to create */
        new MultiThreadDataRecon(all_required_parameters, noOfThreads).reconcile();
    }

    public HashMap getAttributes() { return null; }

    public void setAttributes() { }
}

MultiThreadDataRecon.java — helper class that reads data from the input file, initializes the thread executor and builds the task queue.

package com.oracle.oim.bo;

import <required file IO classes>;
import <required java.util classes>;
import <required OIM API classes>;
import <csv reader api>;

public class MultiThreadDataRecon {

    private int noOfThreads;
    private ExecutorService threadExecutor = null;

    public MultiThreadDataRecon(<required params>, int noOfThreads) {
        // Store parameters locally
        ...
        this.noOfThreads = noOfThreads;
    }

    /**
     * Initialize
     */
    private void init() throws Exception {
        try {
            // Initialize CSV file reader API objects
            // Initialize OIM API objects

            /* Initialize the fixed thread pool executor if the number of
               threads configured is more than 1 */
            if (noOfThreads > 1) {
                threadExecutor = Executors.newFixedThreadPool(noOfThreads);
            } else {
                threadExecutor = Executors.newSingleThreadExecutor();
            }

            /* Initialize the TaskProcessor class, which will execute tasks
               from the queue */
            TaskProcessor.initializeConfig(params);
        } catch (***Exception e) {
            // TO DO
        }
    }

    /**
     * Method to reconcile data from CSV to OIM
     */
    public void reconcile() throws Exception {
        try {
            init();
            while (<csv file has line>) {
                processRow(line);
            }
            /* Initiate thread shutdown */
            threadExecutor.shutdown();
            while (!threadExecutor.isTerminated()) {
                // Wait for all tasks to complete.
            }
        } catch (Exception e) {
            // TO DO
        } finally {
            try {
                // Close all the file handles
            } catch (IOException e) {
                // TO DO
            }
        }
    }

    /**
     * Method to process one row
     */
    private void processRow(String row) {
        // Create a TaskProcessor instance with the row data.
        // The following call pushes the task onto the work queue, where it
        // waits for the next available thread to execute it.
        threadExecutor.execute(new TaskProcessor(rowData));
    }
}

TaskProcessor.java — implementation of the Runnable interface that executes the required business logic to update data in OIM.

package com.oracle.oim.bo;

import <required APIs>;

class TaskProcessor implements Runnable {

    // Required member variables

    /**
     * Constructor
     */
    public TaskProcessor(<row data>) {
        // Initialize and parse the CSV row
    }

    /*
     * Method to initialize required objects for task execution
     */
    public static void initializeConfig(<params>) {
        // Process params and initialize the required configs and objects
    }

    /*
     * (non-Javadoc)
     *
     * @see java.lang.Runnable#run()
     */
    public void run() {
        if (<is csv data valid>) {
            processData();
        }
    }

    /**
     * Process the received CSV input
     */
    private void processData() {
        try {
            // Find the user in OIM using the identity matching key value from the CSV
            // Find the account to update among the user's accounts, based on the account identifying key in the CSV
            // Update the account with data from the CSV
        } catch (***Exception e) {
            // TO DO
        }
    }
}
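To see the pattern in isolation, here is a minimal, self-contained sketch of the same fixed-thread-pool flow; the pool size, row count and printed "work" are illustrative only, not values from the schedule job above:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ThreadPoolSketch {
    public static void main(String[] args) throws InterruptedException {
        // Four reusable worker threads, mirroring threadExecutor above
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 20; i++) {
            final int row = i; // stands in for one parsed CSV row
            pool.execute(new Runnable() { // queued until a worker is free
                public void run() {
                    System.out.println(Thread.currentThread().getName()
                            + " processing row " + row);
                }
            });
        }
        pool.shutdown();                            // stop accepting new tasks
        pool.awaitTermination(1, TimeUnit.MINUTES); // wait for the queue to drain
    }
}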

    Read the article

  • Redmine on Apache2 with Passenger issue

    - by nkr1pt
I installed Redmine and run it under Apache2 with the Passenger module. Apache2 boots, the Passenger module gets loaded and the Redmine welcome page is shown; however, when trying to log in or navigate to other parts of the Redmine site, the browser keeps loading forever, although the Redmine production.log indicates redirects and HTTP 200 codes in the headers, so everything seems to work correctly according to the log. I tested in various browsers. Does anyone have an idea what could be wrong? I will add the Apache configuration and some relevant log snippets from both Apache and Redmine hereafter.

Apache2 Redmine configuration:

DocumentRoot /var/www
<Directory /var/www/redmine>
    RailsEnv production
    AllowOverride all
    RailsBaseURI /redmine
    PassengerResolveSymLinksInDocumentRoot on
</Directory>

Apache2 error log after booting Apache:

[Wed Feb 09 19:59:58 2011] [notice] Apache/2.2.14 (Ubuntu) Phusion_Passenger/3.0.2 DAV/2 SVN/1.6.6 configured -- resuming normal operations

Redmine production log after logging in:

Logfile created on Wed Feb 09 20:01:40 +0100 2011
Processing WelcomeController#index (for 192.168.1.55 at 2011-02-09 20:01:48) [GET]
  Parameters: {"action"=>"index", "controller"=>"welcome"}
Rendering template within layouts/base
Rendering welcome/index
Completed in 220ms (View: 96, DB: 16) | 200 OK [http://sirius/redmine]
Processing AccountController#login (for 192.168.1.55 at 2011-02-09 20:03:17) [GET]
  Parameters: {"action"=>"login", "controller"=>"account"}
Rendering template within layouts/base
Rendering account/login
Completed in 85ms (View: 63, DB: 1) | 200 OK [http://sirius/redmine/login]
Processing AccountController#login (for 192.168.1.55 at 2011-02-09 20:03:20) [POST]
  Parameters: {"back_url"=>"http%3A%2F%2Fsirius%2Fredmine", "action"=>"login", "authenticity_token"=>"cEMUZHhRKJU8w3p6d+xQQhJTk4/pnnzUdg5g5fwhxDU=", "username"=>"admin", "controller"=>"account", "password"=>"[FILTERED]", "login"=>"Login \302\273"}
Redirected to http://sirius/redmine
Completed in 37ms (DB: 6) | 302 Found [http://sirius/redmine/login]
Processing WelcomeController#index (for 192.168.1.55 at 2011-02-09 20:03:20) [GET]
  Parameters: {"action"=>"index", "controller"=>"welcome"}
Rendering template within layouts/base
Rendering welcome/index
Completed in 100ms (View: 77, DB: 6) | 200 OK [http://sirius/redmine]

Apache2 error log afterwards:

[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/mod_instaweb.cc(247)] ModPagespeed OutputFilter called for request /redmine/login
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/mod_instaweb.cc(272)] unparsed=/redmine/login, absolute_url=http://sirius/redmine/login
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: HtmlParse::StartParse
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/mod_instaweb.cc(299)] Request headers:\nHTTP/1.1 0 Internal Server Error\r\nHost: sirius\r\nUser-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.8) Gecko/20100723 Ubuntu/10.04 (lucid) Firefox/3.6.8\r\nAccept: text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8\r\nAccept-Language: en-us,en;q=0.5\r\nAccept-Encoding: gzip,deflate\r\nAccept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\nKeep-Alive: 115\r\nConnection: keep-alive\r\nReferer: http://sirius/redmine\r\nCookie: _redmine_session=BAh7BjoPc2Vzc2lvbl9pZCIlNmVlMzFiMDc4MWQxZDU5ZTI5MTk2NjU0NGY3MzJmYzQ%3D--ea4b7adbc35551051632b5544faaad138ae08d90\r\n\r\n
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/mod_instaweb.cc(302)] request-filename=/var/www/redmine/login, uri=/redmine/login
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/mod_instaweb.cc(319)] ModPagespeed Response headers:\nHTTP/1.1 200 OK\r\nStatus: 200\r\nX-Mod-Pagespeed: 0.9.0.0-128\r\n\r\n
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 2157us: HtmlParse::Flush
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 2272us: HtmlParse::CoalesceAdjacentCharactersNodes
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 2342us: HtmlParse::ApplyFilter:AddHead
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 2407us: HtmlParse::SanityCheck
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 2504us: HtmlParse::ApplyFilter:CssCombine
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/stylesheets/application.css?1296181549
[Wed Feb 09 20:03:17 2011] [warn] [0209/200317:WARNING:net/instaweb/util/google_message_handler.cc(32)] Failed to create or read input resource /redmine/stylesheets/application.css?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/stylesheets/jstoolbar.css?1296181549
[Wed Feb 09 20:03:17 2011] [warn] [0209/200317:WARNING:net/instaweb/util/google_message_handler.cc(32)] Failed to create or read input resource /redmine/stylesheets/jstoolbar.css?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 3642us: HtmlParse::ApplyFilter:CssFilter
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/stylesheets/application.css?1296181549
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] http://sirius/redmine/login:9: Failed to load resource http://sirius/redmine/stylesheets/application.css?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/stylesheets/jstoolbar.css?1296181549
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] http://sirius/redmine/login:17: Failed to load resource http://sirius/redmine/stylesheets/jstoolbar.css?1296181549
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Failed to load resource http://sirius/redmine/stylesheets/jstoolbar.css?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 4863us: HtmlParse::ApplyFilter:Javascript
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:11: Found script with src /redmine/javascripts/prototype.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/prototype.js?1296181549
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/prototype.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:12: Found script with src /redmine/javascripts/effects.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/effects.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/effects.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/effects.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:13: Found script with src /redmine/javascripts/dragdrop.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/dragdrop.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] Creating connection
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:14: Found script with src /redmine/javascripts/controls.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/controls.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/controls.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:15: Found script with src /redmine/javascripts/application.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/application.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] Creating connection
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 8389us: HtmlParse::SanityCheck
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 8588us: HtmlParse::CoalesceAdjacentCharactersNodes
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 8701us: HtmlParse::ApplyFilter:InlineCss
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: 8701us: HtmlParse::ApplyFilter:InlineCss
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/stylesheets/application.css?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/stylesheets/application.css?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 9199us: HtmlParse::ApplyFilter:InlineJs
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/prototype.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/prototype.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/effects.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] Creating connectionhttp://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/effects.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connectionhttp://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/effects.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/dragdrop.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/dragdrop.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/controls.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/controls.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/application.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/application.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 11398us: HtmlParse::ApplyFilter:ImgRewrite
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 11506us: HtmlParse::ApplyFilter:CacheExtender
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/stylesheets/application.css?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/stylesheets/application.css?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/prototype.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/prototype.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/effects.js?1296181549
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/effects.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/dragdrop.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/dragdrop.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/controls.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/controls.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/javascripts/application.js?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/javascripts/application.js?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/stylesheets/jstoolbar.css?1296181549
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(29)] http://sirius/redmine/login: Couldn't fetch resource /redmine/stylesheets/jstoolbar.css?1296181549 to rewrite.
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 14401us: HtmlParse::ApplyFilter:HtmlWriter
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [notice] [0209/200317:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 15218us: HtmlParse::FinishParse
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:17 2011] [error] [0209/200317:ERROR:net/instaweb/util/google_message_handler.cc(54)] net/instaweb/apache/serf_url_async_fetcher.cc:506: Creating connection
[Wed Feb 09 20:03:20 2011] [warn] [client 192.168.1.55] Not GET request: 2., referer: http://sirius/redmine/login
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(247)] ModPagespeed OutputFilter called for request /redmine/login
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(272)] unparsed=/redmine/login, absolute_url=http://sirius/redmine/login
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: HtmlParse::StartParse
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(299)] Request headers:\nHTTP/1.1 0 Internal Server Error\r\nHost: sirius\r\nUser-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.8) Gecko/20100723 Ubuntu/10.04 (lucid) Firefox/3.6.8\r\nAccept: text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8\r\nAccept-Language: en-us,en;q=0.5\r\nAccept-Encoding: gzip,deflate\r\nAccept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\nKeep-Alive: 115\r\nConnection: keep-alive\r\nReferer: http://sirius/redmine/login\r\nCookie: _redmine_session=BAh7BzoPc2Vzc2lvbl9pZCIlNmVlMzFiMDc4MWQxZDU5ZTI5MTk2NjU0NGY3MzJmYzQ6EF9jc3JmX3Rva2VuIjFjRU1VWkhoUktKVTh3M3A2ZCt4UVFoSlRrNC9wbm56VWRnNWc1ZndoeERVPQ%3D%3D--8b195ac3cab88b5a1f408e3f18aaddc70782140e\r\nContent-Type: application/x-www-form-urlencoded\r\nContent-Length: 165\r\n\r\n
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(302)] request-filename=/var/www/redmine/login, uri=/redmine/login
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(319)] ModPagespeed Response headers:\nHTTP/1.1 302 Found\r\nLocation: http://sirius/redmine\r\nStatus: 302\r\nX-Mod-Pagespeed: 0.9.0.0-128\r\n\r\n
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 604us: HtmlParse::Flush
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 697us: HtmlParse::CoalesceAdjacentCharactersNodes
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 758us: HtmlParse::ApplyFilter:AddHead
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 813us: HtmlParse::SanityCheck
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 912us: HtmlParse::CoalesceAdjacentCharactersNodes
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 965us: HtmlParse::ApplyFilter:CssCombine
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 1020us: HtmlParse::ApplyFilter:CssFilter
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 1073us: HtmlParse::ApplyFilter:Javascript
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 1125us: HtmlParse::ApplyFilter:InlineCss
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 1179us: HtmlParse::ApplyFilter:InlineJs
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 1233us: HtmlParse::ApplyFilter:ImgRewrite
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 1285us: HtmlParse::ApplyFilter:CacheExtender
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 1338us: HtmlParse::ApplyFilter:HtmlWriter
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine/login:1: 1415us: HtmlParse::FinishParse
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(247)] ModPagespeed OutputFilter called for request /redmine
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(272)] unparsed=/redmine, absolute_url=http://sirius/redmine
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine:1: HtmlParse::StartParse
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(299)] Request headers:\nHTTP/1.1 0 Internal Server Error\r\nHost: sirius\r\nUser-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.8) Gecko/20100723 Ubuntu/10.04 (lucid) Firefox/3.6.8\r\nAccept: text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8\r\nAccept-Language: en-us,en;q=0.5\r\nAccept-Encoding: gzip,deflate\r\nAccept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\nKeep-Alive: 115\r\nConnection: keep-alive\r\nReferer: http://sirius/redmine/login\r\nCookie: _redmine_session=BAh7BzoMdXNlcl9pZGkGOg9zZXNzaW9uX2lkIiVlYjNmYTY5NmZjNzMwYTdhMjA5ZDJmZmM4MTM0MzcyMw%3D%3D--57a4931aae681664d2a6ff6c039ac84b6ebc9e55\r\nIf-None-Match: "76628aff953f11fbdefb77ce3d575718"\r\n\r\n
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(302)] request-filename=/var/www/redmine, uri=/redmine
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/mod_instaweb.cc(319)] ModPagespeed Response headers:\nHTTP/1.1 200 OK\r\nStatus: 200\r\nX-Mod-Pagespeed: 0.9.0.0-128\r\n\r\n
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine:1: 1870us: HtmlParse::Flush
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine:1: 1973us: HtmlParse::CoalesceAdjacentCharactersNodes
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine:1: 2040us: HtmlParse::ApplyFilter:AddHead
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine:1: 2101us: HtmlParse::SanityCheck
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/util/google_message_handler.cc(48)] http://sirius/redmine:1: 2231us: HtmlParse::ApplyFilter:CssCombine
[Wed Feb 09 20:03:20 2011] [notice] [0209/200320:INFO:net/instaweb/apache/serf_url_async_fetcher.cc(632)] Initiating async fetch for http://sirius/redmine/stylesheets/application.css?1296181549
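One pattern worth isolating in the error log: every stall coincides with mod_pagespeed trying, and failing, to fetch the /redmine stylesheets and scripts it wants to rewrite. A quick diagnostic (a sketch, not a confirmed fix) is to switch the module off and retry the login:

# in the mod_pagespeed configuration file (path varies by install)
ModPagespeed off

If pages then load normally, the hang is in the module's resource fetcher rather than in Passenger or Redmine itself.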

    Read the article

  • virsh vcpu_period and vcpu_quota

    - by Programster
I have been looking into ways to divide my CPU amongst KVM guests other than just setting vCPU access limits. I understand the concept of cpu_shares, which can be set and displayed with virsh schedinfo, but I also found vcpu_period and vcpu_quota listed by this command. Looking at the man page, I know what the acceptable input values are, but could somebody please explain in simple terms what these two parameters actually do?
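For reference, a sketch of how these values can be read and changed with virsh; the domain name guest1 is a placeholder, and my reading of the semantics (a per-vCPU CFS bandwidth cap) should be verified against your libvirt version:

# show the current scheduler parameters
virsh schedinfo guest1

# with period 100000 us and quota 50000 us, each vCPU may consume
# at most half of one physical CPU
virsh schedinfo guest1 --set vcpu_period=100000 --set vcpu_quota=50000 --live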

    Read the article

  • Bugzilla mail_delivery_method using TLS for Gmail

    - by Matt
I installed the TLS support as described in http://www.dawood.in/bugzilla%5Falerts%5Fusing%5Fgmail.html and verified that the package is installed. Restarted the Apache server. Logged in to Bugzilla as admin and went to Administration > Parameters > Email, but I can't see the option for SMTP::TLS under mail_delivery_method. Any ideas? Thanks, Matt
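Assuming the hack in the linked article relies on the Net::SMTP::TLS Perl module (an inference on my part — verify against the article), a quick sanity check is whether the web server's Perl can actually load it; a load failure here would explain the option never appearing in the UI:

# prints "ok" only if the module loads cleanly
perl -MNet::SMTP::TLS -e 'print "ok\n"'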

    Read the article

  • Best available technology for layered disk cache in linux

    - by SpliFF
I've just bought a 6-core Phenom with 16G of RAM. I use it primarily for compiling and video encoding (and occasional web/db). I'm finding all activities get disk-bound and I just can't keep all 6 cores fed. I'm buying an SSD raid to sit between the HDD and tmpfs. I want to set up a "layered" filesystem where reads are cached in tmpfs but writes safely go through to the SSD. I want files (or blocks) that haven't been read lately on the SSD to then be written back to a HDD using a compressed FS or block layer. So basically, reads:

- Check tmpfs
- Check SSD
- Check HDD

And writes:

- Straight to SSD (for safety), then tmpfs (for speed)

And periodically, or when space gets low:

- Move least frequently accessed files down one layer.

I've seen a few projects of interest. CacheFS, cachefsd and bcache seem pretty close, but I'm having trouble determining which are practical. bcache seems a little risky (early adoption); cachefs seems tied to specific network filesystems. There are "union" projects, unionfs and aufs, that let you mount filesystems over each other (a USB device over a DVD, usually), but both are distributed as a patch and I get the impression this sort of "transparent" mounting was going to become a kernel feature rather than a FS. I know the kernel has a built-in disk cache, but it doesn't seem to work well with compiling: I see a 20x speed improvement when I move my source files to tmpfs. I think it's because the standard buffers are dedicated to a specific process and compiling creates and destroys thousands of processes during a build (just guessing there). It looks like I really want those files precached. I've read tmpfs can use virtual memory. In that case, is it practical to create a giant tmpfs with swap on the SSD (see the sketch after this list)? I don't need to boot off the resulting layered filesystem; I can load grub, kernel and initrd from elsewhere if needed. So that's the background. The question has several components, I guess:

- Recommended FS and/or block layer for the SSD and compressed HDD.
- Recommended mkfs parameters (block size, options, etc.)
- Recommended cache/mount technology to bind the layers transparently
- Required mount parameters
- Required kernel options / patches, etc.
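For the tmpfs-with-SSD-swap sub-question, a rough prototype needs only stock tools; the device name below is a placeholder and the sizes are illustrative:

# SSD partition as high-priority swap (device name is a placeholder)
mkswap /dev/sdb1
swapon -p 100 /dev/sdb1

# tmpfs larger than RAM; cold pages spill to the SSD swap
mount -t tmpfs -o size=24G tmpfs /mnt/scratch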

    Read the article

  • How to connect a Bluetooth network connection using the command line

    - by edg
I can enable a Local Area Network interface for my machine with the command:

netsh interface set interface "Local Area Connection" ENABLED

Is there an equivalent command to connect a Bluetooth network connection? I've tried:

netsh interface set interface "Bluetooth" ENABLED

but it seems to have no effect; the connection remains disconnected. I also tried:

netsh interface set interface "Bluetooth" connect=CONNECTED

but this returns "One or more essential parameters not specified".

    Read the article

  • proxy: pass request body failed

    - by bux
I'm trying to enable a proxying virtual host:

<VirtualHost *:80>
    ServerName xxxxx.domain.tdl
    SSLProxyEngine On
    SSLProxyCheckPeerCN on
    ProxyPass / https://localhost:1234
    ProxyPassReverse / https://localhost:1234
</VirtualHost>

But I get a 500 error, and my Apache2 error.log displays:

[Tue Jan 03 15:41:42 2012] [error] (502)Unknown error 502: proxy: pass request body failed to [::1]:1234 (localhost)
[Tue Jan 03 15:41:42 2012] [error] proxy: pass request body failed to [::1]:1234 (localhost) from 82.252.xxx.xx ()

Am I missing some parameters?
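One detail in that log worth noting: Apache resolved localhost to the IPv6 address [::1]. If the backend only listens on IPv4, pinning the address is a reasonable first test — a diagnostic sketch, not a confirmed fix, and with SSLProxyCheckPeerCN on the backend certificate's CN would then also have to match:

# test variant: force IPv4 instead of letting localhost resolve to ::1
ProxyPass        / https://127.0.0.1:1234/
ProxyPassReverse / https://127.0.0.1:1234/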

    Read the article

  • Apache2 - mod_expire and mod_rewrite not working in httpd.conf - serving content from tomcat

    - by Ankit Agrawal
Hi, I am using an apache2 server running on Debian which forwards all HTTP requests to Tomcat, installed on the same machine. I have two files under my /etc/apache2/ folder, apache2.conf and httpd.conf. I modified the httpd.conf file to look like the following:

# forward all http requests on port 80 to tomcat
ProxyPass / ajp://127.0.0.1:8009/
ProxyPassReverse / ajp://127.0.0.1:8009/

# gzip text content
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE text/javascript
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
DeflateCompressionLevel 9
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

# Turn on Expires and mark all static content to expire in a week;
# unset Last-Modified and ETag
ExpiresActive On
ExpiresDefault A0
<FilesMatch "\.(jpg|jpeg|png|gif|js|css|ico)$">
    ExpiresDefault A604800
    Header unset Last-Modified
    Header unset ETag
    FileETag None
    Header append Cache-Control "max-age=604800, public"
</FilesMatch>

RewriteEngine On
# rewrite all www.example.com/content/XXX-01.js and YYY-01.css files to XXX.js and YYY.css
RewriteRule ^content/(js|css)/([a-z]+)-([0-9]+)\.(js|css)$ /content/$1/$2.$4
# remove all query parameters from the URL after we are done with it
RewriteCond %{THE_REQUEST} ^GET\ /.*\;.*\ HTTP/
RewriteCond %{QUERY_STRING} !^$
RewriteRule .* http://example.com%{REQUEST_URI}? [R=301,L]
# rewrite all www.example.com to example.com
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

I want to achieve the following:

1. Forward all traffic to Tomcat.
2. Gzip all the text content.
3. Put a 1-week expiry header on all static files and unset the ETag/Last-Modified headers.
4. Rewrite all js and css files to a certain format.
5. Remove all the query parameters from the URL.
6. Forward all www.example.com to example.com.

The problem is that only 1 and 2 are working. I tried a lot, with many combinations, but the expires and rewrite rules (3-6) do not work at all. I also tried moving these rules to the apache2.conf and .htaccess files, but that didn't work either. It does not give any error; these rules are simply ignored. The expires and rewrite modules are ENABLED. Please let me know what I should do to fix this:

1. Do I need to add something else in the httpd.conf file (like Options +FollowSymLinks) or something else?
2. Do I need to add something in the apache2.conf file?
3. Do I need to move these rules to an .htaccess file? If yes, what should I write in that file and where should I keep it — in the /etc/apache2/ folder or the /var/www/ folder?
4. Any other info to make this work?

Thanks, Ankit

    Read the article

  • netsh add profile returns "An attempt was made to load a program with an incorrect format."

    - by Tunisia
Using netsh, I used the following command to add a wireless profile:

add profile filename="c:\profiles\DLINK-Profile.xml" interface="D-Link DWA-125 N150"

All parameters are valid, but I get the error "An attempt was made to load a program with an incorrect format." I know this error is related to OS architecture (64-bit vs 32-bit), but I'm not sure which program is incompatible. I'm using Win7 64-bit.
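"Incorrect format" errors typically appear when a 32-bit process tries to load a 64-bit component or vice versa. If the command is being launched from a 32-bit shell or script host on 64-bit Windows, a hedged workaround is to invoke the 64-bit netsh explicitly via the Sysnative alias (which is only visible from 32-bit processes):

C:\Windows\Sysnative\netsh.exe wlan add profile filename="c:\profiles\DLINK-Profile.xml" interface="D-Link DWA-125 N150"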

    Read the article

  • Old Hardware Specifications

    - by billswift
Does anyone know where I can get specs for old monitors and SOHO computers? I'm setting up some old computers as Linux boxes and especially need monitor specs for setting X Window System parameters. Right now I need specs for a Packard Bell 1024S monitor.

    Read the article

  • LTO(4) tape shelf life estimation?

    - by emilp
LTO tapes, Maxell in this case, are often marketed as having 30 years or more of shelf life when stored under "optimal conditions". Is there a way to get a good estimate of the shelf life, given parameters such as relative humidity, temperature, etc.? Obsolescence of the tapes aside, is there a way of determining the impact on shelf life of any deviation from the optimum? In other words, how many years are lost when storing, say, 1 degree above the specified range? Regards, Emil
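For back-of-the-envelope answers, archival-media studies commonly model temperature sensitivity with an Arrhenius acceleration factor; a hedged sketch, with temperatures in kelvin and an activation energy E_a that is a typical assumed value for magnetic media rather than a Maxell figure:

% Arrhenius acceleration factor; E_a ~ 0.8 eV is an assumed typical value
\mathrm{AF} = \exp\!\left[\frac{E_a}{k_B}\left(\frac{1}{T_{\mathrm{rated}}} - \frac{1}{T_{\mathrm{actual}}}\right)\right],
\qquad
\mathrm{life}_{\mathrm{actual}} \approx \frac{\mathrm{life}_{\mathrm{rated}}}{\mathrm{AF}}

With E_a = 0.8 eV, storing at 26 °C instead of a 25 °C rating gives AF = exp(9284 × (1/298 − 1/299)) ≈ 1.11, i.e. roughly 10% of the rated life — about 3 years off a 30-year rating. Humidity enters through a similar (Eyring-type) term and is often the larger effect in media-life models.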

    Read the article

  • Postfix SMTP auth not working with virtual mailboxes + SASL + Courier userdb

    - by Greg K
So I've read a variety of tutorials and how-tos and I'm struggling to make sense of how to get SMTP auth working with virtual mailboxes in Postfix. I used this Ubuntu tutorial to get set up. I'm using Courier IMAP and POP3 for reading mail, which seems to be working without issue. However, the credentials used to read a mailbox are not working for SMTP. I can see from /var/log/auth.log that PAM is being used; does this require a UNIX user account to work? I'm using virtual mailboxes precisely to avoid creating user accounts.

li305-246 saslauthd[22856]: DEBUG: auth_pam: pam_authenticate failed: Authentication failure
li305-246 saslauthd[22856]: do_auth : auth failure: [user=fred] [service=smtp] [realm=] [mech=pam] [reason=PAM auth error]

/var/log/mail.log:

li305-246 postfix/smtpd[27091]: setting up TLS connection from mail-pb0-f43.google.com[209.85.160.43]
li305-246 postfix/smtpd[27091]: Anonymous TLS connection established from mail-pb0-f43.google.com[209.85.160.43]: TLSv1 with cipher ECDHE-RSA-RC4-SHA (128/128 bits)
li305-246 postfix/smtpd[27091]: warning: SASL authentication failure: Password verification failed
li305-246 postfix/smtpd[27091]: warning: mail-pb0-f43.google.com[209.85.160.43]: SASL PLAIN authentication failed: authentication failure

I've created accounts in userdb as per this tutorial. Does Postfix also use authuserdb? What debug information is needed to help diagnose my issue?

main.cf:

# TLS parameters
smtpd_tls_cert_file = /etc/ssl/certs/smtpd.crt
smtpd_tls_key_file = /etc/ssl/private/smtpd.key
smtpd_use_tls = yes
smtpd_tls_session_cache_database = btree:${data_directory}/smtpd_scache
smtp_tls_session_cache_database = btree:${data_directory}/smtp_scache

# SMTP parameters
smtpd_sasl_local_domain =
smtpd_sasl_auth_enable = yes
smtpd_sasl_security_options = noanonymous
broken_sasl_auth_clients = yes
smtpd_recipient_restrictions = permit_sasl_authenticated,permit_mynetworks,reject_unauth_destination
smtp_tls_security_level = may
smtpd_tls_security_level = may
smtpd_tls_auth_only = no
smtp_tls_note_starttls_offer = yes
smtpd_tls_CAfile = /etc/ssl/certs/cacert.pem
smtpd_tls_loglevel = 1
smtpd_tls_received_header = yes
smtpd_tls_session_cache_timeout = 3600s
tls_random_source = dev:/dev/urandom

/etc/postfix/sasl/smtpd.conf:

pwcheck_method: saslauthd
mech_list: plain login

/etc/default/saslauthd:

START=yes
PWDIR="/var/spool/postfix/var/run/saslauthd"
PARAMS="-m ${PWDIR}"
PIDFILE="${PWDIR}/saslauthd.pid"
DESC="SASL Authentication Daemon"
NAME="saslauthd"
MECHANISMS="pam"
MECH_OPTIONS=""
THREADS=5
OPTIONS="-c -m /var/spool/postfix/var/run/saslauthd"

/etc/courier/authdaemonrc:

authmodulelist="authuserdb"

I've only modified one line in authdaemonrc and restarted the service as per this tutorial. I've added accounts to /etc/courier/userdb via userdb and userdbpw, and run makeuserdb, as per the tutorial.

SOLVED: Thanks to Jenny D for suggesting the use of rimap to authenticate against the localhost IMAP server (which reads the userdb credentials).
I updated /etc/default/saslauthd to start saslauthd correctly (this page was useful):

MECHANISMS="rimap"
MECH_OPTIONS="localhost"
THREADS=0
OPTIONS="-c -m /var/spool/postfix/var/run/saslauthd -r"

After doing this I got the following error in /var/log/auth.log:

li305-246 saslauthd[28093]: auth_rimap: unexpected response to auth request: * BYE [ALERT] Fatal error: Account's mailbox directory is not owned by the correct uid or gid:
li305-246 saslauthd[28093]: do_auth : auth failure: [user=fred] [service=smtp] [realm=] [mech=rimap] [reason=[ALERT] Unexpected response from remote authentication server]

This blog post detailed a solution: set IMAP_MAILBOX_SANITY_CHECK=0 in /etc/courier/imapd, then restart the Courier and saslauthd daemons for the config changes to take effect:

sudo /etc/init.d/courier-imap restart
sudo /etc/init.d/courier-authdaemon restart
sudo /etc/init.d/saslauthd restart

Watch /var/log/auth.log while trying to send email. Hopefully you're good!
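A quick way to verify the saslauthd side in isolation is testsaslauthd (shipped with the cyrus-sasl tools); the credentials below are placeholders, and -f must point at the mux socket inside the path configured in OPTIONS above:

# succeeds with "0: OK" when saslauthd can verify the credentials
testsaslauthd -u fred -p secret -s smtp -f /var/spool/postfix/var/run/saslauthd/mux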

    Read the article

  • How to do an automated installation of Ubuntu on 100 Remote machines?

    - by user40876
Help! I desperately need some advice/help. I want an automated install (via CD or USB) of Ubuntu 10.04 on 100 remote machines located all over the country, using a Kickstart configuration file accessible from my web server. How do I create the boot CD (or USB)? How do I add the boot parameters to that boot CD (or USB) to tell it the URL to use for its automated Kickstart install?
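For the boot-parameter half, the kernel line lives in the CD's isolinux config. A sketch of what it could look like on a remastered 10.04 alternate/server CD — the label and URL are placeholders, and it is Ubuntu's kickseed component that gives the installer its Kickstart compatibility:

# isolinux/txt.cfg on the remastered CD (URL is a placeholder)
label autoinstall
  menu label ^Automated install
  kernel /install/vmlinuz
  append initrd=/install/initrd.gz ks=http://example.com/ks.cfg locale=en_US console-setup/ask_detect=false --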

    Read the article

  • optimizing operating systems to provide maximum informix performance.

    - by Frank Developer
Are there any Informix-specific guides for optimizing any operating system where an ifx engine is running? For example, in Linux: strip down to a bare minimum all unnecessary binaries, daemons and utilities, tune kernel parameters, and optimize raw and cooked devices (hdparm). Someday, maybe, Informix can create its own proprietary PICK-like OS. The general idea is for the OS that ifx sits on to have the smallest footprint and the lowest overhead impact on ifx, and to provide optimized ifx performance.
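As one concrete instance of the kernel-parameter side on Linux, database engines like Informix are bounded by the SysV shared-memory limits; a sketch with illustrative placeholder values, not Informix-certified numbers:

# raise SysV shared-memory limits (illustrative values only)
cat >> /etc/sysctl.conf <<'EOF'
kernel.shmmax = 4294967295
kernel.shmall = 4194304
EOF
sysctl -p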

    Read the article
