Search Results

Search found 22951 results on 919 pages for 'debug build'.

Page 96/919 | < Previous Page | 92 93 94 95 96 97 98 99 100 101 102 103  | Next Page >

  • PostSharp causes error with .Net 4.0 when building

    - by Kev Hunter
    When trying to build a project which uses PostSharp 1.5 on our CI server, we get the following error:

        C:\Program Files\PostSharp 1.5\PostSharp-1.5.targets (261,5): error: Unhandled exception: PostSharp.CodeModel.BindingException: Cannot find the type 'System.Action`2' in assembly 'System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.
           at PostSharp.CodeModel.AssemblyEnvelope.GetTypeDefinition(String typeName, BindingOptions bindingOptions)
           at PostSharp.CodeModel.TypeRefDeclaration.GetTypeDefinition(BindingOptions bindingOptions)
           at PostSharp.CodeModel.TypeSpecDeclaration.GetTypeDefinition(BindingOptions bindingOptions)
           at PostSharp.Extensibility.Tasks.IndexGenericInstancesTask.Execute()
           at PostSharp.Extensibility.Project.ExecutePhase(String phase)
           at PostSharp.Extensibility.Project.Execute()
           at PostSharp.Extensibility.PostSharpObject.ExecuteProjects()
           at PostSharp.Extensibility.PostSharpObject.InvokeProject(ProjectInvocation projectInvocation)
           at PostSharp.MSBuild.PostSharpRemoteTask.Execute(PostSharpTaskParameters parameters, TaskLoggingHelper log)

    The build previously worked against .NET 3.5. What's the best way to fix this?
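    One likely explanation (not confirmed in the thread): PostSharp 1.5 predates .NET 4.0, so it cannot resolve types against the 4.0 version of System.Core. A hedged workaround is to keep the project targeting .NET 3.5 until PostSharp is upgraded to a 4.0-capable release, for example in the .csproj:

        <PropertyGroup>
          <!-- keep compiling against the 3.5 framework so PostSharp 1.5 resolves the 3.5 System.Core -->
          <TargetFrameworkVersion>v3.5</TargetFrameworkVersion>
        </PropertyGroup>

    Alternatively, upgrading to a PostSharp release that explicitly supports .NET 4.0 avoids pinning the framework version.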

    Read the article

  • JavaMail error: must issue STARTTLS command first

    - by bobby
    I'm trying to send a mail using the JavaMail API with the code below. When I run the class I get an error that says "must issue STARTTLS command first", and there are also getProvider() lines in the debug output that I don't understand. The full output is shown after the code.

        import javax.servlet.*;
        import javax.servlet.http.*;
        import java.io.*;
        import javax.mail.*;
        import javax.mail.internet.*;
        import javax.mail.event.*;
        import javax.mail.Authenticator;
        import java.net.*;
        import java.util.Properties;

        public class mailexample {
            public static void main(String args[]) throws Exception {
                String from = args[0];
                String to = args[1];
                try {
                    Properties props = new Properties();
                    props.put("mail.transport.protocol", "smtp");
                    props.put("mail.smtp.host", "smtp.gmail.com");
                    props.put("mail.smtp.port", "25");
                    props.put("mail.smtp.auth", "true");
                    javax.mail.Authenticator authenticator = new javax.mail.Authenticator() {
                        protected javax.mail.PasswordAuthentication getPasswordAuthentication() {
                            return new javax.mail.PasswordAuthentication("[email protected]", "pass");
                        }
                    };
                    Session sess = Session.getDefaultInstance(props, authenticator);
                    sess.setDebug(true);
                    Transport transport = sess.getTransport("smtp");
                    Message msg = new MimeMessage(sess);
                    msg.setFrom(new InternetAddress(from));
                    msg.addRecipient(Message.RecipientType.TO, new InternetAddress(to));
                    msg.setSubject("Hello JavaMail");
                    msg.setText("Welcome to JavaMail");
                    transport.connect();
                    transport.send(msg);
                } catch (Exception e) {
                    System.out.println("err" + e);
                }
            }
        }

    Error output:

        C:\Users\bobby\Desktop>java mailexample [email protected] abc@gmail.com
        DEBUG: getProvider() returning javax.mail.Provider[TRANSPORT,smtp,com.sun.mail.smtp.SMTPTransport,Sun Microsystems, Inc]
        DEBUG SMTP: useEhlo true, useAuth true
        DEBUG SMTP: useEhlo true, useAuth true
        DEBUG: SMTPTransport trying to connect to host "smtp.gmail.com", port 25
        DEBUG SMTP RCVD: 220 mx.google.com ESMTP q10sm12956046rvp.20
        DEBUG: SMTPTransport connected to host "smtp.gmail.com", port: 25
        DEBUG SMTP SENT: EHLO bobby-PC
        DEBUG SMTP RCVD: 250-mx.google.com at your service, [60.243.184.29]
        250-SIZE 35651584
        250-8BITMIME
        250-STARTTLS
        250 ENHANCEDSTATUSCODES
        DEBUG: getProvider() returning javax.mail.Provider[TRANSPORT,smtp,com.sun.mail.smtp.SMTPTransport,Sun Microsystems, Inc]
        DEBUG SMTP: useEhlo true, useAuth true
        DEBUG SMTP: useEhlo true, useAuth true
        DEBUG: SMTPTransport trying to connect to host "smtp.gmail.com", port 25
        DEBUG SMTP RCVD: 220 mx.google.com ESMTP l29sm12930755rvb.16
        DEBUG: SMTPTransport connected to host "smtp.gmail.com", port: 25
        DEBUG SMTP SENT: EHLO bobby-PC
        DEBUG SMTP RCVD: 250-mx.google.com at your service, [60.243.184.29]
        250-SIZE 35651584
        250-8BITMIME
        250-STARTTLS
        250 ENHANCEDSTATUSCODES
        DEBUG SMTP SENT: MAIL FROM:
        DEBUG SMTP RCVD: 530 5.7.0 Must issue a STARTTLS command first. l29sm12930755rvb.16
        DEBUG SMTP SENT: QUIT
        errjavax.mail.SendFailedException: Sending failed; nested exception is: javax.mail.MessagingException: 530 5.7.0 Must issue a STARTTLS command first. l29sm12930755rvb.16
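    Gmail's SMTP server refuses plain-text submission, which is what the 530 response means. A minimal sketch of the usual fix (the property names are standard JavaMail SMTP settings; using port 587 as Gmail's STARTTLS submission port is an assumption to verify against Gmail's documentation):

        Properties props = new Properties();
        props.put("mail.smtp.host", "smtp.gmail.com");
        props.put("mail.smtp.port", "587");               // submission port that supports STARTTLS
        props.put("mail.smtp.auth", "true");
        props.put("mail.smtp.starttls.enable", "true");   // upgrade the connection to TLS before MAIL FROM

        Session session = Session.getInstance(props, authenticator);
        MimeMessage msg = new MimeMessage(session);
        // ... set from/to/subject/text as before ...
        Transport.send(msg);   // the static send() manages its own connection; no explicit connect() needed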

    Read the article

  • Problem building dotnetnuke blog module

    - by GeminiDNK
    I have DNN 5.2 running on my machine and have installed the blog module; the module also shows up in my development environment. What I want to find out is how to actually develop or build the blog module from source, so that I can output the .dll file instead of just running it. If I import the Blog project file from the folder, I get a host of errors like ".resx file not found" and so on. Has any DotNetNuke expert out there encountered this problem before? If not, how do you make the module buildable so you can make changes to it? I'm talking about core logic changes, like adding functions, not just skin or UI changes.

    Read the article

  • I have an error building a .vdproj on MSBuild with NAnt

    - by Luís Custódio
    I'm used to using NAnt for building releases, but I have started an ASP.NET MVC project and chose to build its installer with a .vdproj. When I call the following in NAnt:

        <exec program="${dotnet.dir}/msbuild.exe" commandline='"./Wum.sln" /v:q /nologo /p:Configuration=Release' />

    the result is:

        [exec] D:\My Documents\Visual Studio 2008\Projects\Wum\Wum.sln : warning MSB4078: The project file "Wum.Setup\Wum.Setup.vdproj" is not supported by MSBuild and cannot be built.

    Does someone have a clue, or a solution? If I use devenv instead, will I have a problem? Thanks in advance.
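    Since MSBuild does not support .vdproj setup projects, the usual workaround is to shell out to devenv for that part of the build. A rough NAnt sketch (the ${devenv.dir} property and the solution/project paths are placeholders):

        <exec program="${devenv.dir}/devenv.com">
          <arg value="Wum.sln" />
          <arg value="/Build" />
          <arg value="Release" />
          <arg value="/Project" />
          <arg value="Wum.Setup\Wum.Setup.vdproj" />
        </exec>

    Using devenv.com rather than devenv.exe keeps the output on the console, which is handy on a build server.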

    Read the article

  • How to override ant task stored in ant lib directory

    - by mchr
    At my work we use AspectJ in some of our Java projects. To get this to work with ant builds we have been placing aspectjtools.jar within ant/lib/. I am now working on a particular Java project and need to use a newer version of aspectJ. I don't want to have to get everyone who uses the project to update their local copy of aspectjtools.jar. Instead, I tried adding the newer aspectjtools.jar to the lib directory of the project and adding the following line to build.xml. <taskdef resource="org/aspectj/tools/ant/taskdefs/aspectjTaskdefs.properties" classpath="./lib/aspectjtools.jar" /> However, this doesn't work as I hoped as the ANT classloader loads jars from ant/lib/ in preference to the jar I specify in the taskdef classpath. Is there any way to force ant to pick the jar checked into my project instead?
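    One option worth trying, assuming command-line -lib locations are searched ahead of ANT_HOME/lib (which is how I read the Ant manual's library-directory ordering): invoke the build with the project-local jar on -lib so the newer AspectJ wins over the copy in ant/lib.

        # hedged sketch: run the build with the project's own aspectjtools.jar ahead of the one in ant/lib
        ant -lib lib/aspectjtools.jar compile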

    Read the article

  • How to add a whole directory or project output to WiX package

    - by Coincoin
    We decided to switch from the VS integrated setup to WiX. However, what we currently do is use project output files as the input for the setup project. This lets us easily add Application Files to a directory (for images, samples, and other resources...), and those files are automatically added to the setup when we build. I could not find any similar feature in WiX. WiX seems to require one Directory entry and one File entry for each and every directory and file. This would require us to change the WiX source every time a file is added, which, to my eyes, is prohibitive since we have so many of them. Is there any integrated way of doing that with WiX, or do I have to write my own task that will create a WiX source before calling candle?
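    If WiX 3 is an option, its heat.exe harvester generates the Directory/Component/File authoring from a folder at build time, which avoids hand-editing the .wxs for every new file. A rough sketch (paths, group and directory names are placeholders to adapt):

        rem harvest everything under the build output into a component group
        heat dir ..\MyApp\bin\Release -cg AppFilesGroup -dr INSTALLDIR -gg -srd -sfrag -var var.PublishDir -out AppFiles.wxs

        rem reference AppFilesGroup from the Feature in the main .wxs, then compile both
        candle -dPublishDir=..\MyApp\bin\Release Product.wxs AppFiles.wxs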

    Read the article

  • Error while deploying a web application in OSGI container using pax web

    - by RaulDM
    Hello, I am trying to deploy a web application in a Felix container. I have done all the required configuration for my web app, such as setting up the manifest headers: Webapp-Context, Bundle-ClassPath, Bundle-Activator, Import-Package, Bundle-SymbolicName, etc. The Pax bundles that I have dropped into the same container are: pax-web-service-0.6.0.jar, pax-web-jsp-0.7.1.jar, pax-web-extender-war-0.7.1.jar, pax-logging-service-1.5.0.jar, pax-logging-api-1.5.0.jar. Although the Pax web site says that pax-web-service is included in pax-web-extender-war, it seems that without the pax-web-service bundle all the other bundles are handicapped. I removed the other Pax bundles, such as pax-web-extender-whiteboard-0.7.1.jar and pax-web-jetty-0.7.1.jar, as I did not see any use for them. The pax-web-jetty-0.7.1.jar bundle does not even start up; it has dependencies that it cannot resolve from any of the bundles provided by Pax. My browser displays:

        HTTP ERROR 403
        Problem accessing /adminmodule/. Reason: FORBIDDEN
        Powered by Jetty://

    while the console log says:

        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - REQUEST /adminmodule/ on org.mortbay.jetty.HttpConnection@1e94001
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.model.ServerModel - Matching [/adminmodule/]...
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.model.ServerModel - Path [/adminmodule/] matched to {pattern=/adminmodule/.*,model=ResourceModel{id=org.ops4j.pax.web.service.internal.model.ResourceModel-2,name=,urlPatterns=[/],alias=/,servlet=ResourceServlet{context=/adminmodule,alias=/,name=},initParams={},context=ContextModel{id=org.ops4j.pax.web.service.internal.model.ContextModel-1,name=adminmodule,httpContext=org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext@11710be,contextParams={webapp.context=adminmodule}}}}
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.HttpServiceContext - Handling request for [/adminmodule/] using http context [org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext@11710be]
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - sessionManager=org.mortbay.jetty.servlet.HashSessionManager@19c6163
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - session=null
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - servlet=
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - chain=org.ops4j.pax.web.service.internal.model.FilterModel-3-
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - servlet holder=
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - call filter org.ops4j.pax.web.service.internal.model.FilterModel-3
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.WelcomeFilesFilter - Apply welcome files filter...
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.WelcomeFilesFilter - Servlet path: /
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.WelcomeFilesFilter - Path info: null
        [5884890@qtp-16567002-0 - /adminmodule/] INFO org.ops4j.pax.web.service.internal.HttpServiceContext - getting resource: [/adminmodule.jsp]
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext - Searching bundle [com.cisco.zaloni.gwt.admin [1]] for resource [/adminmodule.jsp], normalized to [adminmodule.jsp]
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext - Resource not found
        [5884890@qtp-16567002-0 - /adminmodule/] INFO org.ops4j.pax.web.service.internal.HttpServiceContext - found resource: null
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - call servlet
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext - Searching bundle [com.cisco.zaloni.gwt.admin [1]] for resource [/], normalized to [/]
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext - Resource found as url [bundle://1.0:1/]
        [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - RESPONSE /adminmodule/ 403

    It is really frustrating; please help, as I am new to OSGi. Raul

    Read the article

  • maintaining a growing, diverse codebase with continuous integration

    - by Nate
    I am in need of some help with philosophy and design of a continuous integration setup. Our current CI setup uses buildbot. When I started out designing it, I inherited (well, not strictly, as I was involved in its design a year earlier) a bespoke CI builder that was tailored to run the entire build at once, overnight. After a while, we decided that this was insufficient, and started exploring different CI frameworks, eventually choosing buildbot. One of my goals in transitioning to buildbot (besides getting to enjoy all the whiz-bang extras) was to overcome some of the inadequacies of our bespoke nightly builder.

    Humor me for a moment, and let me explain what I have inherited. The codebase for my company is almost 150 unique C++ Windows applications, each of which has dependencies on one or more of a dozen internal libraries (and many on 3rd party libraries as well). Some of these libraries are interdependent, and have dependent applications that (while they have nothing to do with each other) have to be built with the same build of that library. Half of these applications and libraries are considered "legacy" and unportable, and must be built with several distinct configurations of the IBM compiler (for which I have written unique subclasses of Compile), and the other half are built with Visual Studio. The code for each compiler is stored in two separate Visual SourceSafe repositories (which I am simply handling using a bunch of ShellCommands, as there is no support for VSS). Our original nightly builder simply took down the source for everything, and built stuff in a certain order. There was no way to build only a single application, or pick a revision, or to group things. It would launch virtual machines to build a number of the applications. It wasn't very robust, it wasn't distributable, and it wasn't terribly extensible. I wanted to be able to overcome all of these limitations in buildbot.

    The way I did this originally was to create entries for each of the applications we wanted to build (all 150-ish of them), then create triggered schedulers that could build various applications as groups, and then subsume those groups under an overall nightly build scheduler. These could run on dedicated slaves (no more virtual machine chicanery), and if I wanted I could simply add new slaves. Now, if we want to do a full build out of schedule, it's one click, but we can also build just one application should we so desire.

    There are four weaknesses of this approach, however. One is our source tree's complex web of dependencies. In order to simplify config maintenance, all builders are generated from a large dictionary. The dependencies are retrieved and built in a not-terribly-robust fashion (namely, keying off of certain things in my build-target dictionary). The second is that each build has between 15 and 21 build steps, which is hard to browse and look at in the web interface, and since there are around 150 columns, it takes forever to load (think from 30 seconds to multiple minutes). Thirdly, we no longer have autodiscovery of build targets (although, as much as one of my coworkers harps on me about this, I don't see what it got us in the first place). Finally, the aforementioned coworker likes to constantly bring up the fact that we can no longer perform a full build on our local machine (though I never saw what that got us either, considering that it took three times as long as the distributed build; I think he is just paranoid about ever breaking the build).
Now, moving to new development, we are starting to use g++ and subversion (not porting the old repository, mind you - just for the new stuff). Also, we are starting to do more unit testing ("more" might give the wrong picture... it's more like any), and integration testing (using python). I'm having a hard time figuring out how to fit these into my existing configuration. So, where have I gone wrong philosophically here? How can I best proceed forward (with buildbot - it's the only piece of the puzzle I have license to work on) so that my configuration is actually maintainable? How do I address some of my design's weaknesses? What really works in terms of CI strategies for large, (possibly over-)complex codebases?

    Read the article

  • Bamboo Versioning

    - by reddy
    Hello guys, I have a situation where I need to maintain version information for my builds. By googling I found limited information. One way is to keep a version file in source control and keep updating it; another is to use the source control revision number; a final one is to use the Bamboo build number. I haven't implemented any of these before. Could anyone point out the pros and cons of each method, or at least tell me which method you have used? Thank you, Reddy.
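    For the third option, one hedged sketch is to keep a human-readable base version in the build file and let Bamboo supply the incrementing part: Bamboo exposes its build number as ${bamboo.buildNumber}, which can be passed down to Ant as a property (the property and file names below are illustrative):

        <!-- default when building locally; Bamboo overrides it with -Dbuild.number=${bamboo.buildNumber} -->
        <property name="build.number" value="0"/>
        <property name="base.version" value="1.2"/>
        <property name="full.version" value="${base.version}.${build.number}"/>

        <target name="stamp-version">
          <echo file="version.properties">version=${full.version}</echo>
        </target>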

    Read the article

  • gcc std=gnu++0x option

    - by Neeraj
    Hi everyone, I need to compile C++ code whose Makefile.am passes the -std=gnu++0x option to g++. As this option is only supported by GCC 4.3 and above, the build fails on my machine, where I have GCC 4.2. What are my alternatives? I tried removing that option from the Makefile.am, but that causes other errors. Do I need to install GCC 4.3 or above, and how can I do that on Ubuntu Hardy through apt-get? Thanks.
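    A hedged sketch of the upgrade route, assuming a gcc-4.3/g++-4.3 package is available in your release's repositories (availability on Hardy is not guaranteed):

        sudo apt-get update
        sudo apt-get install gcc-4.3 g++-4.3   # side-by-side install; the default gcc stays at 4.2

        # point an autotools build at the newer compiler without touching the system default
        ./configure CC=gcc-4.3 CXX=g++-4.3
        make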

    Read the article

  • What are some tips for troubleshooting builds of complicated software?

    - by Goose Bumper
    Sometimes I want to build Python or GCC from scratch just for fun, but I can't parse the errors I get, or don't understand statements like "libtool link error # XYZ". What are some tricks that unix/systems gurus use to compile software of this size from scratch? Of course I already do things like read config.log (if there is one), google around, and post in newsgroups. I'm looking for things that either make the process go smoother or get me more information about the error to help me understand and fix it. It's a little tough to get this information sometimes, because some compile bugs can be quite obscure. What can I do at that point?
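    A few generic tricks that help surface more context, sketched as shell commands (nothing project-specific assumed here):

        # keep a full, searchable transcript of both configure and make output
        ./configure 2>&1 | tee configure.log
        make 2>&1 | tee make.log

        # keep going past the first failure, to see whether one root cause breaks many targets
        make -k

        # many automake-based trees hide the real compiler/libtool command lines; ask for them
        make V=1

    Seeing the exact libtool/gcc invocation that failed usually makes "libtool link error" messages much easier to decode.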

    Read the article

  • Change a localized InfoPlist.strings using an Xcode target

    - by nevan
    Here's an obscure problem. I'm using an InfoPlist.strings to localize my app name. It's only got one value: CFBundleDisplayName = "Mon App". The strings file is localized (putting it in a directory for that localization). I've just made an extra target, where I change things like the non-localized app name (different Info.plists), and the icon. I'm also changing the Default.png using a run script build phase (copying different files depending on the app type I'm building). I've tried using the script to copy different versions of my InfoPlist.strings, but I couldn't make it work. Here's what I used:

        if ($TARGET_NAME == "MonApp") then
            cp fr.lproj/MonApp_InfoPlist.strings fr.lproj/InfoPlist.strings
        endif

    I've seen a post suggesting wincent strings util for processing strings, but wanted to see if there's an easy way to do this. Any help greatly appreciated.
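    One possible reason the snippet above never fires is that Xcode run script phases execute under /bin/sh by default, not csh, so the if(...)then syntax is not valid there. A hedged sh version (MonApp and the file names mirror the question; TARGET_NAME and SRCROOT are standard Xcode build settings):

        if [ "${TARGET_NAME}" = "MonApp" ]; then
          # overwrite the localized strings with the per-target copy before the bundle is built
          cp "${SRCROOT}/fr.lproj/MonApp_InfoPlist.strings" "${SRCROOT}/fr.lproj/InfoPlist.strings"
        fi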

    Read the article

  • Partial compilation of openwrt project

    - by yosig81
    I would like to get an idea of, or a reference for, how to compile only a subset of the OpenWrt project. I am aware of the menuconfig utility, but it is not enough for my goal. I would like to compile only the toolchain (binutils + gcc + glibc) for a specific target (ar71xx) and also the kernel. After looking at the makefiles, I have noticed that most of the work is actually patching the toolchain and the kernel and then compiling them. Is there any option to stop the build process after the patching, so that I have only the patched source code and can write my own makefile to compile it?
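    The OpenWrt buildroot exposes per-subsystem make targets, so, as a hedged sketch (exact target names can vary between releases), something along these lines builds only the toolchain and stops the kernel at the patched-source stage:

        # build just the cross toolchain (binutils + gcc + libc) for the target selected in menuconfig
        make toolchain/install V=99

        # unpack and patch the kernel sources, but stop before compiling them
        make target/linux/prepare V=99

    After the prepare step, the patched kernel tree should sit under build_dir, where a custom makefile can pick it up.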

    Read the article

  • How can I use an Ant foreach iteration with values from a file?

    - by Egon Willighagen
    In our Ant build environment, I have to do the same task for a number of items. The AntContrib foreach task is useful for that. However, it takes the list as a parameter, whereas I actually have the list in a file. How can I iterate over items in a file in a foreach-like way in Ant? Something like (pseudo-code): <foreach target="compile-module" listFromFile="$fileWithModules"/> I'm happy to write a custom Task, and welcome any suggestion on possible solutions.
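    One hedged way to get close to that pseudo-code with stock Ant plus AntContrib: load the file into a property and feed it to <foreach> with the line separator as the delimiter (attribute names follow the AntContrib foreach docs; one module per line in the file is an assumption):

        <loadfile property="modules.list" srcfile="${fileWithModules}">
          <filterchain>
            <striplinecomments>
              <comment value="#"/>
            </striplinecomments>
          </filterchain>
        </loadfile>

        <!-- calls the compile-module target once per line, passing the line as ${module} -->
        <foreach target="compile-module" param="module"
                 list="${modules.list}" delimiter="${line.separator}" trim="true"/>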

    Read the article

  • How to get the git commit count?

    - by Splo
    I'd like to get the number of commits of my git repository, a bit like SVN revision numbers. The goal is to use it as a unique, incrementing build number. I currently do it like this, on Unix/Cygwin/msysGit: git log --pretty=format:'' | wc -l But I feel it's a bit of a hack. Is there a better way to do that? It would be cool if I actually didn't need wc or even git, so it could work on bare Windows. Just read a file or a directory structure...
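    Recent git versions can do the counting themselves, which avoids wc (exactly which release introduced --count is something to verify, so the pipe form is kept as a fallback):

        # preferred: let git count the commits reachable from HEAD
        git rev-list --count HEAD

        # fallback for older git
        git rev-list HEAD | wc -l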

    Read the article

  • Building elf within Eclipse within Windows

    - by BSchlinker
    Hey guys, I'm having trouble building an ELF file with Eclipse on Windows. It seems that every time I build, a PE (portable executable) for Windows is created. I've gone into the Binary Parser section and checked Elf Parser while making sure that everything else is unchecked. However, I continue to end up with a PE, which I cannot run on Linux. For clarification, I'm using the Linux GCC toolchain within Eclipse. I've attempted a reinstall of Cygwin -- still experiencing the same issues. Any ideas? Thanks

    Read the article

  • SVN Externals in a different SCM

    - by Sean Chambers
    At a previous workplace we used svn externals to update dependent projects when a shared component was updated. This made it easy to see what those changes broke, as well as to update dependent projects to the latest version of a shared component automatically, without any intervention. At a new workplace we are using cc.net with Surround SCM, and I'm trying to find something similar in Surround. I haven't found anything like externals, only "shared files", but unlike externals, shared files don't allow you to point at a specific revision of a file. I'm interested in what other people are doing in these scenarios to lean on their continuous integration and treat it as true integration rather than just a "continuous build" server. Does anyone know of a tool or technique that gives "externals" behavior without using svn? I suppose I could keep an XML registry file of which projects depend on which assemblies and whether they should use the latest version, but this seems like overkill.

    Read the article

  • Fixing lots of broken references in a working asp.net mvc project

    - by davidbuttrick
    The last time I worked on this project everything was fine. That was about 4 days ago. Now, when I open the project, all the references to .NET are broken and I cannot build my project any more. I have tried following the advice in posts here, but to no avail. Even simple things like Request.cookies fail - Request gets a squiggly underline, and I get "Request is undefined" when I hover over it. That doesn't look like something I can fix by just removing and re-adding the reference to System.Web.Mvc - which I have tried anyway, with no luck. Any ideas? Surely there are other issues that can cause this problem... Thank you.

    Read the article

  • Multi-process builds in Visual Studio 2010: Worth it?

    - by coryr
    I've started testing our C++ software with VS2010 and the build times are really bad (30-45 minutes, about double the VS2005 times). I've been reading about the /MP switch for multi-process compilation. Unfortunately, it is incompatible with some features that we use quite a bit like #import, incremental compilation, and precompiled headers. Have you had a similar project where you tried the /MP switch after turning off things like precompiled headers? Did you get faster builds? My machine is running 64-bit Windows 7 on a 4 core machine with 4 GB of RAM and a fast SSD storage. Virus scanner disabled and a pretty minimal software environment.
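    For what it's worth, a hedged sketch of how /MP can be switched on per project in a VS2010 .vcxproj (the element names are the standard MSBuild C++ ones; whether it pays off depends on how much dropping #import/minimal rebuild costs you):

        <ItemDefinitionGroup>
          <ClCompile>
            <!-- /MP: hand each translation unit to a separate cl.exe process -->
            <MultiProcessorCompilation>true</MultiProcessorCompilation>
            <!-- /Gm (minimal rebuild) conflicts with /MP, so turn it off -->
            <MinimalRebuild>false</MinimalRebuild>
          </ClCompile>
        </ItemDefinitionGroup>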

    Read the article

  • Ant Tokenizer: Selecting an individual Token

    - by John Oxley
    I have the following ant task:

        <loadfile property="proj.version" srcfile="build.py">
          <filterchain>
            <striplinecomments>
              <comment value="#"/>
            </striplinecomments>
            <linecontains>
              <contains value="Version" />
            </linecontains>
          </filterchain>
        </loadfile>
        <echo message="${proj.version}" />

    And the output is:

        [echo] config ["Version"] = "v1.0.10-r4.2"

    How do I then use a tokenizer to get only v1.0.10-r4.2, the equivalent of | cut -d'"' -f4
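    One hedged way to do the cut inside the same filterchain is a <tokenfilter> with a <replaceregex>, capturing the quoted value (the regex assumes the config line keeps the exact ["Version"] = "..." shape shown above):

        <loadfile property="proj.version" srcfile="build.py">
          <filterchain>
            <striplinecomments>
              <comment value="#"/>
            </striplinecomments>
            <linecontains>
              <contains value="Version" />
            </linecontains>
            <tokenfilter>
              <!-- keep only the text between the last pair of quotes -->
              <replaceregex pattern='.*=\s*"([^"]*)".*' replace="\1"/>
            </tokenfilter>
          </filterchain>
        </loadfile>

    Adding a <striplinebreaks/> filter at the end can remove the trailing newline if it bothers whatever consumes ${proj.version}.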

    Read the article

  • Building a multi-platform SWT application using Ant

    - by Mridang Agarwalla
    I'm writing an SWT application which can be used on Windows (32/64 bit) and Mac OS X (32/64 bit). Apart from the JRE I rely on the SWT library found here. I can find four versions of the SWT library, depending on my target platforms (as mentioned above). When building my application, how can I compile against the correct SWT jar? If possible, I'd like to avoid hard-coding the jar version, platform and architecture. The SWT jars are named like this: swt-win32-x86_64.jar swt-win32-x86_32.jar swt-macosx-x86_32.jar swt-macosx-x86_64.jar (My project will be an open source project. I'd like people to be able to download the source and build it, and therefore I've thought of including all four versions of the SWT jars in the source distribution. I hope this is the correct approach to publishing code that relies on third-party libraries.) Thanks everyone.
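    A hedged Ant sketch for picking the jar at build time from the OS family and JVM architecture (the arch strings reported by the JVM differ between vendors and bitness, so the values below are assumptions to adapt):

        <condition property="swt.jar" value="swt-win32-x86_64.jar">
          <os family="windows" arch="amd64"/>
        </condition>
        <condition property="swt.jar" value="swt-win32-x86_32.jar">
          <os family="windows" arch="x86"/>
        </condition>
        <condition property="swt.jar" value="swt-macosx-x86_64.jar">
          <os family="mac" arch="x86_64"/>
        </condition>
        <!-- properties are immutable, so this only takes effect if nothing above matched -->
        <property name="swt.jar" value="swt-macosx-x86_32.jar"/>

        <path id="swt.classpath">
          <pathelement location="lib/${swt.jar}"/>
        </path>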

    Read the article

  • Qt compilation and stylesheet

    - by Yosko
    Each time I compile my Qt project after modifying my qss stylesheet file, the modifications aren't taken into account unless I rebuild everything. Any idea on a workaround for this, so that I don't have to wait 5 minutes each time I change my qss? Notes: I use Qt 4.8, and my stylesheet is declared in a resource file (qrc). EDIT: As suggested by Luca Carlon, when a qss is referenced in the project through a .qrc file, changes in the qss don't touch the qrc, so the compiler ignores them. To avoid that, I added a custom build step to my project (before the qmake step) that calls a .bat file without any arguments; the .bat contains the command: copy /b files.qrc +,,

    Read the article

  • How to program three editions Light, Pro, Ultimate in one solution

    - by Henry99
    I'd like to know how best to program three different editions of my C# ASP.NET 3.5 application in VS2008 Professional (which includes a web deployment project). I have a Light, Pro and Ultimate edition (or version) of my application. At the moment I've put everything in one solution with three build configurations in Configuration Manager, and I use preprocessor directives throughout the code (there are around 20 such constructs in some ten thousand lines of code, so it is still manageable):

        #if light
        // light code
        #endif
        #if pro
        // pro code
        #endif
        // etc...

    I've read on Stack Overflow for hours and expected to find out how, e.g., Microsoft handles its different Windows editions, but did not find what I expected. There is a heated discussion about whether preprocessor directives are evil. What I like about these #if directives is the side-by-side code of the differences, so I will still understand the code for the different editions after six months, and the added benefit of NOT shipping compiled code of the other versions to the customer. OK, long explanation; repeated question: what's the best way to go?
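    For reference, the #if symbols don't have to be maintained by hand: each build configuration in the .csproj can define its own conditional compilation symbol. A hedged sketch (the configuration names are invented; the symbols match the lower-case ones used in the #if directives above):

        <PropertyGroup Condition=" '$(Configuration)' == 'ReleaseLight' ">
          <DefineConstants>TRACE;light</DefineConstants>
          <OutputPath>bin\ReleaseLight\</OutputPath>
        </PropertyGroup>
        <PropertyGroup Condition=" '$(Configuration)' == 'ReleasePro' ">
          <DefineConstants>TRACE;pro</DefineConstants>
          <OutputPath>bin\ReleasePro\</OutputPath>
        </PropertyGroup>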

    Read the article

  • frequent updates of a Tomcat application

    - by Erel Segal Halevi
    I have an application that runs on a Tomcat 7 server on a Windows machine. In its current stage, I have to update and fix it frequently. Whenever I need to update the application, I do all of this: build a new war file; go to the Windows server and stop the Tomcat service; download the file and put it under webapps; remove the old application folder under webapps; remove the old application folder under work/Catalina/localhost (otherwise Tomcat keeps the old version cached); restart the Tomcat service. I am sure there is a way to do all this automatically. What is it?
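    One hedged way to automate most of those steps is the Tomcat manager application plus the Ant tasks that ship with Tomcat in catalina-ant.jar (URL, credentials and paths below are placeholders; the manager webapp must be enabled with a user in the manager-script role):

        <taskdef name="deploy" classname="org.apache.catalina.ant.DeployTask"
                 classpath="${catalina.home}/lib/catalina-ant.jar"/>

        <target name="redeploy">
          <!-- update="true" replaces the running application in place, no service restart needed -->
          <deploy url="http://server:8080/manager/text"
                  username="deployer" password="secret"
                  path="/myapp" war="file:${basedir}/dist/myapp.war" update="true"/>
        </target>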

    Read the article

  • Building SL4 + RIAServices app takes too long on VS2010.

    - by adlanelm
    I've got a Win7 box with VS2010 Premium installed on it. Building desktop apps works just fine, but we have a solution with 15 SL4 and 21 desktop projects, and building the SL part of it takes too long. This is very irritating and discourages TDD, since every time I run a test it takes ~3 seconds for MSBuild to figure out that nothing changed and the project should be skipped. The projects are very small, there's nothing fancy in them, and we didn't have any problems before we switched from VS2008+SL3. I've heard people complaining about VS2010 speed in general, but nothing about SL4 build time. Is anyone experiencing the same problems, and is there a workaround?

    Read the article
