Search Results

Search found 26098 results on 1044 pages for 'self build'.

  • Is there a tool to build and test a local change on multiple platforms

    - by Ben
    A company I used to work for was plagued by build breakages, so they made a tool that would zip up a developer's local changes (which it detected from SCM) and send them to a remote server for a test build. The remote server would update its copy of the source from the repository and then apply the changes it received from the developer. It would then build and test the changes. We targeted multiple platforms, so it would do the above for each of those platforms. When it was done, if everything was green, the developer could be reasonably confident they could submit the change without breaking the "real" build. Are there any tools out there that do something similar?
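
    For illustration, here is a minimal sketch of the workflow described above, assuming a Subversion working copy and a hypothetical build server that accepts a patch over HTTP and runs the try build; the server URL and the use of "svn diff" are assumptions, not part of the original tool:

      # Hypothetical sketch of the described "try build" workflow: gather local
      # changes from SCM as a patch and POST them to a remote build server.
      # The server URL and the use of "svn diff" are assumptions.
      import subprocess
      import urllib.request

      def submit_try_build(server_url):
          # Ask the SCM for the developer's uncommitted changes as a patch.
          patch = subprocess.run(["svn", "diff"], capture_output=True,
                                 check=True).stdout
          # The server applies the patch to a clean checkout and then builds
          # and tests it on each configured platform.
          req = urllib.request.Request(server_url, data=patch,
                                       headers={"Content-Type": "text/x-diff"})
          with urllib.request.urlopen(req) as resp:
              print(resp.read().decode())

      submit_try_build("http://buildfarm.example.com/try")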

    Read the article

  • How to get the source code of a specific build of a web application on a TFS 2008 server

    - by CHAMPION
    Hi, I have created a web project on a TFS server and set up a build for this application, which builds it daily. I want to give a specific version of the build to the testing team. If that version was built successfully two or three days ago, how can I get the source code of that particular build? Thanks and regards, CHAMPION
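
    One way to do this: Team Build labels the sources of every build it runs, so the sources of an older successful build can be fetched with a label version spec. A small sketch, shelling out to tf.exe from Python; the label name below is a hypothetical example (the real one is shown in the build's details):

      # Sketch: fetch the exact sources a past TFS build used, via the label
      # that Team Build applied to that build. The label name is hypothetical.
      import subprocess

      build_label = "MyWebApp_20100427.3"  # hypothetical build label
      subprocess.run(
          ["tf", "get", "$/MyTeamProject", "/recursive",
           "/version:L" + build_label],  # L<label> = label version spec
          check=True,
      )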

    Read the article

  • UppercuT – Custom Extensions Now With PowerShell and Ruby

    - by Robz / Fervent Coder
    Arguably, one of the most powerful features of UppercuT (UC) is the ability to extend any step of the build process with a pre, post, or replace hook. This customization is done in a separate location from the build, so you can upgrade without wondering whether you broke the build. There is a hook before each step of the build runs. There is a hook after. And, back to power again, there is a replacement hook: if you don't like what a step is doing and/or you want to replace its entire functionality, you just drop in a custom replacement extension and UppercuT will perform the custom step instead. Up until recently, all custom hooks had to be written in NAnt. Now they are a little sweeter, because you no longer need to use NAnt to extend UC if you don't want to. You can use PowerShell. Or Ruby. Let that sink in for a moment. You don't even need to interact with NAnt at all now.

    Extension Points

    On the wiki, all of the extension points are shown. The basic idea is that you put whatever customization you are doing in a separate folder named build.custom. Let's take a look at all we can customize. The start point is default.build. It calls build.custom/default.pre.build if it exists, then it runs build/default.build (normal tasks) OR build.custom/default.replace.build if it exists, and finally build.custom/default.post.build if it exists. Every step below runs with the same extension points, but changes the file name it is looking for. NOTE: If you include default.replace.build, nothing else will run, because everything is called from default.build.

    * policyChecks.step
    * versionBuilder.step
      NOTE: If you include build.custom/versionBuilder.replace.step, the items below will not run.
      - svn.step, tfs.step, or git.step (the custom tasks for these need to go in build.custom/versioners)
    * generateBuildInfo.step
    * compile.step
    * environmentBuilder.step
    * analyze.step
      NOTE: If you include build.custom/analyze.replace.step, the items below will not run.
      - test.step (the custom tasks for this need to go in build.custom/analyzers)
        NOTE: If you include build.custom/analyzers/test.replace.step, the items below will not run.
        + mbunit2.step, gallio.step, or nunit.step (the custom tasks for these need to go in build.custom/analyzers)
      - ncover.step (the custom tasks for this need to go in build.custom/analyzers)
      - ndepend.step (the custom tasks for this need to go in build.custom/analyzers)
      - moma.step (the custom tasks for this need to go in build.custom/analyzers)
    * package.step
      NOTE: If you include build.custom/package.replace.step, the items below will not run.
      - deploymentBuilder.step

    Customize UppercuT Builds With PowerShell

    UppercuT can now be extended with PowerShell (PS). To customize any extension point with PS, just add .ps1 to the end of the file name and write your custom tasks in PowerShell. If you are not signing your scripts, you will need to change a setting in the UppercuT.config file. This does impose a security risk, because it allows PS to run any PS script. This setting stays that way on ANY machine that runs the build until manually changed by someone. I'm not responsible if you mess up your machine or anyone else's by doing this. You've been warned. Now that you are fully aware of any security holes you may open, and are okay with that, let's move on. Let's create a file called default.replace.build.ps1 in the build.custom folder. Open that file in Notepad and let's add this to it:

      write-host "hello - I'm a custom task written in Powershell!"

    Now, let's run build.bat. You could get some PSake action going here; I won't dive into that in this post, though.

    Customize UppercuT Builds With Ruby

    If you want to customize any extension point with Ruby, just add .rb to the end of the file name and write your custom tasks in Ruby. Let's write a custom Ruby task for UC. If you were thinking it would be the same as the one we just wrote for PS, you'd be right! In the build.custom folder, let's create a file called default.replace.build.rb. Open that file in Notepad and let's put this in there:

      puts "I'm a custom ruby task!"

    Now, let's run build.bat again. That's chunky bacon.

    UppercuT and Albacore.NET

    Just for fun, I wanted to see if I could replace the compile.step with a Rake task. Not just any Rake task: Albacore's msbuild task. Albacore is a suite of Rake tasks brought about by Derick Bailey to make building .NET with Rake easier. It has quite a bit of support from developers that are using Rake to build code. In my build.custom folder, I drop a compile.replace.step.rb. I also put in a separate file that will contain my Albacore Rake task, and I call that compile.rb. What are the contents of compile.replace.step.rb?

      rake = 'rake'
      arguments = '-f ' + Dir.pwd + '/../build.custom/compile.rb'
      #puts "Calling #{rake} " + arguments
      system("#{rake} " + arguments)

    Since the custom extensions call Ruby, we have to shell back out and call Rake. That's what we are doing here. We also realize that Ruby is called from the build folder, so we need to back out and dive into the build.custom folder to find the file that is technically next to us. What are the contents of compile.rb?

      require 'rubygems'
      require 'fileutils'
      require 'albacore'

      task :default => [:compile]

      puts "Using Ruby to compile UppercuT with Albacore Tasks"

      desc 'Compile the source'
      msbuild :compile do |msb|
        msb.properties = {
          :configuration => :Release,
          :outputpath => '../../build_output/UppercuT'
        }
        msb.targets [:clean, :build]
        msb.verbosity = "quiet"
        msb.path_to_command = 'c:/Windows/Microsoft.NET/Framework/v3.5/MSBuild.exe'
        msb.solution = '../uppercut.sln'
      end

    We are using the msbuild task here. We change the output path to the build_output/UppercuT folder. The output path has "../../" because this is based on every project. We could grab the current directory and then point the task specifically at a folder if we have projects that are at different levels. We want the verbosity to be quiet, so we set that as well. So what kind of output do you get from this? Let's run build.bat:

      custom_tasks_replace:
           [echo] Running custom tasks instead of normal tasks if C:\code\uppercut\build\..\build.custom\compile.replace.step exists.
           [exec] (in C:/code/uppercut/build)
           [exec] Using Ruby to compile UppercuT with Albacore Tasks
           [exec] Microsoft (R) Build Engine Version 3.5.30729.4926
           [exec] [Microsoft .NET Framework, Version 2.0.50727.4927]
           [exec] Copyright (C) Microsoft Corporation 2007. All rights reserved.

    If you think this is awesome, you'd be right! With this knowledge you shall build.

    Read the article

  • Does it make sense to write build scripts in C++?

    - by Klaim
    I'm using CMake to generate my projects' IDE files/makefiles, but I still need to call custom "scripts" to manipulate my compiled files or even generate code. In previous projects I've been using Python and it was OK, but now I'm having serious trouble managing a lot of dependencies in the two very big projects I'm working on, so I want to minimize dependencies everywhere. Someone suggested I use C++ to write my build scripts, instead of adding a language dependency just for that. The projects themselves already use C++, so there are several advantages that I can see:

    - to build the whole project, only a C++ compiler and CMake would be necessary, nothing else (all the other dependencies are C or C++);
    - C++ type safety (when using modern C++) makes everything easier to get "correct";
    - it's also the language I know best, so I'm more at ease with it, even if I'm able to write some good Python code;
    - potential gain in execution speed (but I don't think it will really be perceptible).

    However, I think there might be some drawbacks, and I'm not sure of the real impact because I haven't tried it yet:

    - it might take longer to write the code (that said, I'm efficient enough in C++ to write something that works quickly, so maybe it wouldn't take so long to write this system) (compilation time shouldn't be a problem in this case);
    - I must assume that all the text files I'll read as input are in UTF-8; I'm not sure this can be easily checked at runtime in C++, and the language will not check it for you;
    - libraries in C++ are harder to manage than in scripting languages.

    I lack experience and foresight, so maybe I'm missing advantages and drawbacks. So the question is: does it make sense to use C++ for this? Do you have experiences to report, and do you see advantages and disadvantages that might be important?

    Read the article

  • How to build Visual Studio Setup projects (.vdproj) with TFS 2010 Build?

    - by Vishal
    Out of the box, Team Foundation Server 2010 Build does not support building setup projects (.vdproj). However, you can modify DefaultTemplate.xaml, or create your own template, in order to achieve this. I had to try a bunch of different blog posts and finally got it working with a mixture of all those posts. So that you don't have to go through this pain again, I have uploaded the template, which you can use right away: https://skydrive.live.com/redir?resid=65B2671F6B93CDE9!310

    - Download and check this template into your source control.
    - Modify your build definition to use this template. Unless you have checked in the template, it won't show up in the template selection section in the process task of the build definition.
    - In your Visual Studio solution's Configuration Manager, make sure you specify that the setup project should also be built.

    You might get this warning in your build result: "The project file "*.vdproj" is not supported by MSBuild and cannot be built." Hope it helps. Thanks, Vishal Mody

    Reference blog posts I used:
    - http://geekswithblogs.net/jakob/archive/2010/05/14/building-visual-studio-setup-projects-with-tfs-2010-team-build.aspx
    - http://donovanbrown.com/post/I-need-to-build-a-project-that-is-not-supported-by-MSBuild.aspx
    - http://lajak.wordpress.com/2011/02/19/build-biztalk-deployment-framework-projects-using-tfs2010/

    Read the article

  • Why did I fail to build vsftpd?

    - by hugemeow
    I run make, and it fails with the message below. The main point is "/lib/libcap.so.1: could not read symbols: File in wrong format", which is confusing.

      gcc -c readwrite.c -O2 -Wall -W -Wshadow -idirafter dummyinc
      gcc -c opts.c -O2 -Wall -W -Wshadow -idirafter dummyinc
      gcc -c ssl.c -O2 -Wall -W -Wshadow -idirafter dummyinc
      gcc -c sslslave.c -O2 -Wall -W -Wshadow -idirafter dummyinc
      gcc -c ptracesandbox.c -O2 -Wall -W -Wshadow -idirafter dummyinc
      gcc -c ftppolicy.c -O2 -Wall -W -Wshadow -idirafter dummyinc
      gcc -c sysutil.c -O2 -Wall -W -Wshadow -idirafter dummyinc
      gcc -c sysdeputil.c -O2 -Wall -W -Wshadow -idirafter dummyinc
      gcc -o vsftpd main.o utility.o prelogin.o ftpcmdio.o postlogin.o privsock.o tunables.o ftpdataio.o secbuf.o ls.o postprivparent.o logging.o str.o netstr.o sysstr.o strlist.o banner.o filestr.o parseconf.o secutil.o ascii.o oneprocess.o twoprocess.o privops.o standalone.o hash.o tcpwrap.o ipaddrparse.o access.o features.o readwrite.o opts.o ssl.o sslslave.o ptracesandbox.o ftppolicy.o sysutil.o sysdeputil.o -Wl,-s `./vsf_findlibs.sh`
      /lib/libcap.so.1: could not read symbols: File in wrong format
      collect2: ld returned 1 exit status
      make: *** [vsftpd] Error 1

      [mirror@hugemeow vsftpd]$ ls /lib/libc
      libc-2.5.so  libcap.so.1.10  libcidn.so.1  libcom_err.so.2.1  libcrypto.so.0.9.8e  libcrypt.so.1
      libcap.so.1  libcidn-2.5.so  libcom_err.so.2  libcrypt-2.5.so  libcrypto.so.6  libc.so.6
      [mirror@hugemeow vsftpd]$ ls /lib/libcap.so.1 -lh
      lrwxrwxrwx 1 root root 14 Mar 20  2012 /lib/libcap.so.1 -> libcap.so.1.10
      [mirror@hugemeow vsftpd]$ ls /lib/libcap.so.1 -lhL
      -rwxr-xr-x 1 root root 12K Mar 15  2007 /lib/libcap.so.1

    This may have something to do with this being a 64-bit system, but I have made a modification to vsf_findlibs.sh:

      48 # Look for libcap (capabilities)
      49 if locate_library /lib64/libcap.so.1; then
      50   echo "/lib/libcap.so.1";
      51 elif locate_library /lib64/libcap.so.2; then
      52   echo "/lib/libcap.so.2";
      53 else
      54   # locate_library /usr/lib/libcap.so && echo "-lcap";
      55   # locate_library /lib/libcap.so && echo "-lcap";
      56   locate_library /lib64/libcap.so.1 && echo "-lcap";
      57 fi

    Yet make fails with the same error. Why?
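
    For context, "File in wrong format" from the linker usually means an architecture mismatch, e.g. a 32-bit /lib/libcap.so.1 handed to a 64-bit link. A quick way to check which flavor a library is, sketched here in Python (the paths are examples):

      # Minimal sketch: report whether a shared library is 32- or 64-bit by
      # reading byte 4 (EI_CLASS) of its ELF header. Paths are examples.
      def elf_class(path):
          with open(path, "rb") as f:
              ident = f.read(5)
          if ident[:4] != b"\x7fELF":
              return "not an ELF file"
          return {1: "32-bit", 2: "64-bit"}.get(ident[4], "unknown")

      for lib in ("/lib/libcap.so.1", "/lib64/libcap.so.1"):
          print(lib, "->", elf_class(lib))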

    Read the article

  • Cannot build digiKam

    - by Tichomir Mitkov
    I'm trying to compile digiKam 2.8.0. I have installed the required libraries, but CMake seems to get stuck without any meaningful reason. Here is the output of CMake:

      $ cmake -DCMAKE_BUILD_TYPE=relwithdebinfo -DCMAKE_INSTALL_PREFIX=/usr/local .
      -- Found Qt-Version 4.7.1 (using /usr/bin/qmake)
      -- Found X11: /usr/lib64/libX11.so
      -- Found KDE 4.6 include dir: /usr/include
      -- Found KDE 4.6 library dir: /usr/lib64
      -- Found the KDE4 kconfig_compiler preprocessor: /usr/bin/kconfig_compiler
      -- Found automoc4: /usr/bin/automoc4
      -- Local kdegraphics libraries will be compiled... YES
      -- Handbooks will be compiled..................... YES
      -- Extract translations files..................... NO
      -- Translations will be compiled.................. YES
      -- ----------------------------------------------------------------------------------
      -- Starting CMake configuration for: libmediawiki
      -----------------------------------------------------------------------------
      -- The following external packages were located on your system.
      -- This installation will have the extra features provided by these packages.
      -----------------------------------------------------------------------------
       * QJSON - Qt library for handling JSON data
      -----------------------------------------------------------------------------
      -- Congratulations! All external packages have been found.
      -----------------------------------------------------------------------------
      -- ----------------------------------------------------------------------------------
      -- Starting CMake configuration for: libkgeomap
      -- Found Qt-Version 4.7.1 (using /usr/bin/qmake)
      -- Found X11: /usr/lib64/libX11.so
      -- Check Kexiv2 library in local sub-folder...
      -- Found Kexiv2 library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkexiv2
      -- kexiv2 found, the demo application will be compiled.
      -- ----------------------------------------------------------------------------------
      -- Starting CMake configuration for: libkface
      -- Found Qt-Version 4.7.1 (using /usr/bin/qmake)
      -- Found X11: /usr/lib64/libX11.so
      -- First try at finding OpenCV...
      -- Great, found OpenCV on the first try.
      -- OpenCV Root directory is /usr/share/opencv
      -- External libface was not found, use internal version instead...
      -- ----------------------------------------------------------------------------------
      -- Starting CMake configuration for: kipi-plugins
      -- Check Kexiv2 library in local sub-folder...
      -- Found Kexiv2 library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkexiv2
      -- Check for Kdcraw library in local sub-folder...
      -- Found Kdcraw library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkdcraw
      CMake Error at extra/libkdcraw/cmake/modules/FindKdcraw.cmake:137 (file):
        file Internal CMake error when trying to open file:
        /home/tichomir/Downloads/digikam-2.8.0/extra/libkdcraw/libkdcraw/version.h
        for reading.
      Call Stack (most recent call first):
        extra/kipi-plugins/CMakeLists.txt:123 (FIND_PACKAGE)
      -- Check Kipi library in local sub-folder...
      -- Found Kipi library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkipi
      CMake Warning at extra/kipi-plugins/CMakeLists.txt:139 (MESSAGE):
        libkdcraw: Version information not found, your version is probably too old.
      -- Found GObject libraries: /usr/lib64/libgobject-2.0.so;/usr/lib64/libgmodule-2.0.so;/usr/lib64/libgthread-2.0.so;/usr/lib64/libglib-2.0.so
      -- Found GObject includes : /usr/include/glib-2.0/gobject
      -- Check for Ksane library in local sub-folder...
      -- Found Ksane library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libksane
      -- Check for KGeoMap library in local sub-folder...
      -- Found KGeoMap library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkgeomap
      -- Check Mediawiki library in local sub-folder...
      -- Found Mediawiki library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libmediawiki
      -- Check Vkontakte library in local sub-folder...
      -- Found Vkontakte library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkvkontakte
      -- Boost version: 1.38.0
      -- libkgeomap: Found version 2.0.0
      -- Found X11: /usr/lib64/libX11.so
      -- CMake version: cmake version 2.8.9
      -- CMake version (cleaned): cmake version 2.8.9
      --
      -- ----------------------------------------------------------------------------------
      -- kipi-plugins 2.8.0 dependencies results <http://www.digikam.org>
      --
      -- libjpeg library found.................... YES
      -- libtiff library found.................... YES
      -- libpng library found..................... YES
      -- libkipi library found.................... YES
      -- libkexiv2 library found.................. YES
      -- libkdcraw library found.................. YES
      -- libxml2 library found.................... YES (optional)
      -- libxslt library found.................... YES (optional)
      -- libexpat library found................... YES (optional)
      -- native threads support library found..... YES (optional)
      -- libopengl library found.................. YES (optional)
      -- Qt4 OpenGL module found.................. YES
      -- libopencv library found.................. YES (optional)
      -- QJson library found...................... YES (optional)
      -- libgpod library found.................... YES (optional)
      -- Gdk library found........................ YES (optional)
      -- libkdepim library found.................. YES (optional)
      -- qca2 library found....................... YES (optional)
      -- libkgeomap library found................. YES (optional)
      -- libmediawiki library found............... YES (optional)
      -- libkvkontakte library found.............. YES (optional)
      -- boost library found...................... YES (optional)
      -- OpenMP library found..................... YES (optional)
      -- libX11 library found..................... YES (optional)
      -- libksane library found................... YES (optional)
      --
      -- kipi-plugins will be compiled............ YES
      -- Shwup will be compiled................... YES (optional)
      -- YandexFotki will be compiled............. YES (optional)
      -- HtmlExport will be compiled.............. YES (optional)
      -- AdvancedSlideshow will be compiled....... YES (optional)
      -- ImageViewer will be compiled............. YES (optional)
      -- AcquireImages will be compiled........... YES (optional)
      -- DNGConverter will be compiled............ YES (optional)
      -- RemoveRedEyes will be compiled........... YES (optional)
      -- Debian Screenshots will be compiled...... YES (optional)
      -- Facebook will be compiled................ YES (optional)
      -- Imgur will be compiled................... YES (optional)
      -- VKontakte will be compiled............... YES (optional)
      -- IpodExport will be compiled.............. YES (optional)
      -- Calendar will be compiled................ YES (optional)
      -- GPSSync will be compiled................. YES (optional)
      -- Mediawiki will be compiled............... YES (optional)
      -- Panorama will be compiled................ YES (optional)
      -- ----------------------------------------------------------------------------------
      --
      -- ----------------------------------------------------------------------------------
      -- Starting CMake configuration for: digiKam
      -- Check for Kdcraw library in local sub-folder...
      -- Found Kdcraw library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkdcraw
      CMake Error at extra/libkdcraw/cmake/modules/FindKdcraw.cmake:137 (file):
        file Internal CMake error when trying to open file:
        /home/tichomir/Downloads/digikam-2.8.0/extra/libkdcraw/libkdcraw/version.h
        for reading.
      Call Stack (most recent call first):
        core/CMakeLists.txt:156 (FIND_PACKAGE)
      -- Check Kexiv2 library in local sub-folder...
      -- Found Kexiv2 library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkexiv2
      -- Check Kipi library in local sub-folder...
      -- Found Kipi library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkipi
      -- Check Kface library in local sub-folder...
      -- Found Kface library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkface
      -- Check for KGeoMap library in local sub-folder...
      -- Found KGeoMap library in local sub-folder: /home/tichomir/Downloads/digikam-2.8.0/extra/libkgeomap
      -- PGF_INCLUDE_DIRS = /usr/local/include/libpgf
      -- PGF_INCLUDEDIR = /usr/local/include/libpgf
      -- PGF_LIBRARIES = pgf
      -- PGF_LDFLAGS = -L/usr/local/lib;-lpgf
      -- PGF_CFLAGS = -I/usr/local/include/libpgf
      -- PGF_VERSION = 6.12.24
      -- PGF_CODEC_VERSION_ID = 61224
      -- Could NOT find any working clapack installation
      -- Boost version: 1.38.0
      -- Check for LCMS1 availability...
      -- Found LCMS1: /usr/lib64/liblcms.so /usr/include
      -- Paralelized PGF codec disabled...
      -- Identified libjpeg version: 62
      -- Found MySQL server executable at: /usr/sbin/mysqld
      -- Found MySQL install_db executable at: /usr/bin/mysql_install_db
      CMake Warning at core/CMakeLists.txt:310 (MESSAGE):
        libkdcraw: Version information not found, your version is probably too old.
      -- libkgeomap: Found version 2.0.0
      -- Found gphoto2: -L/usr/lib64 -lgphoto2_port;-L/usr/lib64 -lgphoto2 -lgphoto2_port -lm
      -- WARNING: you are using the obsolete 'PKGCONFIG' macro, use FindPkgConfig
      -- WARNING: you are using the obsolete 'PKGCONFIG' macro, use FindPkgConfig
      -- PKGCONFIG() indicates that lqr-1 is not installed (install the package which contains lqr-1.pc if you want to support this feature)
      -- Could NOT find Lqr-1 (missing: LQR-1_INCLUDE_DIRS LQR-1_LIBRARIES)
      -- Found SharedDesktopOntologies: /usr/share/ontology
      -- Found SharedDesktopOntologies: /usr/share/ontology (found version "0.5.0", required is "0.2")
      --
      -- ----------------------------------------------------------------------------------
      -- digiKam 2.8.0 dependencies results <http://www.digikam.org>
      --
      -- Qt4 SQL module found..................... YES
      -- MySQL Server found....................... YES
      -- MySQL install_db tool found.............. YES
      -- libtiff library found.................... YES
      -- libpng library found..................... YES
      -- libjasper library found.................. YES
      -- liblcms library found.................... YES
      -- Boost Graph library found................ YES
      -- libkipi library found.................... YES
      -- libkexiv2 library found.................. YES
      -- libkdcraw library found.................. YES
      -- libkface library found................... YES
      -- libkgeomap library found................. YES
      -- libpgf library found..................... YES (optional)
      -- libclapack library found................. NO (optional - internal version used instead)
      -- libgphoto2 and libusb libraries found.... YES (optional)
      -- libkdepimlibs library found.............. YES (optional)
      -- Nepomuk libraries found.................. YES (optional)
      -- libglib2 library found................... YES (optional)
      -- liblqr-1 library found................... NO (optional - internal version used instead)
      -- liblensfun library found................. YES (optional)
      -- Doxygen found............................ YES (optional)
      -- digiKam can be compiled.................. YES
      -- ----------------------------------------------------------------------------------
      --
      -- Adjusting compilation flags for GCC version ( 4.5.1 )
      -- Configuring incomplete, errors occurred!

    Actually, this part shows the error:

      CMake Error at extra/libkdcraw/cmake/modules/FindKdcraw.cmake:137 (file):
        file Internal CMake error when trying to open file:
        /home/tichomir/Downloads/digikam-2.8.0/extra/libkdcraw/libkdcraw/version.h
        for reading.

    'version.h' doesn't exist; instead there is a file 'version.h.cmake'. I have installed libkdcraw (64-bit) from sources. I'm using OpenSuse.

    Read the article

  • Manager Self Service at your Fingertips

    - by Elaine Clement
    Last week we released new and improved Manager Self Service capabilities in PeopleSoft HCM 9.1. We delivered a new Manager Dashboard, streamlined many Manager Self Service transactions, provided new Pivot Grid capabilities, and implemented one-click Related Actions accessible from multiple places - all with the goal of improving every manager's self service experience.

    Manager Dashboard

    These new capabilities have the potential to significantly impact an organization's bottom line, and here is why.

    Increased Efficiency. The Manager Dashboard provides a 'one-stop shop' for your managers, with all of the key data they need consolidated into a single view. Alerts notifying managers of important tasks are immediately viewable and actionable. Administrators can configure the dashboard to include the most important pagelets needed for their organization, and managers can personalize it to fit their personal way of conducting their tasks. The Related Actions feature further improves the ease with which managers get their work done by providing one-click access to Manager Self Service transactions.

    Increased Job Satisfaction. The streamlined manager transactions, Related Actions, and the new Manager Dashboard provide an enhanced user experience. Managers are able to quickly get in, get the information they need, complete their transactions, and get out. Managers can spend their time focusing on getting the business results they need instead of on their day-to-day HR tasks.

    Enhanced Decision Support. Administrators can ensure the information and analytics they want their managers to use are available from the Manager Dashboard, establishing best business practices. Additional pivot grids relevant to your own organization can be added to the Manager Dashboard. With this easy access to the relevant information in an easily understood format, managers can make the right business decisions needed to improve their team and their team's productivity.

    For more details on the Manager Dashboard and some of the other newly posted features, such as a new Talent Summary, check out this video and others: Oracle PeopleSoft Webcasts

    Read the article

  • Build Controller status Unavailable issue in TFS2010

    - by jehan
    I ran into this problem a few days back: I was not able to run builds because the build controller was showing the status Unavailable. It was showing the exception below:

      There was no endpoint listening at http://fullmachinename:9191/Build/v3.0/Services/Controller/2 that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details.

    After trying out a few things, I looked at the Build Service properties and made the following modifications:

    1) Changed the Local Build Service Endpoint (incoming) from http://machinename.domain.com:9191 to http://machinename:9191
    2) Changed the Connect to Team Project Collection (outgoing) from localhost to the machine name: http://localhost:8080/tfs/defaultCollection to http://machinename:8080/tfs/DefaultCollection

    After that I started the build service, and it fixed the issue: the build controller showed Available status and was able to run builds.

    Read the article

  • What exactly is the build number in MAJOR.MINOR.BUILDNUMBER.REVISION?

    - by A9S6
    What I think about build numbers is that whenever a new nightly build is created, a new BUILDNUMBER is generated and assigned to that build. So for my 7.0 version application the nightly builds will be 7.0.1, 7.0.2 and so on. Is that right? Then what is the use of a REVISION after the build number? Or is the REVISION part incremented after each nightly build? I am a little confused here... do we refer to each nightly build as a BUILD? The format is mentioned here: AssemblyVersion - MSDN
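
    By convention (and it is only a convention; .NET does not enforce any meaning for the fields), the build number identifies a particular build and the revision distinguishes respins of that same build. An illustrative sketch:

      # Illustrative sketch of the common MAJOR.MINOR.BUILD.REVISION reading:
      # a new nightly bumps BUILD; rebuilding the same sources (a respin or
      # hotfix) bumps REVISION. This is a convention, not a .NET rule.
      from typing import NamedTuple

      class Version(NamedTuple):
          major: int
          minor: int
          build: int
          revision: int

          def next_nightly(self):
              # New nightly build: bump the build number, reset the revision.
              return self._replace(build=self.build + 1, revision=0)

          def respin(self):
              # Same build rebuilt/patched: bump only the revision.
              return self._replace(revision=self.revision + 1)

      v = Version(7, 0, 1, 0)
      print(v.next_nightly())  # Version(major=7, minor=0, build=2, revision=0)
      print(v.respin())        # Version(major=7, minor=0, build=1, revision=1)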

    Read the article

  • Custom tab bars and how to make them circulate

    - by pengwang
    I want to make custom tab bars and have them circulate as the user slides. The default tab bar only shows 5 items at the same time, which does not meet my needs; I have 11 items, so I want to make 3 tab bars, each with up to 5 items, for example A(0-4)--B(5-9)--C(10)--A--B--C--A. At present I have only finished A(0-4)--B(5-9)--C(10); how do I make them circulate? My code:

    .h file:

      #import <UIKit/UIKit.h>

      @protocol InfiniTabBarDelegate;

      @interface InfiniTabBar : UIScrollView <UIScrollViewDelegate, UITabBarDelegate> {
          __weak id <InfiniTabBarDelegate> infiniTabBarDelegate;
          NSMutableArray *tabBars;
          UITabBar *aTabBar;
          UITabBar *bTabBar;
      }

      @property (nonatomic, weak) id infiniTabBarDelegate;
      @property (strong, nonatomic) NSMutableArray *tabBars;
      @property (strong, nonatomic) UITabBar *aTabBar;
      @property (strong, nonatomic) UITabBar *bTabBar;

      - (id)initWithItems:(NSArray *)items;
      - (void)setBounces:(BOOL)bounces;
      // Don't set more items than initially
      - (void)setItems:(NSArray *)items animated:(BOOL)animated;
      - (int)currentTabBarTag;
      - (int)selectedItemTag;
      - (BOOL)scrollToTabBarWithTag:(int)tag animated:(BOOL)animated;
      - (BOOL)selectItemWithTag:(int)tag;
      @end

      @protocol InfiniTabBarDelegate <NSObject>
      - (void)infiniTabBar:(InfiniTabBar *)tabBar didScrollToTabBarWithTag:(int)tag;
      - (void)infiniTabBar:(InfiniTabBar *)tabBar didSelectItemWithTag:(int)tag;
      @end

    .m file:

      @implementation InfiniTabBar

      @synthesize infiniTabBarDelegate;
      @synthesize tabBars;
      @synthesize aTabBar;
      @synthesize bTabBar;

      - (id)initWithItems:(NSArray *)items {
          self = [super initWithFrame:CGRectMake(0.0, 411.0, 320.0, 49.0)];
          // TODO:
          //self = [super initWithFrame:CGRectMake(self.superview.frame.origin.x + self.superview.frame.size.width - 320.0, self.superview.frame.origin.y + self.superview.frame.size.height - 49.0, 320.0, 49.0)]; // Doesn't work. self is nil at this point.
          if (self) {
              self.pagingEnabled = YES;
              self.delegate = self;
              self.tabBars = [[NSMutableArray alloc] init];
              float x = 0.0;
              for (double d = 0; d < ceil(items.count / 5.0); d ++) {
                  UITabBar *tabBar = [[UITabBar alloc] initWithFrame:CGRectMake(x, 0.0, 320.0, 49.0)];
                  tabBar.delegate = self;
                  int len = 0;
                  for (int i = d * 5; i < d * 5 + 5; i ++)
                      if (i < items.count) len ++;
                  tabBar.items = [items objectsAtIndexes:[NSIndexSet indexSetWithIndexesInRange:NSMakeRange(d * 5, len)]];
                  // NSLog(@"####%@", NSMakeRange(d * 5, len));
                  [self.tabBars addObject:tabBar];
                  [self addSubview:tabBar];
                  x += 320.0;
              }
              self.contentSize = CGSizeMake(x, 49.0);
          }
          return self;
      }

      - (void)setBounces:(BOOL)bounces {
          if (bounces) {
              int count = self.tabBars.count;
              if (count > 0) {
                  if (self.aTabBar == nil)
                      self.aTabBar = [[UITabBar alloc] initWithFrame:CGRectMake(-320.0, 0.0, 320.0, 49.0)];
                  [self addSubview:self.aTabBar];
                  if (self.bTabBar == nil)
                      self.bTabBar = [[UITabBar alloc] initWithFrame:CGRectMake(count * 320.0, 0.0, 320.0, 49.0)];
                  [self addSubview:self.bTabBar];
              }
          } else {
              [self.aTabBar removeFromSuperview];
              [self.bTabBar removeFromSuperview];
          }
          [super setBounces:bounces];
      }

      - (void)setItems:(NSArray *)items animated:(BOOL)animated {
          for (UITabBar *tabBar in self.tabBars) {
              int len = 0;
              for (int i = [self.tabBars indexOfObject:tabBar] * 5; i < [self.tabBars indexOfObject:tabBar] * 5 + 5; i ++)
                  if (i < items.count) len ++;
              [tabBar setItems:[items objectsAtIndexes:[NSIndexSet indexSetWithIndexesInRange:NSMakeRange([self.tabBars indexOfObject:tabBar] * 5, len)]] animated:animated];
          }
          self.contentSize = CGSizeMake(ceil(items.count / 5.0) * 320.0, 49.0);
      }

      - (int)currentTabBarTag {
          return self.contentOffset.x / 320.0;
      }

      - (int)selectedItemTag {
          for (UITabBar *tabBar in self.tabBars)
              if (tabBar.selectedItem != nil)
                  return tabBar.selectedItem.tag;
          // No item selected
          return 0;
      }

      - (BOOL)scrollToTabBarWithTag:(int)tag animated:(BOOL)animated {
          for (UITabBar *tabBar in self.tabBars)
              if ([self.tabBars indexOfObject:tabBar] == tag) {
                  UITabBar *tabBar = [self.tabBars objectAtIndex:tag];
                  [self scrollRectToVisible:tabBar.frame animated:animated];
                  if (animated == NO) [self scrollViewDidEndDecelerating:self];
                  return YES;
              }
          return NO;
      }

      - (BOOL)selectItemWithTag:(int)tag {
          for (UITabBar *tabBar in self.tabBars)
              for (UITabBarItem *item in tabBar.items)
                  if (item.tag == tag) {
                      tabBar.selectedItem = item;
                      [self tabBar:tabBar didSelectItem:item];
                      return YES;
                  }
          return NO;
      }

      - (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView {
          [infiniTabBarDelegate infiniTabBar:self didScrollToTabBarWithTag:scrollView.contentOffset.x / 320.0];
      }

      - (void)scrollViewDidEndScrollingAnimation:(UIScrollView *)scrollView {
          [self scrollViewDidEndDecelerating:scrollView];
      }

      - (void)tabBar:(UITabBar *)cTabBar didSelectItem:(UITabBarItem *)item {
          // Act like a single tab bar
          for (UITabBar *tabBar in self.tabBars)
              if (tabBar != cTabBar) tabBar.selectedItem = nil;
          [infiniTabBarDelegate infiniTabBar:self didSelectItemWithTag:item.tag];
      }

      @end

    Read the article

  • Self-signed certificates for a known community

    - by costlow
    Recently announced changes scheduled for Java 7 update 51 (January 2014) have established that the default security slider will require code signatures and the Permissions Manifest attribute. Code signatures are a common practice recommended in the industry because they help determine that the code your computer will run is the same code that the publisher created. This post is written to help users that need to use self-signed certificates without involving a public Certificate Authority.

    The role of self-signed certificates within a known community

    You may still use self-signed certificates within a known community. The difference between self-signed and purchased-from-CA is that your users must import your self-signed certificate to indicate that it is valid, whereas Certificate Authorities are already trusted by default. This works for known communities, where people will trust that my certificate is mine, but it does not scale widely, where I cannot actually contact or know the systems that will need to trust my certificate. Public Certificate Authorities are widely trusted already because they abide by many different requirements and frequent checks. An example would be students in a university class sharing their public certificates on a mailing list or web page, employees publishing on the intranet, or a system administrator rolling certificates out to end-users. Managed machines help this because you can automate the rollout, but they are not required -- the major point is simply that people will trust and import your certificate.

    How to distribute self-signed certificates for a known community

    There are several steps required to distribute a self-signed certificate to users so that they will properly trust it. These steps are:

    1. Creating a public/private key pair for signing.
    2. Exporting your public certificate for others.
    3. Importing your certificate onto machines that should trust you.
    4. Verifying the work on a different machine.

    Creating a public/private key pair for signing

    Having a public/private key pair will give you the ability both to sign items yourself and to issue a Certificate Signing Request (CSR) to a certificate authority. Create your public/private key pair by following the instructions for creating key pairs. Every Certificate Authority that I looked at provided similar instructions, but for the sake of cohesiveness I will include the commands that I used here. Generate the key pair:

      keytool -genkeypair -alias erikcostlow -keyalg EC -keysize 571 -validity 730 -keystore javakeystore_keepsecret.jks

    Provide a good password for this file. The alias "erikcostlow" is my name and therefore easy to remember; substitute your name or something like "mykey." The keyalg of EC (Elliptical Curve) and keysize of 571 will give your key a good strong lifetime. All keys are set to expire; two years or 730 days is a reasonable compromise between not-long-enough and too-long. Most public Certificate Authorities will sign something for one to five years. You will be placing your keys in javakeystore_keepsecret.jks -- this file will contain private keys and therefore should not be shared. If someone else gets these private keys, they can impersonate your signature. Please be cautious about automated cloud backup systems and private key stores. Answer all the questions. It is important to provide good answers, because you will stick with them for the "-validity" days that you specified above:

      What is your first and last name?
        [Unknown]:  First Last
      What is the name of your organizational unit?
        [Unknown]:  Line of Business
      What is the name of your organization?
        [Unknown]:  MyCompany
      What is the name of your City or Locality?
        [Unknown]:  City Name
      What is the name of your State or Province?
        [Unknown]:  CA
      What is the two-letter country code for this unit?
        [Unknown]:  US
      Is CN=First Last, OU=Line of Business, O=MyCompany, L=City, ST=CA, C=US correct?
        [no]:  yes
      Enter key password for <erikcostlow>
        (RETURN if same as keystore password):

    Verify your work:

      keytool -list -keystore javakeystore_keepsecret.jks

    You should see your new key pair.

    Exporting your public certificate for others

    Public Key Infrastructure relies on two simple concepts: the public key may be made public, and the private key must be private. By exporting your public certificate, you are able to share it with others, who can then import the certificate to trust you.

      keytool -exportcert -keystore javakeystore_keepsecret.jks -alias erikcostlow -file erikcostlow.cer

    To verify this, you can open the .cer file by double-clicking it on most operating systems. It should show the information that you entered during the creation prompts. This is the file that you will share with others. They will use this certificate to prove that artifacts signed by this certificate came from you. If you do not manage machines directly, place the certificate file in an area that people within the known community should trust, such as an intranet page.

    Import the certificate onto machines that should trust you

    In order to trust the certificate, people within your known network must import your certificate into their keystores. The first step is to verify that the certificate is actually yours, which can be done through any out-of-band channel: email, phone, in person, etc. Known networks can usually do this. Determine the right keystore:

    - For an individual user looking to trust another, the correct file is within that user's directory, e.g. USER_HOME\AppData\LocalLow\Sun\Java\Deployment\security\trusted.certs
    - For system-wide installations, Java's Certificate Authorities are in JAVA_HOME, e.g. C:\Program Files\Java\jre8\lib\security\cacerts

    File paths for Mac and Linux are included in the link above. Follow the instructions to import the certificate into the keystore:

      keytool -importcert -keystore THEKEYSTOREFROMABOVE -alias erikcostlow -file erikcostlow.cer

    In this case, I am still using my name for the alias because it's easy for me to remember. You may also use an alias of your company name.

    Scaling distribution of the import

    The easiest way to apply your certificate across many machines is to just push the .certs or cacerts file onto them. When doing this, watch out for any changes that people would have made to this file on their machines. Trusted.certs: when publishing into user directories, your file will overwrite any keys that the user has added since the last update. CACerts: it is best to re-run the import command with each installation rather than just overwriting the file. If you just keep the same cacerts file between upgrades, you will overwrite any CAs that have been added or removed. By re-importing, you stay up to date with changes.

    Verify work on a different machine

    Verification is a way of checking on the client machine to ensure that it properly trusts signed artifacts after you have added your signing certificate. Many people have started using deployment rule sets. You can validate the deployment rule set by:

    1. Creating and signing the deployment rule set on the computer that holds the private key.
    2. Copying the deployment rule set onto the different machine where you have imported the signing certificate.
    3. Verifying that the Java Control Panel's security tab shows your deployment rule set.

    Verifying an individual JAR file or multiple JAR files

    You can test a certificate chain by using the jarsigner command:

      jarsigner -verify filename.jar

    If the output does not say "jar verified", then run the following command to see why:

      jarsigner -verify -verbose -certs filename.jar

    Check the output for the term "CertPath not validated."
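
    As a rough sketch of the scaling step above, the keytool import can be re-run against a list of keystores from a small script; every path here is a hypothetical example:

      # Sketch: re-run the keytool import shown above against several
      # keystores (e.g. files gathered by an administrator). The paths are
      # hypothetical examples, not a prescription.
      import subprocess

      CERT = "erikcostlow.cer"
      KEYSTORES = [
          r"C:\Users\alice\AppData\LocalLow\Sun\Java\Deployment\security\trusted.certs",
          r"C:\Program Files\Java\jre8\lib\security\cacerts",
      ]

      for store in KEYSTORES:
          subprocess.run(
              ["keytool", "-importcert", "-noprompt",
               "-keystore", store, "-alias", "erikcostlow", "-file", CERT],
              check=True,
          )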

    Read the article

  • What's the best Scala build system?

    - by gatoatigrado
    I've seen questions about IDEs here -- "Which is the best IDE for Scala development?" and "What is the current state of tooling for Scala?" -- but I've had mixed experiences with IDEs. Right now, I'm using the Eclipse IDE with the automatic workspace refresh option, and KDE 4's Kate as my text editor. Here are some of the problems I'd like to solve:

    - Use my own editor. IDEs are really geared at everyone using their components. I like Kate better, but the refresh system is very annoying (it doesn't use inotify; rather, it polls at maybe a 10 s interval). The reason I don't use the built-in text editor is that broken auto-complete functionality causes the IDE to hang for maybe 10 s.
    - Rebuild only modified files. The Eclipse build system is broken. It doesn't know when to rebuild classes. I find myself going to project-clean almost half of the time. Worse, it seems even after it has finished building my project, a few minutes later it will pop up with some bizarre error (edit - these errors appear to be things that were previously solved with a project clean, but then come back up...). Finally, setting "Preferences / Continue launch if project contains errors" to "prompt" seems to have no effect for Scala projects (i.e. it always launches even if there are errors).
    - Build customization. I can use the "nightly" release, but I'll want to modify and use my own Scala builds, not the compiler that's built into the IDE's plugin. It would also be nice to pass [e.g.] -Xprint:jvm to the compiler (to print out lowered code).
    - Fast compiling. Though Eclipse doesn't always build right, it does seem snappy -- even more so than fsc.

    I looked at Ant and Maven, though I haven't employed either yet (I'll also need to spend time solving #3 and #4). I wanted to see if anyone has other suggestions before I spend time getting a suboptimal build system working. Thanks in advance!

    UPDATE - I'm now using Maven, passing a project as a compiler plugin to it. It seems fast enough; I'm not sure what kind of jar caching Maven does. A current repository for Scala 2.8.0 is available [link]. The archetypes are very cool, and cross-platform support seems very good. However, about compile issues, I'm not sure if fsc is actually fixed, or if my project is stable enough (e.g. class names aren't changing) -- running it manually doesn't bother me as much. If you'd like to see an example, feel free to browse the pom.xml files I'm using [github].

    UPDATE 2 - From benchmarks I've seen, Daniel Spiewak is right that buildr is faster than Maven (and, if one is doing incremental changes, Maven's 10-second latency gets annoying), so if one can craft a compatible build file, then it's probably worth it...

    Read the article

  • Adding Fake Build Information in TFS 2010

    - by Jakob Ehn
    We have been using TFS 2010 Build for distributing a build in parallel on several agents, where the actual compilation is done by a bunch of external tools and compilers, i.e. no MSBuild involved. We are using the ParallelTemplate.xaml template that Jim Lamb blogged about previously, which distributes each configuration to a different agent. We developed custom activities for running these external compilers and collecting the information and errors, by reading standard out/error and pushing it back to the build log. But since we aren't using MSBuild, we don't get the nice configuration summary section on the build summary page that we are used to. We would like to show the result of each configuration with any errors/warnings as usual, together with a link to the log file.

    TFS 2010 API to the rescue! What we need to do is add information to the InformationNode structure that is associated with every TFS build. The log that you normally see in the Log view is built up as a tree structure of IBuildInformationNode objects. This structure can be accessed by using the InformationNodeConverters class. This class also contains some helper methods for creating a BuildProjectNode, which contains the information about each project that was built, for example which configuration, the number of errors and warnings, and a link to the log file.

    Here is a code snippet that first creates a "fake" build from scratch and then adds two BuildProjectNodes, one for Debug|x86 and one for Release|x86, with some release information:

      TfsTeamProjectCollection collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://lt-jakob2010:8080/tfs"));
      IBuildServer buildServer = collection.GetService<IBuildServer>();
      var buildDef = buildServer.GetBuildDefinition("TeamProject", "BuildDefinition");

      // Create fake build with random build number
      var detail = buildDef.CreateManualBuild(new Random().Next().ToString());

      // Create Debug|x86 project summary
      IBuildProjectNode buildProjectNode = detail.Information.AddBuildProjectNode(DateTime.Now, "Debug", "MySolution.sln", "x86", "$/project/MySolution.sln", DateTime.Now, "Default");
      buildProjectNode.CompilationErrors = 1;
      buildProjectNode.CompilationWarnings = 1;
      buildProjectNode.Node.Children.AddBuildError("Compilation", "File1.cs", 12, 5, "", "Syntax error", DateTime.Now);
      buildProjectNode.Node.Children.AddBuildWarning("File2.cs", 3, 1, "", "Some warning", DateTime.Now, "Compilation");
      buildProjectNode.Node.Children.AddExternalLink("Log File", new Uri(@"\\server\share\logfiledebug.txt"));
      buildProjectNode.Save();

      // Create Release|x86 project summary
      buildProjectNode = detail.Information.AddBuildProjectNode(DateTime.Now, "Release", "MySolution.sln", "x86", "$/project/MySolution.sln", DateTime.Now, "Default");
      buildProjectNode.CompilationErrors = 0;
      buildProjectNode.CompilationWarnings = 0;
      buildProjectNode.Node.Children.AddExternalLink("Log File", new Uri(@"\\server\share\logfilerelease.txt"));
      buildProjectNode.Save();

      detail.Information.Save();
      detail.FinalizeStatus(BuildStatus.Failed);

    When running this code, it will create a build that looks like this: as you can see, it created two configurations with error and warning information and a link to a log file, just like a regular MSBuild would have done. This is very useful when using TFS 2010 Build in heterogeneous environments. It would also be possible to do this when running compilations completely outside TFS Build, and then push the results into TFS for easy access. You can push all the information, including the compilation summary, drop location, test results, etc., using the API.

    Read the article

  • Can't build gcc anymore since upgrade to 11.10

    - by Raphael R.
    On Monday I upgraded from Ubuntu 11.04 (my initial installation) to 11.10, and now I can't build gcc from source anymore. Since I forgot to uninstall the gcc package before the upgrade, Ubuntu replaced my 4.7.0 compiler with its stable 4.6.1. So I tried to build the SVN sources again, but it fails. I most recently tried it with SVN revision 180193. After some time, the build fails with the following message:

      /home/raphael/devel/gcc/build/./gcc/xgcc -B/home/raphael/devel/gcc/build/./gcc/ -B/usr/i686-pc-linux-gnu/bin/ -B/usr/i686-pc-linux-gnu/lib/ -isystem /usr/i686-pc-linux-gnu/include -isystem /usr/i686-pc-linux-gnu/sys-include -g -O2 -O2 -I. -I. -I../../src/gcc -I../../src/gcc/. -I../../src/gcc/../include -I../../src/gcc/../libdecnumber -I../../src/gcc/../libdecnumber/bid -I../libdecnumber -I../../src/gcc/../libgcc -g -O2 -DIN_GCC -W -Wall -Wwrite-strings -Wcast-qual -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -isystem ./include -fPIC -g -DHAVE_GTHR_DEFAULT -DIN_LIBGCC2 -fbuilding-libgcc -fno-stack-protector -I. -I. -I../.././gcc -I../../../src/libgcc -I../../../src/libgcc/. -I../../../src/libgcc/../gcc -I../../../src/libgcc/../include -I../../../src/libgcc/config/libbid -DENABLE_DECIMAL_BID_FORMAT -DHAVE_CC_TLS -DUSE_TLS -o _ashldi3.o -MT _ashldi3.o -MD -MP -MF _ashldi3.dep -DL_ashldi3 -c ../../../src/libgcc/../gcc/libgcc2.c \
        -fvisibility=hidden -DHIDE_EXPORTS
      In file included from /usr/include/stdio.h:28:0,
                       from ../../../src/libgcc/../gcc/tsystem.h:88,
                       from ../../../src/libgcc/../gcc/libgcc2.c:29:
      /usr/include/features.h:323:26: fatal error: bits/predefs.h: File or directory not found.

    I configured it with:

      ~/devel/gcc/build$ ../src/configure --prefix=/usr --enable-languages=c++

    and built it with:

      ~/devel/gcc/build$ make -j4

    Just to be sure, I did an rm -rf * in the build directory, in case there was some broken stuff inside. It didn't help, though. That's the backstory. I tried to fix it and searched for bits/predefs.h. It's inside /usr/include/i386-linux-gnu. I temporarily fixed the problem by doing:

      ~/devel/gcc/build$ C_INCLUDE_PATH=/usr/include/i386-linux-gnu make -j4

    This is only temporary, because now gcc complains that it can't find crti.o, which I can find in /usr/lib/i386-linux-gnu. Now I could also set C_LIBRARY_PATH - actually it doesn't work - but I feel like I'm fighting the system here. Also, even if it succeeds, my newly built compiler would not know about the i386-linux-gnu stuff either, so I would have to set C_LIBRARY_PATH and C_INCLUDE_PATH before every build of every project I have. I could add them to my .bashrc, but that subverts the system even more. So, how do I tell the build process:

    - that there are additional include/lib directories, and
    - that it should build a gcc which respects them too?

    Edit: I forgot to include the command which causes the above error message. Also, I can think of another solution: copy the stuff from /usr/include/i386-linux-gnu to /usr/include (same thing for /usr/lib/i386-linux-gnu to /usr/lib). But that doesn't feel right, either. Finally, the system's gcc 4.6.1 can compile other applications just fine, except mine, which use C++11 features not present in the 4.6 series.

    Read the article

  • Python: nonblocking read from stdout of threaded subprocess

    - by sberry2A
    I have a script (worker.py) that prints unbuffered output in the form...

      1
      2
      3
      .
      .
      .
      n

    where n is some constant number of iterations a loop in this script will make. In another script (service_controller.py) I start a number of threads, each of which starts a subprocess using subprocess.Popen(stdout=subprocess.PIPE, ...). Now, in my main thread (service_controller.py) I want to read the output of each thread's worker.py subprocess and use it to calculate an estimate for the time remaining until completion. I have all of the logic working that reads the stdout from worker.py and determines the last printed number. The problem is that I can not figure out how to do this in a non-blocking way. If I read a constant bufsize, then each read will end up waiting for the same data from each of the workers. I have tried numerous ways, including using fcntl, select + os.read, etc. What is my best option here? I can post my source if needed, but I figured the explanation describes the problem well enough. Thanks for any help here.

    EDIT: Adding sample code. I have a worker that starts a subprocess:

      class WorkerThread(threading.Thread):
          def __init__(self):
              self.completed = 0
              self.data = ""
              self.process = None
              self.lock = threading.RLock()
              threading.Thread.__init__(self)

          def run(self):
              cmd = ["/path/to/script", "arg1", "arg2"]
              self.process = subprocess.Popen(cmd, stdout=subprocess.PIPE, bufsize=1, shell=False)
              #flags = fcntl.fcntl(self.process.stdout, fcntl.F_GETFL)
              #fcntl.fcntl(self.process.stdout.fileno(), fcntl.F_SETFL, flags | os.O_NONBLOCK)

          def get_completed(self):
              self.lock.acquire()
              fds = select.select([self.process.stdout.fileno()], [], [], 5)[0]
              if fds:
                  self.data += os.read(fds[0], 1)
                  try:
                      self.completed = int(self.data.split("\n")[-2])
                  except IndexError:
                      pass
              self.lock.release()
              return self.completed

    I then have a ThreadManager:

      class ThreadManager():
          def __init__(self):
              self.pool = []
              self.running = []
              self.lock = threading.Lock()

          def clean_pool(self, pool):
              for worker in [x for x in pool if not x.isAlive()]:
                  worker.join()
                  pool.remove(worker)
                  del worker
              return pool

          def run(self, concurrent=5):
              while len(self.running) + len(self.pool) > 0:
                  self.clean_pool(self.running)
                  n = min(max(concurrent - len(self.running), 0), len(self.pool))
                  if n > 0:
                      for worker in self.pool[0:n]:
                          worker.start()
                      self.running.extend(self.pool[0:n])
                      del self.pool[0:n]
                  time.sleep(.01)
              for worker in self.running + self.pool:
                  worker.join()

    and some code to run it:

      threadManager = ThreadManager()
      for i in xrange(0, 5):
          threadManager.pool.append(WorkerThread())
      threadManager.run()

    I have stripped out a lot of the other code in hopes to try to pinpoint the issue.
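
    One pattern that fits this problem is a single select() across all of the worker pipes, reading only from descriptors that are ready, so no one worker can block the others. A minimal POSIX-only sketch (the worker command is a placeholder):

      # Minimal sketch (POSIX): multiplex the stdout pipes of several worker
      # subprocesses with one select() call; os.read() on a ready descriptor
      # returns immediately, so a slow worker never blocks the others.
      import os
      import select
      import subprocess

      workers = [subprocess.Popen(["/path/to/script", str(i)],  # placeholder
                                  stdout=subprocess.PIPE)
                 for i in range(5)]
      buffers = dict((w.stdout.fileno(), b"") for w in workers)

      while any(w.poll() is None for w in workers):
          ready, _, _ = select.select(list(buffers), [], [], 1.0)
          for fd in ready:
              chunk = os.read(fd, 4096)
              if chunk:
                  buffers[fd] += chunk
                  lines = buffers[fd].split(b"\n")
                  if len(lines) > 1:  # at least one complete line so far
                      print("fd %d: last value %s" % (fd, lines[-2].decode()))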

    Read the article

  • InstallShield 2010 with license - no license for automatic build system (CI) as Windows service

    - by Gilad
    I really need help here. We are using a CI build process (Hudson) as an automated build system using MSBuild. The CI server runs in Apache Tomcat 6, which runs under the credentials of a domain user (not a local Windows user). Every time the CI tries to build an InstallShield project (using .isproj files) we get a license error message:

        C:\Program Files\MSBuild\InstallShield\2010\InstallShield.targets(62,3): error : -7159: The product license has expired or has not yet been initialized. You must launch the IDE to configure the product license in order to proceed.
        C:\Program Files\MSBuild\InstallShield\2010\InstallShield.targets(62,3): error : Exception Caught

    If I log in to the same machine with the same domain user credentials and build the InstallShield project, there is a license and it works well. Adding the user to the local Users group doesn't help (no license). Adding the user to the local Administrators group helps, and then it works. We do not want the user to be in the local Administrators group, for various reasons. What do I need to do to make this work? Do I need to grant additional permissions to the user? Help will be highly appreciated. Gilad

    Read the article

  • Build System with Recursive Dependency Aggregation

    - by radman
    Hi, I recently began setting up my own library and projects using a cross-platform build system (one that generates make files, Visual Studio solutions/projects, etc. on demand) and I have run into a problem that has likely been solved already.

    The issue is this: when an application has a dependency that also has dependencies, the application being linked must link the dependency and also all of its sub-dependencies. This proceeds in a recursive fashion. For argument's sake, let's assume that we are dealing exclusively with static libraries:

        TopLevelApp.exe
            dependency_A
                dependency_A-1
                dependency_A-2
            dependency_B
                dependency_B-1
                dependency_B-2

    So in this example TopLevelApp will need to link dependency_A, dependency_A-1, dependency_A-2, etc., and the same for B. Having to remember all of these manually in the target application is pretty suboptimal. There is also the issue of ensuring that the same version of a dependency is used across all targets (assuming that some targets depend on the same things, e.g. boost). Linking all of the libraries is required and there is no way of getting around it. What I am looking for is a build system that manages this for you, so that all you have to do is specify that you depend on something and the appropriate dependencies of that library are pulled in automatically. The build system I have been looking at, premake4, doesn't handle this (as far as I can determine). Does anyone know of a build system that does? And if there isn't one, why not?
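    To make the requirement concrete, here is a small, hypothetical sketch (in Python, independent of any particular build tool) of the aggregation step such a build system has to perform: walk the dependency graph recursively and emit a de-duplicated link list for the top-level target. The graph mirrors the example above; the code is illustrative only.

        # Hypothetical dependency graph: target -> direct dependencies.
        DEPS = {
            "TopLevelApp":    ["dependency_A", "dependency_B"],
            "dependency_A":   ["dependency_A-1", "dependency_A-2"],
            "dependency_B":   ["dependency_B-1", "dependency_B-2"],
            "dependency_A-1": [], "dependency_A-2": [],
            "dependency_B-1": [], "dependency_B-2": [],
        }

        def link_closure(target, seen=None):
            # Depth-first walk that lists each library once, with every
            # dependent appearing before the libraries it depends on
            # (the order a single-pass static linker wants).
            if seen is None:
                seen = set()
            order = []
            for dep in DEPS[target]:
                if dep not in seen:
                    seen.add(dep)
                    order.append(dep)
                    order.extend(link_closure(dep, seen))
            return order

        print(link_closure("TopLevelApp"))
        # ['dependency_A', 'dependency_A-1', 'dependency_A-2',
        #  'dependency_B', 'dependency_B-1', 'dependency_B-2']

    For what it's worth, some build systems do handle this natively: CMake, for example, propagates the link dependencies of a library target to anything that links against it.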

    Read the article

  • Sometimes Xcode appears to ignore target build settings?

    - by Derek Clarkson
    Hi all, I've created an iPhone static library project with two targets, like this:

        Project
            Library (Device) target
            Library (Simulator) target

    The device target has its SDK set to the device, so it produces an armv6/7 library, and the simulator target is set to the simulator SDK, so it produces an i386 library.

    The issue I'm having is that the SDK settings on the targets keep getting overridden by Xcode's active-SDK setting. That is, if I build the device target while the Xcode window shows the active SDK as the simulator, Xcode will build a simulator library instead of a device library, ignoring the settings of the target - although it will still put it into the *-iphoneos/ directory in the build directories!

    I originally had the same issue with another static library project and, after a lot of playing around, got everything to work correctly; i.e., the targets ignore the Xcode active SDK because they have their own specifications of what to build. The problem is that I don't know what made it work in that project, and I have not been able to reproduce the issue in it either. Does anyone have any ideas as to what is going on? ciao Derek

    Read the article

  • Problems referencing build output from TFS Build and Visual Studio

    - by pmdarrow
    Here's what I'm trying to do: I have two solutions - one for my main application and its associated projects, and another for my database (a VS .dbproj) and its associated projects. What I'd like to do is include the output from the database project (a .dbschema and some SQL scripts) in my WiX installer, which lives in the main application solution. This involves having TFS build the DB solution immediately before the main application solution. I've got that part working properly, but I'm having trouble referencing the output of the DB solution from my installer.

    I'm using relative paths to reference the DB project output in my WiX installer, e.g.

        <?define DBProjectOutputDir = "..\..\MyDatabaseSolution\MyDatabaseProject\sql\"?>

    which works fine locally but fails when building via TFS Build. This is because TFS Build apparently redirects the output of every project to one common location. Instead of the path to my database project being ..\..\MyDatabaseSolution\MyDatabaseProject\sql\ as it is when building locally, it gets set to something like ..\..\..\Binaries\Release\. How can I get around this and have a consistent output location to reference from my installer project? I'm using TFS 2005 and VS 2008.

    Read the article

  • Build number increment not reflected in AssemblyVersion

    - by awshepard
    I've browsed through some of the discussion on auto-incrementing build numbers, but in the impatience of youth decided to roll my own and reinvent the wheel. I know there are probably better ways to go about this (which I'm definitely going to investigate), but my question centers more around the Assembly and/or Version classes.

    My approach was to write a separate exe (BuildIncrementer) that takes a file name as a command-line parameter, does a regex match on the contents to grab the [assembly: AssemblyVersion...] string, makes the modifications that I want (increments the build number, etc.), then writes the contents back to the file. This approach works as-is.

    The next thing I did was, in the project that I wanted to use this on, set up a pre-build command line that simply executes BuildIncrementer.exe on this project's AssemblyInfo.cs file. This too works, updating the assembly info as desired.

    The problem comes when I run the project: it sends an email containing the current version, obtained with Assembly.GetExecutingAssembly().GetName().Version.ToString(). BUT the version showing up is the previous version. When my AssemblyInfo.cs says [assembly: AssemblyVersion("1.0.2.49667")], I get sent 1.0.1.45660, which was the previous build. Anyone have any ideas why that might be?
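    For readers who want to see the shape of the pre-build step being described, here is a rough sketch of the regex-and-rewrite logic (in Python for brevity; the poster's BuildIncrementer is a separate exe whose source is not shown, so this is an illustration, not their code). The version-string format assumed is the standard four-part AssemblyVersion.

        import re
        import sys

        PATTERN = re.compile(
            r'(\[assembly: AssemblyVersion\(")(\d+)\.(\d+)\.(\d+)\.(\d+)("\)\])')

        def bump_build(m):
            # Increment the third component (the build number), keep the rest.
            major, minor, build, rev = (int(m.group(i)) for i in range(2, 6))
            return '%s%d.%d.%d.%d%s' % (m.group(1), major, minor,
                                        build + 1, rev, m.group(6))

        path = sys.argv[1]   # e.g. Properties\AssemblyInfo.cs
        with open(path) as f:
            contents = f.read()
        with open(path, 'w') as f:
            f.write(PATTERN.sub(bump_build, contents))

    If the rewrite itself works (and here it evidently does, since AssemblyInfo.cs shows the new number), then the stale value reported at run time must come from a binary compiled before the change took effect, which is where the investigation would start.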

    Read the article

  • C# Toolbox: Debug-able, Self-Installable Windows Service Template Redux

    - by James Michael Hare
    I had written a pair of posts before about creating a debug-able and self-installing Windows service template in C#. This is a template I began creating to ease the writing of Windows services and to take some of the mundane tasks out of the coding effort. The original posts were here:

        C# Windows Services (1 of 2) - Debug-able Windows Services
        C# Windows Services (2 of 2) - Self-Installing Windows Services

    But at the time, though I gave the code samples, I didn't have a downloadable form of the template on the blog. After getting many requests for the actual source, I zipped it up and am posting it with this blog entry. Click on the link below to download the archive. The password on the archive is, imaginatively enough, password. Hope you enjoy, and please feel free to comment and suggest changes!

    Debug-able, Self-Installing Windows Service Template download

    Enjoy!

    Read the article

  • Stored Procedures In Source Control - Automate Build/Deployment Process

    - by Alex
    My company provides a large .NET service-oriented solution. The services layer interacts with a T-SQL back-end consisting of hundreds of tables and stored procedures. Our C# code is in version control (SVN) but our stored procedures and schema are not. After much lobbying of expedient upper management, I was allowed to review our (non-existent) build/deployment process to accomplish the following goals:

        1. Place schema and stored procedures under source control.
        2. Automate the build/deployment process.

    I would like to proceed per the accepted answer's strategy in this post, but I have additional questions:

        1. I would like to use Hudson as my build server. Is this a reasonable choice for a C#/SQL solution? What better alternatives should I explore?
        2. Assuming I have all triggers, stored procedures, schema, etc. under source control, and that they are scripted to individual files, how do I generate a build script which takes into account the dependencies/references between these items? (SQL Server does this automatically, but it generates one giant script.)
        3. What does the workflow of performing an update at the client look like? I.e., I have to keep existing table data. How do I roll back schema changes?
        4. I am the only programmer. Several other pseudo-technical staff like to make changes directly inside SQL Management Studio. Is it realistic to expect others to adhere to this solution, and how can I enforce it?

    Thank you in advance for your help.
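    On question 2, one low-tech way to generate a build script that respects dependencies between individually scripted objects (sketched below in Python, under the invented assumption that each object lives in its own .sql file and its dependencies are declared in a small manifest) is to topologically sort the files before concatenating them:

        # Hypothetical manifest: script file -> scripts it must follow.
        MANIFEST = {
            "tables/Customer.sql":         [],
            "tables/Order.sql":            ["tables/Customer.sql"],  # FK
            "procs/usp_GetOrders.sql":     ["tables/Order.sql"],
            "triggers/trg_OrderAudit.sql": ["tables/Order.sql"],
        }

        def ordered_scripts(manifest):
            # Depth-first topological sort; no cycle detection, for brevity.
            emitted, order = set(), []
            def visit(script):
                if script in emitted:
                    return
                emitted.add(script)
                for dep in manifest[script]:
                    visit(dep)
                order.append(script)
            for script in sorted(manifest):
                visit(script)
            return order

        # Concatenate into one deployment script, batch-separated with GO.
        with open("deploy.sql", "w") as out:
            for script in ordered_scripts(MANIFEST):
                with open(script) as f:
                    out.write(f.read())
                out.write("\nGO\n")

    Whether this beats letting a schema-comparison tool compute the script is a judgment call; the sketch only shows that per-object files plus an explicit dependency manifest are enough to avoid the one-giant-script problem.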

    Read the article

< Previous Page | 12 13 14 15 16 17 18 19 20 21 22 23  | Next Page >