Search Results

Search found 27654 results on 1107 pages for 'build management'.


  • Thoughts on Build 2013

    - by D'Arcy Lussier
    Originally posted on: http://geekswithblogs.net/dlussier/archive/2013/06/30/153294.aspx

    And so another Build conference has come to an end. Below are my thoughts/perspectives on various aspects of the event. I'll do a separate blog post on my thoughts on the Build message for developers.

    The Good

    Moscone Center was a great venue for Build! Easy to get around, easy to get to, and well maintained, it was a very comfortable conference venue.

    Yeah, the free swag was nice. Build has built up an expectation that attendees will always get something; it'll be interesting to see how Microsoft maintains this expectation over the next few Build events. I still maintain that free swag should never be the main reason one attends an event, and for me this was definitely just an added bonus. I'm planning on trying to use the Surface as a dedicated 2nd device at work for meetings; I'll share my experiences over the next few months.

    The hackathon event was a great idea, although personally I couldn't justify spending the money on a conference registration just to spend the entire conference coding. Still, the apps that were created were really great and there was a lot of passion and excitement around the hackathon. I wonder if they couldn't have held the hackathon on the Monday/Tuesday for those that wanted to participate, so they didn't miss any of the actual conference over Wednesday/Thursday.

    San Francisco was a great city to host Build. Getting from hotels to the conference center was very easy (especially for me, I was only 3 blocks away) and the city itself felt very safe. However, if I never have to fly into SFO again I'll be alright with that! We had delays going into and out of SFO, and both apparently were due to the airport itself.

    The Bad

    Build is one of those oddities on the conference landscape where people will pay to commit to attending an event without knowing anything about the sessions. We got our list of conference sessions when we registered on Tuesday, not before. And even then, we only got titles and not descriptions (those were eventually made available via the conference's mobile application). I get it... they're going to make announcements and they don't want to give anything away through the session titles. But honestly, there wasn't anything in the session titles that I would have considered a surprise.

    Breakfasts were brutal. High-carb pastries, donuts, and muffins with fruit and hard-boiled eggs does not a conference breakfast make. I can't believe that the difference between a continental breakfast per person and a hot breakfast buffet would have been a huge impact on a conference fee that was already around $2000.

    The vendor area was anemic. I don't know why Microsoft forces the vendors into cookie-cutter booth areas (this year they were all made of plywood material). At WPC and TechEd, booth areas allow the vendors to be creative with their displays. Not so much for Build. Really odd was the lack of Microsoft's own representation around Bing. In the day 1 keynote Microsoft made a big deal about Bing as an API, yet there was nobody in the vendor area set up to provide more information or have discussions about the Bing API.

    The Ugly

    Our name badges were NFC-enabled. The purpose of this, beyond the vendors being able to scan your info, wasn't really made clear. An attendee I talked to showed how you could get a reader app on your phone so you can scan other attendees' badges and collect their contact info - which is a kewl idea; business cards are so 1990's.

    But I was *shocked* at the amount of information that was on our name badges! Here's what's displayed on the badge: Name, Company, Twitter Handle. I'm OK with that. But here's what actually gets read: Name, Company, Address Used for Registration, Phone Number Used for Registration. So by sharing that info with another attendee, they get way more of my info than just how to find me on Twitter! Microsoft, you need to fix this for the future. If vendors want to collect information on attendees, they should be able to collect an ID from the badge, then get a report with the corresponding records afterwards. My personal information should not be so readily available, and without my knowledge!

    Final Verdict

    Maybe it's my older age, maybe it's where I'm at in life with family, maybe it's where I'm at in my career, but when I consider whether a conference experience was valuable I get to the core reasons I attend: opportunities to learn, opportunities to network, opportunities to engage with Microsoft.

    Opportunities to Learn: Sessions I attended were generally OK, with some real standouts on Day 2. I would love to see Microsoft adopt the Dojo format for a portion of their sessions. Hands-on labs are dull; lecture-style sessions are great for information sharing. But a guided hands-on coding session (read: Dojo) provides the best of both worlds. Given that all content is publicly available online to everyone (Build attendee or not), the value of attending the conference sessions is decreased. The value, though, is in the discussions that take place in person afterwards, which leads to...

    Opportunities to Network: I enjoyed getting together with old friends and connecting with Twitter friends in person for the first time. I also had an opportunity to meet total strangers. So from a networking perspective, Build was fantastic! I still think it would have been great to have an area for ad-hoc discussions - where speakers could announce they'd be available for more questions after their sessions, or attendees who wanted to discuss a topic in more depth with other attendees could arrange space. Some people have no problem being outgoing and making these things happen, but others do, and a structured model is more attractive.

    Opportunities to Engage with Microsoft: Hit and miss on this one. Outside of the vendor area, unless you cornered or reached out to a speaker, there wasn't any defined way to connect with blue badges. And as I mentioned above, Microsoft didn't have full representation in the vendor area (no Bing).

    All in all, Build was a fun party where I was informed about some new stuff and got some free swag. Was it worth the time away from home and the hit to my PD budget? I'd say somewhat. Build is a great informational conference, but I wouldn't call it a learning conference. Considering that TechEd seems to be moving to more of an IT Pro focus, independent developer conferences seem to be the best value for those looking to learn and not just be informed. With the rapid development cycle Microsoft is embracing, we're already seeing Build happen twice within a 12-month period. If that continues, the value of attending Build in person starts to diminish - especially with so much content available online. If Microsoft wants Build to be a must-attend event in the future, they need to start incorporating aspects of TechEd, past PDCs, and other conferences so those that want to leave with more than free swag have something to attract them.

    Read the article

  • Parallel MSBuild FTW - Build faster in parallel

    - by deadlydog
    Hey everyone, I just discovered a great post yesterday that shows how to have MSBuild build projects in parallel. Basically, all you need to do is pass the switches "/m:[NumOfCPUsToUse] /p:BuildInParallel=true" to MSBuild. For example, to use 4 cores/processes (if you just pass in "/m" it will use all CPU cores):

        MSBuild /m:4 /p:BuildInParallel=true "C:\dev\Client.sln"

    Obviously this trick will only be useful on PCs with multi-core CPUs (which we should all have by now) and solutions with multiple projects, so there's no point using it for solutions that only contain one project. Also, in case you're curious, testing shows that using multiple processes does not speed up Team Foundation Database deployments. And I found that if I didn't explicitly use "/p:BuildInParallel=true" I would get many build errors (even though the MSDN documentation says that it is true by default).

    The poster boasts compile time improvements of up to 59%, but the performance boost you see will vary depending on the solution and its project dependencies. I tested by building a solution at my office, and here are my results (runs are in seconds):

        # of Processes | 1st Run | 2nd Run | 3rd Run | Avg    | Performance
        1              | 192     | 195     | 200     | 195.67 | 100%
        2              | 155     | 156     | 156     | 155.67 | 79.56%
        4              | 146     | 149     | 146     | 147.00 | 75.13%
        8              | 136     | 136     | 138     | 136.67 | 69.85%

    So I updated all of our build scripts to build using 2 cores (~20% speed boost), since that gives us the biggest bang for our buck on our solution without bogging down a machine, and developers may sometimes compile more than 1 solution at a time. I've put the any-PC-safe batch script code at the bottom of this post.

    The poster also has a follow-up post showing how to add a button and keyboard shortcut to the Visual Studio IDE to have VS build in parallel as well (so you don't have to use a build script); if you do this, make sure you use the .NET 4.0 MSBuild, not the 3.5 one that he shows in the screenshot. While this did work for me, I found it left an MSBuild.exe process hanging around afterwards for some reason, so watch out (the batch file doesn't have this problem). Also, you do get build output, but it may not be the same as what you're used to, and it doesn't say "Build succeeded" in the status bar when completed, so I chose not to make this my default Visual Studio build option, but you may still want to.

    Happy building!

        :: Calculate how many processes to use to do the build.
        SET NumberOfProcessesToUseForBuild=1
        SET BuildInParallel=false
        if %NUMBER_OF_PROCESSORS% GTR 2 (
            SET NumberOfProcessesToUseForBuild=2
            SET BuildInParallel=true
        )
        MSBuild /maxcpucount:%NumberOfProcessesToUseForBuild% /p:BuildInParallel=%BuildInParallel% "C:\dev\Client.sln"

    Read the article

  • Remote management interface for managing ip6tables (or an alternative firewall)

    - by Matthew Iselin
    I'm working with IPv6 and have run into an issue configuring ip6tables on our main router in order to control what can come into the network. A default DROP rule in the FORWARD section has worked well (obviously leaving ESTABLISHED,RELATED as ACCEPT) to keep internal clients' open ports from being accessed. However, running an ip6tables command for every little change is unwieldy. Whilst we are able to continue creating rules manually, I'm wondering if there's some sort of management interface we could use to create the rules quickly and easily. We're looking to be able to save time working on our firewall as well as providing a simple method for modifying rules for those who will eventually replace us. I know webmin (heavily locked down on our network, naturally) has support for modifying iptables rules, but seemingly no support for ip6tables. Something similar would be fantastic. Alternatively, suggestions for a firewall solution apart from iptables/ip6tables which can be managed remotely wouldn't be out of order. A web interface for management is certainly preferable, even if it is just a wrapper with shiny buttons over the raw config files.
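
    For reference, a minimal sketch of the kind of baseline being described, assuming a router with a LAN-facing eth1 and an upstream eth0 (the interface names and rules are illustrative, not the poster's actual configuration):

        # Default-deny forwarding, but let reply traffic back in
        ip6tables -P FORWARD DROP
        ip6tables -A FORWARD -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
        # Allow connections initiated from the LAN outward
        ip6tables -A FORWARD -i eth1 -o eth0 -j ACCEPT

    Any management interface for this would essentially be generating and applying rules of this shape.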

    Read the article

  • Separate management network on a second network port for munin and monit

    - by paolo
    I want to build a separate network for server management. I have a Linux (Debian/Ubuntu) machine with several network cards. Both network cards are set up in /etc/network/interfaces:

        # The primary network interface
        #allow-hotplug eth0
        #iface eth0 inet dhcp
        auto eth0
        iface eth0 inet static
            address 10.0.0.240
            netmask 255.255.255.0
            network 10.0.0.0
            broadcast 10.0.0.255
            gateway 10.0.0.254

        auto eth1
        iface eth1 inet static
            address 10.0.10.240
            netmask 255.255.255.0
            network 10.0.10.0
            broadcast 10.0.10.255
            post-up ip route add 10.0.0.0/24 dev eth0 src 10.0.0.240 table eth0-WAN
            post-up ip route add default via 10.0.0.254 table eth0-WAN
            post-up ip route add 10.0.10.0/24 dev eth1 src 10.0.10.240 table eth1-LAN
            post-up ip route add default via 10.0.10.200 table eth1-LAN
            post-up ip rule add from 10.0.0.240 table eth0-WAN
            post-up ip rule add from 10.0.10.240 table eth1-LAN

    I have also adjusted /etc/iproute2/rt_tables and set up the routes above in /etc/network/interfaces. I want applications such as munin and monit to listen only on eth1, not on eth0. Routing works after a reboot, but sometimes it does not. traceroute -i eth1 10.0.10.200 does not get through. What am I doing wrong?
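
    For the "listen only on eth1" part, binding the daemons to the management address should work - a sketch assuming stock Debian config paths and the eth1 address above (10.0.10.240):

        # /etc/munin/munin-node.conf - listen only on the management address
        host 10.0.10.240

        # /etc/monit/monitrc - serve the web interface only on the management address
        set httpd port 2812
            use address 10.0.10.240
            allow admin:monit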

    Read the article

  • Web based KVM management for Ubuntu

    - by Tim
    We've got a single Ubuntu 9.10 root server on which we want to run multiple KVM virtual machines. To administer these virtual machines I'd like a web-based KVM management tool, but I don't know which one to choose from the list of tools mentioned on linux-kvm.org. I've used virsh & virt-manager on my desktop, but would like a web interface for the server.

    I tested ConVirt on my desktop, but it failed to pick up KVM machines from virsh / virt-manager, and I could not get KVM virtual machine import to work (only Xen). oVirt looks good, but I can't find out if and how I can install it on Ubuntu 9.10. (And I'd really rather not waste another few days testing stuff that might not work in the end.)

    Can anyone recommend any good web-based KVM management tools that are easy to install on Ubuntu 9.10? I'm looking for something that will also allow me to run other services like apache and postgresql besides hosting virtual machines, so preferably fairly lightweight & no dedicated OS installs. We don't need any professional clustering / migration or anything, just something that will let us create, start, inspect, administer & stop virtual machines from a web page.

    Best regards, Tim

    Update: Anyone have any suggestions? It's awfully quiet here...

    Read the article

  • Eclipse antRunner command line build has wrong dependency build order

    - by Sam Jones
    My team's Eclipse RCP app builds fine from the Eclipse IDE. When I try to build it from the command line, using the antRunner application, it builds the plugins out of order - a plugin builds before its dependencies are built, and so can't resolve some of the needed classes. Where should I look to fix this? As far as I can tell, the dependencies are set up correctly (a feature that depends on a plugin that depends on another plugin). I have my build.properties and folder structure set up as specified here. Is there anything else I should look at?
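
    For what it's worth, a headless PDE build derives its compile order from each plug-in's MANIFEST.MF dependencies rather than from the feature, so it may be worth checking that every plug-in declares the bundles it compiles against. A sketch of the kind of entry to look for, with hypothetical bundle names:

        Manifest-Version: 1.0
        Bundle-ManifestVersion: 2
        Bundle-SymbolicName: com.example.app.ui
        Bundle-Version: 1.0.0
        Require-Bundle: com.example.app.core;bundle-version="1.0.0"

    If a plug-in uses classes from a bundle it never declares via Require-Bundle (or Import-Package), the IDE workspace can mask the problem while antRunner will not.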

    Read the article

  • Jenkins—get "Build Time Trend" values using "Remote Access API"

    - by Chathura Kulasinghe
    Is there a way to get all of the Jenkins "Build Time Trend" information (build number + status [success/failed etc.] + duration) for an application, using the Jenkins remote access API? Otherwise, I'd appreciate a link to any documentation on how to get information from Jenkins using the remote access API. Most sources cover how to run jobs, but I couldn't find any that show how to fetch information from Jenkins. Thanks!
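
    A sketch of what this query could look like using the JSON flavour of the remote access API and its tree parameter (server and job names here are placeholders):

        curl "http://jenkins.example.com/job/my-app/api/json?tree=builds[number,result,duration]"

    This returns one entry per build with exactly the three fields the Build Time Trend page shows. Appending /api/ to almost any Jenkins page also documents what that particular page exposes.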

    Read the article

  • Very odd build problem

    - by user144182
    When I build my solution, it complains about a missing referenced DLL. When I rebuild it, the problem goes away. Whenever I do a clean, the problem returns; i.e., I have to attempt a build twice before it succeeds. This is vague, but if warranted I can give a better explanation of the solution structure.

    Read the article

  • Build Specific .cpps

    - by colordot
    Can someone please point out where in the Visual Studio 2008 configuration I can set which .cpp files are used for a given build. I'd like to use a different set of .cpp files when I am doing a release vs. debug build. Thanks!
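
    One way this is commonly done in VC++ 2008 (assuming a native .vcproj project; the file name below is illustrative): select the file in Solution Explorer, open its Properties for a specific configuration, and set Configuration Properties > General > "Excluded From Build" to Yes. In the .vcproj this shows up as:

        <File RelativePath=".\DebugOnlyHelpers.cpp">
            <FileConfiguration Name="Release|Win32" ExcludedFromBuild="true">
                <Tool Name="VCCLCompilerTool"/>
            </FileConfiguration>
        </File>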

    Read the article

  • Prevent deploying debug build with ClickOnce

    - by jomi
    Hi, I'm publishing a ClickOnce application with VS2008, but before every publish I have to switch to the Release config manually. This is fine as long as I don't forget to switch. Is there a way to prevent deploying debug builds? Is there some compiler directive like:

        #if DEBUG
        #if ClickOnce
        #error You cannot publish a debug build
        #endif
        #endif

    Or is there a way (without build scripts) to automatically switch to the Release config before publishing? (I've found some similar questions but didn't like the answers on them.) Thanks
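
    One hedged sketch of an approach: Microsoft.Common.targets defines an empty BeforePublish hook that a project can override, so adding something like the following at the end of the .csproj could fail a Debug publish. Note this is known to fire when publishing via msbuild /t:Publish; whether the VS2008 IDE's ClickOnce publish goes through it should be verified:

        <Target Name="BeforePublish">
          <Error Condition="'$(Configuration)' == 'Debug'"
                 Text="You cannot publish a Debug build." />
        </Target>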

    Read the article

  • how to add revision and build date to source

    - by gucki
    Hi! I have a gcc project and would like to automatically add defines for the build date and revision number (from git) to my sources. What's the best way to do this? My goal is simply to be able to do something like this on startup:

        printf("Test app built on %s, revision %d", BUILD_DATE, REVISION);

    For building I'm using make with a simple Makefile.inc, not autoconf or anything like that. Thanks, Corin
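
    A minimal sketch of how this could be wired up in the Makefile, assuming git is available at build time (git rev-list --count gives a numeric revision matching the %d above; the macro names follow the printf in the question):

        REVISION   := $(shell git rev-list --count HEAD)
        BUILD_DATE := $(shell date +%d-%B-%Y)
        CFLAGS     += -DBUILD_DATE=\"$(BUILD_DATE)\" -DREVISION=$(REVISION)

    Since these defines change with every commit, it's common to confine them to a single version.c so the whole project doesn't rebuild each time.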

    Read the article

  • Visual Studio 2010's Build Command: It does nothing

    - by jasonh
    I'm using Visual C# 2010 Express RTM with the Windows Phone Developer Tools April CTP Refresh, and when I run any Build option, nothing happens. I've deleted the contents of the build output folders and that doesn't do anything. I can't even run the project, because it complains the executable is missing (XNA Game for Windows project). I've tried the project on another computer and it builds just fine. Any ideas?

    Read the article

  • BeginInvoke not invoking the target method in Release build

    - by Elan
    I have a method which I wish to execute on the UI message pump, and thus do the following:

        private void SomeMethod()
        {
            BeginInvoke(new MethodInvoker(MethodToInvoke));
        }

        private void MethodToInvoke()
        {
            // This method contains code that I wish to execute on the UI message pump.
        }

    Now, the above works just fine when I create a Debug build of the project. However, when I create a Release build, the MethodToInvoke method does not get invoked. Does anyone have any idea why this might be? Thanks, Elan

    Read the article

  • Including Build Date strings in a C# project

    - by David Rutten
    I'd like to hard-code the build date into my application (DD-mmmm-YYYY), but how do I embed this constant into the code? I thought perhaps I could make a pre-build event that executes a *.bat file that updates a text file included as a resource, but it sounds pretty involved. What's the best approach?
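
    One approach that avoids pre-build events altogether, as a sketch: read the link date that the compiler already stamps into the executable's PE header. This holds for classic, non-deterministic .NET builds of that era; the class name below is illustrative:

        using System;
        using System.IO;
        using System.Reflection;

        static class BuildInfo
        {
            // Reads the TimeDateStamp field from the PE header of an assembly file.
            public static DateTime GetLinkerTimestampUtc(string filePath)
            {
                var buffer = new byte[2048];
                using (var stream = new FileStream(filePath, FileMode.Open, FileAccess.Read))
                    stream.Read(buffer, 0, buffer.Length);

                int peHeaderOffset = BitConverter.ToInt32(buffer, 0x3C); // e_lfanew
                int timestamp = BitConverter.ToInt32(buffer, peHeaderOffset + 8);
                return new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)
                    .AddSeconds(timestamp);
            }
        }

        // Usage - formats as DD-mmmm-YYYY, e.g. 30-June-2010:
        // string buildDate = BuildInfo
        //     .GetLinkerTimestampUtc(Assembly.GetExecutingAssembly().Location)
        //     .ToString("dd-MMMM-yyyy");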

    Read the article

  • symfony 1.4: doctrine build model warning

    - by tigerstyle
    Hi folks, I copied my sources from my local dev environment (where everything works fine) to my repository, and from there I did a checkout on my remote dev machine. Now when I try to build everything I get this error:

        devel:/var/www/myproject# ./symfony doc:build-model
        doctrine  generating model classes
        file+     /tmp/doctrine_schema_48726.yml

        Warning: file_get_contents(/var/www/myproject/lib/model/doctrine//base/BaseAdvert.class.php):
        failed to open stream: No such file or directory in
        /var/www/myproject/lib/vendor/symfony/lib/plugins/sfDoctrinePlugin/lib/task/sfDoctrineBuildModelTask.class.php on line 77

    Do you know what the problem could be? Thanks for your answers :)
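
    A couple of hedged things to check, given the task is looking for generated base classes that exist locally but not after the checkout: make sure lib/model/doctrine/base/ was actually committed (generated base classes are sometimes excluded from version control), and clear the stale cache before rebuilding:

        ls lib/model/doctrine/base/   # should contain BaseAdvert.class.php
        ./symfony cache:clear
        ./symfony doctrine:build-model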

    Read the article

  • Team Build Errors when there aren't any

    - by Jonesie
    I have a nightly team build that is reporting errors from the test step, but zero errors in the summary. This results in a partial success. I can't see any errors in the full build log - maybe it's just the quantity of warnings? Anyone got any ideas? Thanks

    Read the article
