Search Results

Search found 18014 results on 721 pages for 'build automation'.


  • Understanding a build in C++

    - by numerical25
    I think I know what a build is, but I am not sure. My definition of a build is that it is another word for a compiled application. Can someone please tell me what exactly a build is? And why do people ask for different types of builds, such as a Debug build, a Profile build, and a Release build? What are the differences?
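
    In short, a build is the output of compiling and linking your source into a runnable artifact, and the named build types differ mainly in compiler settings. A minimal sketch of how the three flavors might be produced with g++ (the flags are typical choices, not from the original post):

        # Debug build: no optimization, full symbols, assertions enabled
        g++ -g -O0 -o myapp_debug main.cpp

        # Profile build: optimized but instrumented for a profiler such as gprof
        g++ -O2 -pg -o myapp_profile main.cpp

        # Release build: optimized, assertions compiled out, no debug symbols
        g++ -O2 -DNDEBUG -o myapp_release main.cpp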

    Read the article

  • First PC Build (Part 1)

    - by Anthony Trudeau
    Originally posted on: http://geekswithblogs.net/tonyt/archive/2014/08/05/157959.aspx

    A couple of months ago I made the decision to build myself a new computer. The intended use is gaming and running the last real version of Photoshop. I was motivated by the poor state of console gaming and a simple desire to do something I haven’t done before – build a PC from the ground up. I’ve been using PCs for more than two decades. I’ve replaced a component here and there, but for the last 10 years or so I’ve only used laptops. Therefore, this article is written from the perspective of someone familiar with PCs, but completely new to building. I’m not an expert and this is not a definitive guide for building a PC, but I do hope that it encourages you to try it yourself.

    Research

    There was a lot of research necessary, because building a PC was completely new to me and I haven’t kept up with what’s out there. The first thing you want to do is nail down what your goals are. Your goals are going to be driven by what you want to do with your computer and by personal choice. Don’t neglect the second one, because if you’re doing this for fun you want to get what you want. In my case, I focused on three things: performance, longevity, and aesthetics. The performance aspect is important for gaming and Photoshop, and it will drive which components you get. For example, heavy gaming use is going to drive your choice of graphics card. Longevity is relevant to me because I don’t want to be changing things out anytime soon for the next hot game. The consequence of performance and longevity is cost. Finally, aesthetics was my next consideration. I could have just built a box, but it wouldn’t have been nearly as fun for me. Aesthetics might not be important to you. They are for me. I also like gadgets, and that played into at least one purchase for this build.

    I used PC Part Picker to put together my component list. I found it invaluable during the process and I’d recommend it to everyone. One caveat is that I wouldn’t trust the compatibility aspects. It does a pretty good job of not steering you wrong, but do your own research. The rest of it isn’t really sexy. I started out with what appealed to me and then made changes and additions as I dug deep into researching each component and every interaction I could find. The resources I used are innumerable: reviews, product descriptions, forum posts (praises and problems), and more. I also asked friends who are into gaming what they thought about my component list. And when I got near the end I posted my list to the Reddit /r/buildapc forum. I cannot stress enough the value of extra sets of eyeballs and firsthand experience. Some of the resources I used: PC Part Picker, Tom’s Hardware, bit-tech, and Reddit.

    Purchase

    PC Part Picker favors certain vendors; you should look at others too. In my case I found their favorites to be the best. My priorities were out-the-door price and shipping time, since I knew that once I started getting parts I’d want to start building. Luckily, I timed it well and everything arrived within the span of a few days. Here are my opinions on the vendors I ended up using, in alphabetical order.

    Amazon.com is a good, reliable choice. They have excellent customer service in my experience, and I knew I wouldn’t have trouble with them. However, shipping time is often a problem when you use their free shipping unless you order expensive items (I’ve found items over $100 ship quickly). Ultimately though, their prices weren’t always the best, and their collection of sales tax in my state turned me off them. I did purchase my case from them. I ordered the mouse as well, but I cancelled after it was stuck four days in a “shipping soon” state. I purchased the mouse locally.

    Best Buy is not my favorite place to do business. There’s a lot of history with poor, uninterested sales representatives, and they used to have a lot of bad anti-consumer policies. That’s a lot better now, but the bad taste is still in my mouth. I ended up purchasing the accessories from them, including the mouse (locally) and headphones.

    NCIX is a company that I’d never heard of before. It popped up as a recommendation for my CPU cooler on PC Part Picker. I didn’t do a lot of research on the company, because their policy of having you buy insurance for your orders turned me off. That policy makes it clear to me that the company holds me responsible for the shipment once it leaves their dock. That’s not right, and it may run afoul of state laws. Regardless, they shipped my CPU cooler quickly and I didn’t have a problem.

    NewEgg.com is a well-known company. I had never done business with them, but I’m glad I did. They shipped quickly and provided good visibility into everything. The prices were also the best in most cases. My main complaint is that they have a lot of exchange-only return policies on components. To their credit, those policies are listed in the cart underneath each item. That visibility tells me that they’re not playing any shenanigans, and it made me comfortable dealing with that risk. The vast majority of what I ordered came from them.

    Coming Next

    In the next part I’ll tackle my build experience.

    Read the article

  • Unable to build my C++ code with g++ 4.6.3

    - by Mriganka
    I am facing multiple issues with building my C++ code on Ubuntu 12.04. This code was building and running fine on RH Enterprise. I am using g++ 4.6.3. Here's the output of g++ -v:

        g++ -v
        Using built-in specs.
        COLLECT_GCC=g++
        COLLECT_LTO_WRAPPER=/usr/lib/gcc/i686-linux-gnu/4.6/lto-wrapper
        Target: i686-linux-gnu
        Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 4.6.3-1ubuntu5' --with-bugurl=file:///usr/share/doc/gcc-4.6/README.Bugs --enable-languages=c,c++,fortran,objc,obj-c++ --prefix=/usr --program-suffix=-4.6 --enable-shared --enable-linker-build-id --with-system-zlib --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --with-gxx-include-dir=/usr/include/c++/4.6 --libdir=/usr/lib --enable-nls --with-sysroot=/ --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-gnu-unique-object --enable-plugin --enable-objc-gc --enable-targets=all --disable-werror --with-arch-32=i686 --with-tune=generic --enable-checking=release --build=i686-linux-gnu --host=i686-linux-gnu --target=i686-linux-gnu
        Thread model: posix
        gcc version 4.6.3 (Ubuntu/Linaro 4.6.3-1ubuntu5)

    Here's a sample of my code:

        #include "Word.h"
        #include <string>
        using namespace std;

        pthread_mutex_t Word::_lock = PTHREAD_MUTEX_INITIALIZER;

        Word::Word(): _occurrences(1)
        {
            memset(_buf, 0, 25);
        }

        Word::Word(char *str): _occurrences(1)
        {
            memset(_buf, 0, 25);
            if (str != NULL) {
                strncpy(_buf, str, strlen(str));
            }
        }

    Neither g++ -c -ansi, g++ -c -std=c++98, nor g++ -c -std=c++03 is able to build the code correctly. I get the following compilation errors:

        mriganka@ubuntu:~/WordCount$ make
        g++ -c -g -ansi Word.cpp -o Word.o
        Word.cpp: In constructor ‘Word::Word()’:
        Word.cpp:10:21: error: ‘memset’ was not declared in this scope
        Word.cpp: In constructor ‘Word::Word(char*)’:
        Word.cpp:16:21: error: ‘memset’ was not declared in this scope
        Word.cpp:19:34: error: ‘strlen’ was not declared in this scope
        Word.cpp:19:35: error: ‘strncpy’ was not declared in this scope
        Word.cpp: In member function ‘void Word::operator=(const Word&)’:
        Word.cpp:37:42: error: ‘strlen’ was not declared in this scope
        Word.cpp:37:43: error: ‘strncpy’ was not declared in this scope
        Word.cpp: In copy constructor ‘Word::Word(const Word&)’:
        Word.cpp:44:21: error: ‘memset’ was not declared in this scope
        Word.cpp:45:52: error: ‘strlen’ was not declared in this scope
        Word.cpp:45:53: error: ‘strncpy’ was not declared in this scope

    So basically g++ 4.6.3 on Ubuntu 12.04 is not able to recognize the standard C++ headers, and I am not finding a way out of this situation.

    Second problem: in order to make progress, I included <string.h> instead of <string>. But now I am facing linking errors with my message queue and pthread library functions.
    Here's the error that I am getting:

        mriganka@ubuntu:~/WordCount$ make
        g++ -c -g -ansi Word.cpp -o Word.o
        g++ -lrt -I/usr/lib/i386-linux-gnu Word.o HashMap.o main.o -o word_count
        main.o: In function `main':
        /home/mriganka/WordCount/main.cpp:75: undefined reference to `pthread_create'
        /home/mriganka/WordCount/main.cpp:90: undefined reference to `mq_open'
        /home/mriganka/WordCount/main.cpp:93: undefined reference to `mq_getattr'
        /home/mriganka/WordCount/main.cpp:113: undefined reference to `mq_send'
        /home/mriganka/WordCount/main.cpp:123: undefined reference to `pthread_join'
        /home/mriganka/WordCount/main.cpp:129: undefined reference to `mq_close'
        /home/mriganka/WordCount/main.cpp:130: undefined reference to `mq_unlink'
        main.o: In function `count_words(void*)':
        /home/mriganka/WordCount/main.cpp:151: undefined reference to `mq_open'
        /home/mriganka/WordCount/main.cpp:154: undefined reference to `mq_getattr'
        /home/mriganka/WordCount/main.cpp:162: undefined reference to `mq_timedreceive'
        collect2: ld returned 1 exit status

    Here's my makefile:

        CC=g++
        CFLAGS=-c -g -ansi
        LDFLAGS=-lrt
        INC=-I/usr/lib/i386-linux-gnu
        SOURCES=Word.cpp HashMap.cpp main.cpp
        OBJECTS=$(SOURCES:.cpp=.o)
        EXECUTABLE=word_count

        all: $(SOURCES) $(EXECUTABLE)

        $(EXECUTABLE): $(OBJECTS)
            $(CC) $(LDFLAGS) $(INC) -pthread $(OBJECTS) -o $@

        .cpp.o:
            $(CC) $(CFLAGS) $< -o $@

        clean:
            rm -f *.o word_count

    Please help me resolve both issues. I searched online relentlessly for a solution to these problems, but no one seems to have encountered them.
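
    Neither fix is recorded in the thread, so here is a hedged sketch of the standard resolutions. GCC 4.6 stopped pulling <cstring> in transitively through other headers, so memset, strncpy, and strlen must be declared explicitly; and Ubuntu's toolchain links libraries in command-line order (with --as-needed enabled by default), so -lrt and -pthread need to come after the object files that reference them:

        // Word.cpp -- first issue: include the header that declares the C string functions
        #include "Word.h"
        #include <cstring>   // memset, strncpy, strlen

        # Makefile -- second issue: libraries must follow the objects that use them
        # (the recipe line must begin with a tab)
        $(EXECUTABLE): $(OBJECTS)
            $(CC) $(INC) $(OBJECTS) -o $@ -pthread -lrt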

    Read the article

  • Microsoft Build 2012 Day 1 Keynote Summary

    - by Tim Murphy
    So I have finally dried the tears after watching the keynote for Build 2012. This wasn’t because it was an emotional presentation, but because for the second year I missed the goodies. Each on-site attendee got a Surface RT, a Lumia 920 and a voucher for 100GB of SkyDrive storage.

    The event was opened with the announcement that in the three days since the launch of Windows 8, over 4 million upgrades have been sold. I don’t care who you are, that is an impressive stat. Ballmer then spent a fair amount of time remaking the case for the Windows and Windows Phone platforms, similar to what we heard at the last two launch events.

    There were some cool but non-essential demos. The one that was the most fun was the Perceptive Pixel 82” slate device. At first glance I wondered why I would ever want such a device, but then Ballmer explained its possible use for schools and boardrooms. That actually made sense.

    Then things got strange. Steve started explaining features that developers could leverage. Usually this type of information is left to the product leads. He focused on the integration with the Charms features such as Search and Share.

    Steve “Guggs” Guggenheim showed off an app from Disney called “Agent P”, based on Phineas and Ferb, that would appeal to my kids. Then he got to the meat of the presentation. We found out that you could add a tile that can be used to sell ad space. In the same vein, we also found out that you could use Microsoft’s, PayPal’s or any commerce engine of your own creation or choosing.

    Those who are interested in sports, and especially in developing sports apps, would have enjoyed the small presentation from Michael Bayle of ESPN. He introduced the ESPN app, which has tons of features. For the developers in the crowd he also mentioned that ESPN has an API available at developer.espn.com.

    During the launch events we were told apps were coming. In this presentation we were actually shown a scrolling list of logos and told about a couple of them. Ballmer specifically mentioned Twitter, SAP and DropBox, just a couple of the impressive names on the list.

    Steve Ballmer addressed the question of why you should develop for the Windows 8 platform. He feels that Microsoft has the best commercial terms for developers, a better way to build apps than other platforms, and a variety of form factors. His key point, though, was the available volume of customers given the current Windows install base, assuming even flat growth of the platform. This he backed with a promise that Microsoft is going to do better at marketing, and that you won’t be able to avoid the ads they are bringing out.

    The last section of the keynote was presented by Kevin Gallo from the Windows Phone team. This was the real reason I tuned into the webcast. He impressed upon those watching that the strength of developing for the Microsoft platform is the common programming model that now exists. While there are differences between form-factor implementations, you can leverage code across them.

    He claimed that 90% of developer requests for Windows Phone 8 have been implemented. These include:

    - More controls with better performance
    - Better live tiles, including lock screen integration
    - Speech support in custom apps
    - Easier submission to the marketplace
    - App camera integration
    - VOIP and chat support
    - Bluetooth and NFC support
    - Native C++ development
    - Direct3D development

    The quote from Kevin that stood out for me was that “‘Take your Dramamine and buckle your seatbelt’ type of games are coming to Windows Phone 8”. He backed this up by displaying a list of game development frameworks and then having Unity come out and do a demo.

    Ok, almost done… The last two things of note for me were the announcement that the SDK is immediately available at dev.windowsphone.com, and that they are reducing the cost of an individual developer account to $8 for the next 8 days. Let the development commence.

    Read the article

  • Exception using QueryTables in Excel Automation

    - by sam
    Hi, I'm using automation to populate a range in Excel using a QueryTable. However, when I try to add the QueryTable to the worksheet I get an exception. I checked the connection string and it's working fine. Can someone please help me with this?

        public void writeproc1()
        {
            try
            {
                Worksheet ws = (Worksheet)wb.Worksheets.Add(Missing.Value, Missing.Value, Missing.Value, Missing.Value);
                Range rng = ws.get_Range("A1", "E14");
                QueryTable qt = ws.QueryTables.Add(
                    "Data Source=(local)\\SQLEXPRESS;initial catalog=temp;Integrated Security=SSPI;",
                    rng, "Select * From Table_1");
                qt.RefreshStyle = XlCellInsertionMode.xlInsertEntireRows;
                qt.Refresh(false);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.ToString());
                Console.ReadKey();
            }
        }

    Exception thrown:

        System.Runtime.InteropServices.COMException (0x800A03EC): Exception from HRESULT: 0x800A03EC
           at System.RuntimeType.ForwardCallToInvokeMember(String memberName, BindingFlags flags, Object target, Int32[] aWrapperTypes, MessageData& msgData)
           at Microsoft.Office.Interop.Excel.QueryTables.Add(Object Connection, Range Destination, Object Sql)
           at tmp.Program.writeproc1() in ...Projects\tmp\tmp\Program.cs:line 25
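
    The thread doesn't record an answer, but the usual cause of 0x800A03EC from QueryTables.Add is the connection string: Excel expects it to begin with a connection-type prefix such as "OLEDB;" or "ODBC;". A hedged sketch of the likely fix (the provider name is a typical choice, not from the post):

        // QueryTables.Add needs the connection type as a prefix;
        // a bare ADO-style string is rejected with 0x800A03EC.
        QueryTable qt = ws.QueryTables.Add(
            "OLEDB;Provider=SQLOLEDB;Data Source=(local)\\SQLEXPRESS;" +
            "Initial Catalog=temp;Integrated Security=SSPI;",
            rng, "Select * From Table_1");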

    Read the article

  • Word automation - SaveAs

    - by nXqd
    I'm trying to write a simple MFC Word automation program that saves the document every minute. I'm following this article: http://www.codeproject.com/KB/office/MSOfficeAuto.aspx. Here is what I'm trying to implement; I'm new to COM, so I think there's a problem here. The VBA below was generated by Word 2010:

        ActiveDocument.SaveAs2 FileName:="1.docx", FileFormat:=wdFormatXMLDocument, _
            LockComments:=False, Password:="", AddToRecentFiles:=True, _
            WritePassword:="", ReadOnlyRecommended:=False, EmbedTrueTypeFonts:=False, _
            SaveNativePictureFormat:=False, SaveFormsData:=False, SaveAsAOCELetter:=False, _
            CompatibilityMode:=14

    And my code implementing the VBA above:

        {
            COleVariant varName(L"b.docx");
            COleVariant varFormat(L"wdFormatXMLDocument");
            COleVariant varLockCmt((BYTE)0);
            COleVariant varPass(L"");
            COleVariant varReadOnly((BYTE)0);
            COleVariant varEmbedFont((BYTE)0);
            COleVariant varSaveNativePicFormat((BYTE)0);
            COleVariant varForms((BYTE)0);
            COleVariant varAOCE((BYTE)0);

            VARIANT x;
            x.vt = VT_I4;
            x.lVal = 14;
            COleVariant varCompability(&x);

            VARIANT result;
            VariantInit(&result);
            _hr = OLEMethod(DISPATCH_METHOD, &result, pDocApp, L"SaveAs2", 10,
                varName.Detach(), varFormat.Detach(), varLockCmt.Detach(), varPass.Detach(),
                varReadOnly.Detach(), varEmbedFont.Detach(), varSaveNativePicFormat.Detach(),
                varForms.Detach(), varAOCE.Detach(), varCompability.Detach());
        }

    I get no error from this, but it doesn't work.
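
    Two things stand out (a hedged reading, since the thread has no accepted answer). First, FileFormat is an enum over IDispatch, so passing the constant's name as a string does nothing useful; wdFormatXMLDocument is the numeric value 12. Second, raw IDispatch::Invoke receives its rgvarg argument array in reverse positional order, so it is worth checking whether the article's OLEMethod helper expects the last parameter first:

        // FileFormat must be the WdSaveFormat value, not the VBA constant's name.
        VARIANT varFormat;
        varFormat.vt = VT_I4;
        varFormat.lVal = 12;   // wdFormatXMLDocument

        // Also verify the argument order OLEMethod expects: IDispatch::Invoke
        // takes parameters in reverse, and helpers often pass varargs straight through.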

    Read the article

  • PostgreSQL table for storing automation test results

    - by Martin
    I am building an automation test suite which runs on multiple machines, all reporting their status to a PostgreSQL database. We will run a number of automated tests, for which we will store the following information:

    - test ID (a GUID)
    - test name
    - test description
    - status (running, done, waiting to be run)
    - progress (%)
    - start time of test
    - end time of test
    - test result
    - latest screenshot of the running test (updated every 30 seconds)

    The number of tests isn't huge (say a few thousand), and each machine (say, 50 of them) has a service which checks the database and figures out if it's time to start a new automated test on that machine. How should I organize my SQL table to store all the information? Is a single table with a column per attribute the way to go? If in the future I need to add attributes but want to keep compatibility with the old database format (i.e. I may not want to delete and create a new table with more columns), how should I proceed? Should the new attributes just go in a different table? I'm also thinking of replicating the database. In case of failure, I don't mind if the latest screenshots aren't backed up on the slave database. Should I just store the screenshots in their own table to simplify replication? Thanks!
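
    A minimal schema sketch along the lines described above (table and column names are illustrative assumptions, not from the post). Keeping the frequently rewritten screenshot in its own table addresses both the extensibility worry and the replication question:

        CREATE TABLE test_run (
            test_id     uuid PRIMARY KEY,
            name        text NOT NULL,
            description text,
            status      text NOT NULL CHECK (status IN ('waiting', 'running', 'done')),
            progress    integer CHECK (progress BETWEEN 0 AND 100),
            started_at  timestamptz,
            ended_at    timestamptz,
            result      text
        );

        -- Screenshots are overwritten every 30 seconds; isolating them keeps
        -- the main table stable and makes them easy to exclude from replication.
        CREATE TABLE test_run_screenshot (
            test_id     uuid PRIMARY KEY REFERENCES test_run (test_id),
            captured_at timestamptz NOT NULL,
            image       bytea NOT NULL
        );

    New attributes can later be added either as nullable columns (ALTER TABLE ... ADD COLUMN is cheap in PostgreSQL) or as a side table keyed on test_id, which preserves the old layout exactly.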

    Read the article

  • Office Automation: What is destroying my encoding?

    - by Filburt
    I'm facing a problem with a Word mail merge automation controlled by our CRM system.

    The setup: The base for the mail merge is a Word .dot template which fires a macro on Document.New. Inside this macro I create a .NET component registered for COM:

        Set myCOMObject = CreateObject("MyCOMObject")

    The component pulls some data from a database and hands back string values, which are assigned to Word DocumentVariables:

        Set someClass = myCOMObject.GetSomeClass(123)
        ActiveDocument.Variables("docaddress") = someClass.GetSenderAddress(456)

    All string values returned from the component are encoded in UTF-16 (codepage 1200).

    What happens: The problem arises when the CRM system calls Word to perform the mail merge: the string values from the component are turned into UTF-8 encoded strings. All the static text inside the template and the data pulled for the mail merge stay nicely encoded in UTF-16. For example, the umlaut ü inside my DocumentVariables is turned into c3 b0, while it stays fc for the rest of the document (checked the file in a hex editor). If I create a document from a template with the same macro functionality, but without performing a mail merge, all strings are fine, i.e. they are encoded in UTF-16.

    What changed: According to the CRM software vendor, the encoding of the mail merge data export was changed to UTF-16 with the new version we're currently testing. I found out that MS states you'll experience issues when the document and mail merge data file encodings don't match.

    What I tried: Since I'm assuming I am merging with UTF-16 encoded data, I added the following lines to my macro:

        ActiveDocument.TextEncoding = msoEncodingWestern
        ActiveDocument.SaveEncoding = msoEncodingUnicodeLittleEndian

    This is what the mail merge data document specifies in its document properties.

    Read the article

  • Excel automation: Close event missing

    - by chiccodoro
    Hi all, I am doing Excel automation via interop in C#, and I want to be informed when a workbook is closed. However, there is no Close event on the workbook, nor a Quit event on the application. Has anybody done this before? How can I write a piece of code which reacts to the workbook being closed (and which is only executed if the workbook is really closed)? Ideally that should happen after the workbook closes, so I can rely on the file to reflect all changes.

    Details about what I found so far: There is a BeforeClose() event, but if there are unsaved changes this event is raised before the user is asked whether to save them. So at the moment I process the event, I don't have the final file and I cannot release the COM objects, both things that I need to have/do. I don't even know whether the workbook will actually be closed, since the user might choose to abort closing. Then there is a BeforeSave() event. If the user chooses "Yes" to save unsaved changes, then BeforeSave() is executed after BeforeClose(). However, if the user chooses to abort and then hits File > Save, the exact same order of events is executed. Further, if the user chooses "No", BeforeSave() isn't executed at all. And the events look the same for as long as the user hasn't clicked any of these options.
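
    There is indeed no Close event in the Excel object model, so one known workaround (a sketch, not from the original thread) is to stop relying on events for the close itself and instead poll the application's open-workbook list: when a workbook disappears from Application.Workbooks, it has really closed and the file on disk reflects all changes. The timer interval and callback name below are illustrative assumptions:

        // Sketch: detect real workbook closes by polling Application.Workbooks.
        // (Marshal the COM calls onto your main thread if your setup requires it.)
        var known = new HashSet<string>();
        var timer = new System.Timers.Timer(1000);
        timer.Elapsed += (s, e) =>
        {
            var current = new HashSet<string>();
            foreach (Excel.Workbook wb in app.Workbooks)
                current.Add(wb.FullName);
            foreach (var name in known)
                if (!current.Contains(name))
                    OnWorkbookClosed(name);   // hypothetical callback: safe to read the file now
            known = current;
        };
        timer.Start();

    Combined with BeforeClose (to know a close was at least attempted), this avoids having to guess the outcome of the save prompt.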

    Read the article

  • Evaluating expressions using Visual Studio 2005 SDK rather than automation's Debugger::GetExpression

    - by brone
    I'm looking into writing an add-in (or package, if necessary) for Visual Studio 2005 that needs watch-window-type functionality: evaluation of expressions and examination of the types. The automation facilities provide Debugger::GetExpression, which is useful enough, but the information provided is a bit crude. From looking through the docs, it sounds like an IDebugExpressionContext2 would be more useful. With one of these it looks as if I can get more information from an expression: detailed information about the type and any members and so on, without having everything come through as strings. I can't find any way of actually getting an IDebugExpressionContext2, though! IDebugProgramProvider2 sort of looks relevant, in that I could start with IDebugProgramProvider2::GetProviderProcessData and then slowly drill down until reaching something that can supply my expression context, but I'll need to supply a port to this, and it's not clear how to retrieve the port corresponding to the current debug session. (Even if I tried every port, it's not obvious how to tell which port is the right one...) I'm becoming suspicious that this simply isn't a supported use case, but with any luck I've simply missed something crashingly obvious. Can anybody help?

    Read the article

  • .Net/C# Build Tool - Is NAnt a preferred tool?

    - by Olle
    I'm about to set up an automatic build of a .NET/C# project. I've searched the net quite a bit, and there are a lot of references to a tool called NAnt. My questions are: Is NAnt considered a good tool for this, and is it still used? Are there other tools that are the de facto standard for such a task? From the information on the project's SourceForge page, there doesn't seem to have been much development going on in recent years. The same applies to the NAntContrib project. Thanks!

    Read the article

  • Insufficient Permissions Problems with MSDeploy and TFS Build 2010

    - by jdanforth
    I ran into these problems on a TFS 2010 RC setup where I wanted to deploy a web site as part of the nightly build:

        C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets (3481): Web deployment task failed. (An error occurred when reading the IIS Configuration File 'MACHINE/REDIRECTION'. The identity performing the operation was 'NT AUTHORITY\NETWORK SERVICE'.)
        Filename: \\?\C:\Windows\system32\inetsrv\config\redirection.config
        Error: Cannot read configuration file due to insufficient permissions

    As you can see, I'm running the build service as NETWORK SERVICE, which is quite usual. The first thing I did was give NETWORK SERVICE read access to the whole directory where redirection.config sits: C:\Windows\system32\inetsrv\config. That gave me a new error:

        C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets (3481): Web deployment task failed. (Attempted to perform an unauthorized operation.)

    The reason for this problem was that NETWORK SERVICE didn't have write permission to the place where I'd told MSDeploy to put the web site physically on disk. Once I'd given NETWORK SERVICE the right permissions, MSDeploy completed as expected!

    NOTE! I've not had this problem with TFS 2010 RTM, so it might be just an RC issue!
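
    For reference, a sketch of granting those rights from an elevated prompt (the config path is from the post; the site's physical path is an example you would replace with your own):

        :: Read access to the IIS config store for the build identity
        icacls C:\Windows\System32\inetsrv\config /grant "NT AUTHORITY\NETWORK SERVICE":(OI)(CI)R

        :: Modify access to the folder MSDeploy writes the site to (example path)
        icacls D:\Sites\MyWebSite /grant "NT AUTHORITY\NETWORK SERVICE":(OI)(CI)M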

    Read the article

  • Install Domain Controller – Part 1 of building my own development SharePoint 2010 farm

    - by ybbest
    As memory has become really cheap, a couple of days ago I upgraded my laptop to 12 GB, and I still have my old desktop, so I have decided to build my own SharePoint farm at home. I will document the steps to build a simple SharePoint farm, using Windows Server 2008 R2 and VMware. In this first part of the series, I will create my domain controller. Here are the steps to install it:

    1. Open the command line by going to Run, typing CMD, and then typing dcpromo at the command line. The AD installation wizard will appear; click Next.
    2. Click Next as shown in the screenshot.
    3. Select "Create a new domain in a new forest" and click Next.
    4. Type a domain name (e.g. ybbest.com) and click Next.
    5. In my case, I select the Windows Server 2008 R2 forest functional level and click Next.
    6. Leave the default and click Next. (If you have not set a static IP address, you need to do so now.)
    7. You might get a scary prompt like the screenshot below; just ignore the message and click Yes.
    8. Leave the default settings and click Next.
    9. Type a password to use when you need to restore your domain.
    10. Click Next and restart your computer; this will install your domain controller.
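
    The same promotion can also be scripted rather than clicked through. A hedged sketch of an unattended answer file for dcpromo (the values mirror the steps above; the key names follow my recollection of the 2008 R2 unattend format, so verify them against the documentation before use):

        ; promote.txt -- run with: dcpromo /unattend:promote.txt
        [DCInstall]
        ReplicaOrNewDomain=Domain
        NewDomain=Forest
        NewDomainDNSName=ybbest.com
        ForestLevel=4              ; Windows Server 2008 R2
        DomainLevel=4
        InstallDNS=Yes
        SafeModeAdminPassword=YourRestorePassword
        RebootOnCompletion=Yes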

    Read the article

  • TFS 2010 Build: Dealing with the API restriction error

    - by Jakob Ehn
    Recently I’ve come across this error a couple of times when running builds that execute unit tests using test containers:

        API restriction: The assembly 'file:///C:\Builds\<path>\myassembly.dll' has already loaded from a different location. It cannot be loaded from a new location within the same appdomain.

    Every time I’ve gotten this error, the project has been a web application, and the path to the assembly points down to the _PublishedWebsites directory that is created beneath the Binaries folder during a team build. The error description really says it all (although slightly cryptically): when using test containers, MSTest needs to load all assemblies and see if they contain any unit tests. During this search, it finds 'myassembly.dll' in two different locations. First it is found directly beneath the Binaries folder, and then it is also found beneath the _PublishedWebsites\Project\bin folder. The reason is that the default setting for test containers in a TFS 2010 build definition is **\*test*.dll:

    This pattern means that MSTest will search recursively for all assemblies beneath the Binaries folder, and during the search it will find MyAssembly.dll twice. The solution is simple: set the 'Test assembly file specification' property to *test*.dll instead, which disables the recursive search.

    Read the article

  • BUILD 2012 day 1 Keynote recap

    - by pluginbaby
    On October 30, 2012, Steve Ballmer kicked off the first BUILD conference keynote. Steve shared some insights around Windows 8:

    - 4 million customers upgraded to Windows 8 over the weekend since the October 26 release (so in 3 days only!).
    - Focus on sharing code between Windows 8 and Windows Phone 8.
    - Syncing everything through SkyDrive.
    - Xbox Music free streaming and Xbox Smart Glass.

    He did all the demos himself, showing off great “Windows 8 generation” devices already available (including an 82-inch Windows 8 “slate” by Perceptive Pixel). Steve Guggenheimer (Microsoft's Corporate Vice President, DPE) talked about the business opportunity with Windows 8.

    Notable announcements of day 1:

    - The Windows Phone 8 SDK is now available at dev.windowsphone.com (includes the SDK, a free version of VS2012, Blend 5, and emulators).
    - Release of the .NET Framework for Windows Phone 8: the ability to use C# 5 or Visual Basic 11 features in your code (async programming model, ...) and to share code between WP8 and Windows Store apps.
    - Windows Phone 8 individual developer registration is reduced to $8 for the next 8 days! (hurry up…)

    Note: strange absence of Steven Sinofsky on stage…

    Watch the entire keynote online: http://channel9.msdn.com/Events/Build/2012/1-001
    Read the full transcript: http://www.microsoft.com/en-us/news/Speeches/2012/10-30BuildDay1.aspx

    Read the article

  • How to work with the Firefox nightly build [Minefield]

    - by anirudha
    Most developers love Firefox for its plugins, which make development easier and faster. Many of us use the Firefox nightly build, aka Minefield, which updates daily. There is one little problem with Minefield: Mozilla's add-ons site carries fewer plugins that are marked compatible with the nightly build. The solution is to install the plugins you want from the developer's own site instead of from Mozilla's add-ons site, because many popular plugins already have development versions that support Firefox 4.

    Here is how I do it: go to Minefield > Add-ons (or Toolbar > Add-ons), and you will see the list of plugins that are not compatible with Minefield. Click the "more" link for the plugin you want to get working in Firefox 4. That takes you to the plugin's page on Mozilla's add-ons site, or perhaps redirects you straight to the developer's site. If you land on the plugin page, go to the developer's site by clicking the "more about this developer" link. There you will often find the same version, or maybe a newer one, than the one on Mozilla's add-ons site. Install the plugin and restart Minefield. You will see that most plugins now work.

    Read the article

  • EPPM Webcast Series Part II: Build – Consistently delivering successful projects to ensure financial success

    - by Sylvie MacKenzie, PMP
    Oracle Primavera invites you to the second in a series of three webcasts linking Enterprise Project Portfolio Management with enhanced operational performance and better financial results. Join us for the next installment of our “Plan, Build, Operate” webcast series, as we look to address the challenges organizations face during the execution phase of their projects.

    Webcast II: Build – Consistently delivering successful projects to ensure financial success. This webcast will look at three key questions:

    - How do you maintain consistency in delivery whilst maintaining visibility and control?
    - How do you deal with project risk and mitigation strategies?
    - How do you ensure accurate reporting?

    Hear from Geoff Roberts, Industry Strategist at Oracle Primavera. Geoff will look at how solutions can help to address the challenges around:

    - Visibility and governance
    - Communication and complex coordination
    - Collecting and reporting progress
    - Measuring and reporting

    It is imperative that organizations understand the impact projects can have on their business. Attend this webcast and understand how consistently delivering successful projects is vital to the financial success of an asset-intensive organization. Register today! Please forward this invite to colleagues who you think may benefit from attending.

    Read the article

  • Using Sandcastle to build code contracts documentation

    - by DigiMortal
    In my last posting about code contracts I showed how code contracts are documented in XML documents. In this posting I will show you how to get code contracts documented with Sandcastle and Sandcastle Help File Builder.

    Before we start, let's download the Sandcastle tools we need: Sandcastle and Sandcastle Help File Builder. Install Sandcastle first and then Sandcastle Help File Builder. Because we are generating only HTML-based documentation to upload to a server, we don't need any other tools. Of course, we need Cassini or IIS, but I expect one of those to already be on your machine.

    Open your project and turn on XML documentation for the project and the contracts. Now let's run Sandcastle Help File Builder. We have to create a new project and add our Visual Studio solution to it. Now set the HelpFileFormat parameter value to Website and let the builder build the help. You have to wait about two or three minutes until the help is ready. Take a look at the documentation that Sandcastle generated: you will not see much information there about code contracts and their rules.

    Enabling code contracts documentation

    Now let's include code contracts in the documentation. Follow these steps:

    1. Open the Sandcastle folder and make a copy of the vs2005 folder.
    2. Open the CodeContracts folder (c:\program files\microsoft\contracts\) and unzip the archive from its sandcastle folder.
    3. Copy all unzipped files to the Sandcastle folder.
    4. Create (yes, create anew) and build your Sandcastle Help File Builder documentation project again.
    5. Open the help. In my case I see something like this now.

    As you can see, the contracts are documented pretty well. We can easily turn on code contracts XML documentation generation, and all our contracts are documented automatically. To get the documentation to work we had to use the Sandcastle help file fixes that are installed with Code Contracts, and if we had an existing Sandcastle Help File Builder project we had to recreate it from scratch to get the new rules accepted. Once the documentation support for contracts works, we have to do nothing more to get contracts documented.
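
    For context, a minimal example of the kind of contract that ends up in the generated help (an illustrative sketch, not code from the original post):

        using System.Diagnostics.Contracts;

        public class Account
        {
            public decimal Balance { get; private set; }

            public void Withdraw(decimal amount)
            {
                // With contract XML documentation enabled, these conditions are
                // written to the contracts XML file and, with the Sandcastle
                // fixes applied above, show up on the method's help page.
                Contract.Requires(amount > 0);
                Contract.Ensures(Balance == Contract.OldValue(Balance) - amount);

                Balance -= amount;
            }
        }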

    Read the article

  • How to get multiple open-source projects to use a standard way of doing something.

    - by Marco
    Problem

    In the last couple of weeks, I've used three different "repository" tools (listed in alphabetical order): gradle, ivy, and maven. I'm calling them "repository" tools because I've also used sbt, which fortunately uses ivy to manage its cache, or local repository. Each of these tools will create its own repository. The defaults are:

    - ~/.m2/repository for maven
    - ~/.gradle/cache
    - ~/.ivy2/cache

    Why can't they all use the same cache?

    Goal

    I'd like to change the world so that all three build tools could use the same cache. I'm looking for advice about issues I'm likely to run into and smart ways to get around them. By "use the same cache", I do not mean "retrieve from another build tool's cache". I mean "retrieve from and store in another build tool's cache". While I could go ahead and submit issues to the three projects, I know from experience (as a developer on an open-source project) that if you want something done, you're best off getting it done yourself. Also, it seems like I need to get all three communities on board to some degree.

    What is the recommended approach for getting this kind of thing done? How do I approach the different communities? Do I work on patches for the three different projects, or would it be better to create my own "interface" project that deals with these issues and have the three tools interface with that? Is this a standards question that I need to address on that front? Lastly, if I'm missing something and this is possible (in a globally configurable fashion), then please let me know.
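
    For what it's worth, each tool's cache location is already configurable, which at least lets you put the caches side by side while experimenting (a sketch; this co-locates the caches but does not give the tools a shared layout):

        # Maven: ~/.m2/settings.xml
        #   <settings>
        #     <localRepository>/shared/artifact-cache/maven</localRepository>
        #   </settings>

        # Ivy: ivysettings.xml
        #   <ivysettings>
        #     <caches defaultCacheDir="/shared/artifact-cache/ivy"/>
        #   </ivysettings>

        # Gradle: relocate its entire user home (cache included)
        export GRADLE_USER_HOME=/shared/artifact-cache/gradle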

    Read the article

  • Cleaning your BizTalk Build Server

    - by Michael Stephenson
    Just a little note for myself, this one. At one of my customers, where it is still BizTalk 2006, one of the build servers is intermittently having issues, so I wanted to run a script periodically to clean things up a little. The script below is an example of how you can stop Cruise Control and all of the BizTalk services, clean the BizTalk databases, reset the backup process, and then kick everything off again. This should keep the server a little cleaner and reduce the number of builds that occasionally fail due to ad hoc environmental issues.

        REM Server Clean Script
        REM ===================
        REM This script is run to move the build server back to a clean state

        echo Stop Cruise Control
        net stop CCService

        echo Stop IIS
        iisreset /stop

        echo Stop BizTalk Services
        net stop BTSSvc$<Name of BizTalk Host>
        <Repeat for other BizTalk services>

        echo Stop SSO
        net stop ENTSSO

        echo Stop SQL Job Agent
        net stop SQLSERVERAGENT

        echo Clean Message Box
        sqlcmd -E -d BizTalkMsgBoxDB -Q "Exec bts_CleanupMsgbox"
        sqlcmd -E -d BizTalkMsgBoxDB -Q "Exec bts_PurgeSubscriptions"

        echo Clean Tracking Database
        sqlcmd -E -d BizTalkDTADb -Q "Exec dtasp_CleanHMData"

        echo Reset TDDS Stream Status
        sqlcmd -E -d BizTalkDTADb -Q "Update TDDS_StreamStatus Set lastSeqNum = 0"

        echo Force Full Backup
        sqlcmd -E -d BizTalkMgmtDB -Q "Exec sp_ForceFullBackup"

        echo Clean Backup Directory
        del E:\BtsBackups\*.* /q

        echo Start SSO
        net start ENTSSO

        echo Start SQL Job Agent
        net start SQLSERVERAGENT

        echo Start BizTalk Services
        net start BTSSvc$<Name of BizTalk Host>
        <Repeat for other BizTalk services>

        echo Start IIS
        iisreset /start

        echo Start Cruise Control
        net start CCService

    Read the article

  • Run Grunt task in Visual Studio Release Build with a bat file

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2014/08/19/run-grunt-task-in-visual-studio-release-build-with-a.aspx

    1. Add a BeforeBuild target to your csproj file. Edit the XML with a text editor:

        <Target Name="BeforeBuild">
          <Exec Condition="'$(Configuration)' == 'Release'" Command="script-optimize.bat" />
        </Target>

    2. Create the script-optimize.bat:

        REM "%~dp0" maps to the directory where this file exists
        cd %~dp0\..\YourProjectFolder
        call npm uninstall grunt
        call npm uninstall grunt
        call npm install --cache-min 604800 -g grunt-cli
        call npm install --cache-min 604800
        grunt typescript requirejs copy less:compile less:mincompile

    This grunt command will compile TypeScript, run the RequireJS optimizer, and compile and minimize LESS.

    3. Make it use the minified code when the Web.config compilation debug attribute is set to false:

        <!-- These CustomCollectFiles actions are used so that the Scripts-Release folder/files are included
             when publishing even though they are not project references -->
        <Target Name="CustomCollectFiles">
          <ItemGroup>
            <_CustomFiles Include="Scripts-Release\**\*" />
          </ItemGroup>
        </Target>

    That should be all you need to get a Grunt task to minify and combine JS (plus other tasks) in a Visual Studio Release build with debug = false. This is a great video of Steve Sanderson talking about SPAs, npm, Knockout, Grunt, Gulp, etc. I highly recommend it.

    Read the article

  • So You Want To Build a SPARC Cloud

    - by user12601629
    Did you ever wish you could get the industrial-strength power of UNIX/RISC with the flexibility of cloud computing? Well, now you can! With recent advances from Oracle it's possible to build an incredibly high-performance, flexible, available virtualized infrastructure based on Solaris and SPARC. Here's the recipe!

    Authored in collaboration across the Oracle "Systems Group" team, we now have a complete best-practice guide for you. Click below to download it: Best Practices for Building a Virtualized SPARC Computing Environment.

    Inside you'll find recommendations for how and when to leverage technologies like:

    - SPARC T4
    - OVM for SPARC hypervisor (version 2.2 and newer)
    - Solaris 11
    - Ops Center 12c
    - ZFS Storage Appliance
    - Oracle network switches

    By following these best practices, you'll be able to construct a dynamic, virtualized infrastructure that allows for:

    - Easy, GUI-based provisioning of new VMs
    - Automated HA failover in the event of physical server failures
    - Automatic load balancing across a cluster of VM hosts
    - Complete end-to-end monitoring

    You should download this paper and check it out. Even if you aren't planning on buying all new hardware, and instead want to transform some existing gear into a dynamic virtualized environment, this paper will give you concrete info on what to do and the trade-offs you'll make. Have fun getting started on your journey to build a SPARC cloud!

    Read the article

  • Any way to avoid creating a huge C# COM interface wrapper when only a few methods needed?

    - by Paul Accisano
    Greetings all. I’m working on a C# program that needs to get the index of the hot item in Windows 7 Explorer’s new ItemsView control. Fortunately, Microsoft has provided a way to do this through UI Automation, by querying custom properties of the control. Unfortunately, the System.Windows.Automation namespace inexplicably does not seem to provide a way to query custom properties! This leaves me in the undesirable position of having to completely ditch the C# Automation namespace and use only the unmanaged COM version.

    One way to do it would be to put all the Automation code in a separate C++/CLI module and call it from my C# application. However, I would like to avoid this option if possible, as it adds more files to my project, and I’d have to worry about 32/64-bit problems and such. The other option is to make use of the ComImport attribute to declare the relevant interfaces and do everything through COM interop. This is what I would like to do. However, the relevant interfaces, such as IUIAutomation and IUIAutomationElement, are FREAKING HUGE. They have hundreds of methods in total and reference tons and tons of other interfaces (which I assume I would also have to declare), almost all of which I will never use. I don’t think the UI Automation interfaces are declared in any type library either, so I can’t use TLBIMP.

    Is there any way I can avoid having to manually translate a bajillion method signatures into C# and instead declare only the ten or so methods I actually need? I see that C# 4.0 added a new “dynamic” type that is supposed to ease COM interop; is that at all relevant to my problem? Thanks
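
    One commonly cited trick, offered here as a hedged sketch rather than a confirmed answer: for an IUnknown-based interface you cannot simply omit methods, because calls are dispatched by vtable slot, but you can replace every method you never call with a placeholder, as long as each slot keeps its original position. (The "dynamic" type likely won't help here, since dynamic COM dispatch relies on IDispatch, and the UI Automation COM interfaces are plain custom interfaces.)

        using System.Runtime.InteropServices;

        // Sketch: pad unused vtable slots instead of translating every method.
        // The Guid below is a placeholder; use the real IID from UIAutomationClient.h.
        // The one declared method is a hypothetical example, not the real signature.
        [ComImport]
        [Guid("00000000-0000-0000-0000-000000000000")]
        [InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
        interface IUIAutomationElementLite
        {
            void Slot1();   // placeholder for an unused method (slot order must match)
            void Slot2();   // placeholder
            int GetHotItemIndex();   // hypothetical: the one method actually needed
        }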

    Read the article

  • How to check for an existing executable before running it in a post-build event in VS2008?

    - by wtaniguchi
    Hey all, I'm trying to use SubWCRev to get the current revision number of our SVN repository and put it in a file so I can show it in the UI. As I'm working with a web app, I use the following post-build command line:

        "SubWCRev.exe" "$(SolutionDir)." "$(ProjectDir)Content\js\revnumber.js.tpl" "$(ProjectDir)Content\js\revnumber.js"

    It works great, but now I want to make sure SubWCRev exists before running it, so I can skip this post-build step if a fellow developer is not running TortoiseSVN. I tried a few batch snippets here, but couldn't figure this out. Any ideas? Thanks!
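
    A sketch of one way to guard the call (assuming SubWCRev.exe is on the PATH when TortoiseSVN is installed; where.exe exists on Vista and later, so on XP you would test a known install path instead):

        REM Skip the revision stamp quietly if SubWCRev isn't installed
        where /q SubWCRev.exe
        if errorlevel 1 (
            echo SubWCRev not found, skipping revision stamp.
        ) else (
            SubWCRev.exe "$(SolutionDir)." "$(ProjectDir)Content\js\revnumber.js.tpl" "$(ProjectDir)Content\js\revnumber.js"
        )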

    Read the article

  • Why does "enable-shared failed" happen on a libjpeg build for OS X?

    - by BryanWheelock
    I'm trying to install libjpeg on OS X to fix a problem with the Python Imaging Library's JPEG setup. I downloaded libjpeg from http://www.ijg.org/files/jpegsrc.v7.tar.gz and then began to set up the configuration:

        cp /usr/share/libtool/config.sub .
        cp /usr/share/libtool/config.guess .
        ./configure –enable-shared

    However, the enable-shared flag didn't seem to work:

        $ ./configure –-enable-shared
        configure: WARNING: you should use --build, --host, --target
        configure: WARNING: invalid host type: –-enable-shared
        checking build system type... Invalid configuration `–-enable-shared': machine `–-enable' not recognized
        configure: error: /bin/sh ./config.sub –-enable-shared failed

    I've done lots of Google searches and I can't figure out where the error is or how to work around it.
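
    A hedged observation from the transcript itself: the flag as typed starts with a typographic en-dash (–) rather than two ASCII hyphens, which is why configure treats it as a host type instead of an option (this commonly happens when a command is copied from a web page). Retyping the flag by hand should clear it:

        # Use two plain ASCII hyphens, not a pasted en-dash
        ./configure --enable-shared
        make
        sudo make install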

    Read the article
