Search Results

Search found 28847 results on 1154 pages for 'project organization'.


  • How to add flags to RC.EXE through QMake .pro makefiles

    - by Hernán
    I have the following definition in my .pro file: RC_FILE = app.rc This RC file contains a global include at the top: #include "version_info.h" The version_info.h header lives in a common header directory. Since RC.EXE takes the INCLUDE environment variable into consideration, according to the MS documentation, my build batch file sets it up accordingly: SET INCLUDE=%PROJECTDIR%\version;%INCLUDE% ... QMAKE project.pro -spec win32-msvc2008 -r CONFIG += release This works perfectly: RC reads the INCLUDE variable, so "version_info.h" is included in every RC file properly. The problem appears when I generate a VS solution (or import the project through the VS Add-in). The RC invocation contains no /I flag (as I expected), but it also does not read the INCLUDE variable, even when I set it up through the system 'Environment Variables' dialog in XP. So I'm stuck, with two alternatives I could not get to work: make the VS RC.exe invocation honour the INCLUDE variable (it didn't work as either a user or a system variable), or force QMake to pass a /I flag to the RC invocation and have that flag imported into the project settings (Resource Compiler properties). Thanks in advance.
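
    For what it's worth, newer qmake releases expose a dedicated variable for this; a hedged sketch of the .pro approach, assuming a qmake recent enough to support RC_INCLUDEPATH (older Qt 4 toolchains would still need the INCLUDE environment variable workaround), and assuming version_info.h lives under a version/ directory next to the .pro file:

      # Sketch only: RC_INCLUDEPATH is passed to the Windows resource compiler
      # in later qmake versions; $$PWD/version is an assumed header location.
      RC_FILE = app.rc
      win32: RC_INCLUDEPATH += $$PWD/version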

    Read the article

  • Why is MonoDevelop compiling with csc.exe?

    - by korchev
    I am trying to use MonoDevelop (2.4 beta 1) on Windows (7 x64) in order to test a .NET application on Mono (2.6.4). For some reason MonoDevelop is not using the Mono tool chain to build the application; it compiles it with the Microsoft tool chain - C:\Windows\Microsoft.NET\Framework\v3.5\csc.exe. The project I am trying to build is a simple ASP.NET MVC application generated from the "New ASP.NET MVC application" template. The "Runtime Version" dropdown in Project Options | Build | General shows "Mono/.NET 3.5". What gives? Is there a way to change the .NET tool chain?

    Read the article

  • Silverlight 3 navigation page not available in VS as item to add

    - by Steve Brownell
    I've recently upgraded my computer from Vista Home Premium 64-bit to Windows 7 Home Premium 64-bit. I've re-installed VS 2008 Web Express and re-installed all the Silverlight SDKs, tools, etc. But now when I want to add a Silverlight Navigation Page, it is not available to me in the list of items that can be added. The navigation DLL is installed, as my project existed before the OS upgrade. The program still runs just fine as is, but I want to add another navigation page item to the project, and I'm stumped for how to do it. Any ideas? Thanks, Steve

    Read the article

  • Team build of Web Projects generates App_web_xxxx.dll files and TFSBuild.Proj Script

    - by Steve Johnson
    Hi all, I have a web application that has some non-web projects as well. When using a Web Deployment Project, a single assembly is generated for all the aspx.vb files. When using Team Build (TFS 2008), a large number of App_Web_xxx.dll files are generated instead of a single assembly. How can I solve this problem and change the TFSBuild.proj file so that it generates a single web assembly instead of many? Edit: I guess that's because the MERGE operation is not occurring the way it does for the Web Deployment Project in my solution. How can I enable merging of the App_Web_*.dll files into a single Web.dll assembly and delete the satellite assemblies? Here is the relevant part of my TFSBuild.proj file (my web project is in the Release|.NET configuration and all other projects within the solution are in Release|Any CPU): true .\Debug true true Web true false .\Release true true Web true Please tell me what corrections I need to make.
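
    The flattened values quoted above (true .\Release true true Web true) look like Web Deployment Project merge settings whose XML element names were lost in the excerpt; a hedged reconstruction of the kind of property group that controls the merge, with element names assumed from Microsoft.WebDeployment.targets rather than taken from the original post:

      <!-- Hypothetical reconstruction of a WDP/TFSBuild.proj property group. -->
      <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|.NET' ">
        <DebugSymbols>false</DebugSymbols>
        <OutputPath>.\Release</OutputPath>
        <EnableUpdateable>true</EnableUpdateable>
        <UseMerge>true</UseMerge>
        <SingleAssemblyName>Web</SingleAssemblyName>
        <DeleteAppCodeCompiledFiles>true</DeleteAppCodeCompiledFiles>
      </PropertyGroup>

    With UseMerge set to true, the Web Deployment Project targets run aspnet_merge.exe to combine the App_Web_*.dll outputs into the single assembly named by SingleAssemblyName.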

    Read the article

  • Visual Studio Database Professional GDR R2 takes a long time to serialize DBMDL file

    - by Nicolas Webb
    The amount of time it takes to completely serialize the DBMDL (to finish "Your project will be available after 10000 operations are completed") is becoming a hindrance to productivity. I've done what I can to optimize disk activity (excluding my personal TEMP folder from the virus scanner, along with my local source repository). Short of getting an SSD I'm not sure what else I can do along those lines. I believe it has something to do with how the project is organized. The finished DBMDL file is roughly 150MB. Others throughout our organization do not seem to have this issue. Has anyone had to deal with this?

    Read the article

  • Unable to set environment variable in Platform Builder (Windows CE 6.0)

    - by mukesh
    While building the Windows CE project I get the following two errors: Error 1 BUILD: [00:0000000015:ERRORE] C:\WINCE600\PLATFORM\ICOP_Vortex86_60CS\SRC\OAL\OALLIB\obj\x86\debug_objects.mac: create file failed. Error 3 BLDDEMO: There were errors building mytest. I think this comes from an unset environment variable. I am having trouble setting the environment variable IMGRAM128 in the project properties; it gives the error: "The variable IMGRAM128 is associated with the 128 RAM catalog item. Would you like to set this variable by adding the catalog item in the OS design? The variable will not be added to the environment tab.............."

    Read the article

  • Flex ANT tasks can't find my assets

    - by lach
    I'm attempting to compile my Flex project with an ANT build script. One of my MXML components references an external XML data file, like this: <mx:XML id="treeData" source="assets/data/help.xml" /> When I build the project using Flex Builder, it compiles fine. However, when I try to compile it using ANT, I get the following error: Error: Problem finding external XML: assets/data/help.xml Why isn't ANT finding the XML file? Apparently it knows the source path; otherwise it would not have found the component to begin with. I added the source path to the target anyway, but it doesn't seem to have made any difference: <source-path path-element="${SRC}" /> Any ideas?

    Read the article

  • MSBuild appears to only use old output files for custom build tools

    - by sixlettervariables
    I have an ANTLR grammar file as part of a C# project file and followed the steps outlined in the User Manual. <Project ...> <PropertyGroup> <Antlr3ToolPath>$(ProjectDir)tools\antlr-3.1.3\lib</Antlr3ToolPath> <AntlrCleanupPath>$(ProjectDir)AntlrCleanup\$(OutputPath)</AntlrCleanupPath> </PropertyGroup> <ItemGroup> <Antlr3 Include="Grammar\Foo.g"> <OutputFiles>FooLexer.cs;FooParser.cs</OutputFiles> </Antlr3> <Antlr3 Include="Grammar\Bar.g"> <OutputFiles>BarLexer.cs;BarParser.cs</OutputFiles> </Antlr3> </ItemGroup> <Target Name="GenerateAntlrCode" Inputs="@(Antlr3)" Outputs="%(Antlr3.OutputFiles)"> <Exec Command="java -cp %22$(Antlr3ToolPath)\antlr-3.1.3.jar%22 org.antlr.Tool -message-format vs2005 @(Antlr3Input)" Outputs="%(Antlr3Input.OutputFiles)" /> <Exec Command="%22$(AntlrCleanupPath)\AntlrCleanup.exe%22 @(Antlr3Input) %(Antlr3Input.OutputFiles)" /> </Target> <ItemGroup> <!-- ...other files here... --> <Compile Include="Grammar\FooLexer.cs"> <AutoGen>True</AutoGen> <DesignTime>True</DesignTime> <DependentUpon>Foo.g</DependentUpon> </Compile> <Compile Include="Grammar\FooParser.cs"> <AutoGen>True</AutoGen> <DesignTime>True</DesignTime> <DependentUpon>Foo.g</DependentUpon> </Compile> <!-- ... --> </ItemGroup> </Project> For whatever reason, the Compile steps only use old versions of the code, no amount of tweaking appears to help.
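
    A hedged sketch of how this target might be wired up so the generated sources are refreshed before the C# compile; it assumes the item group is consistently named Antlr3 (the excerpt mixes @(Antlr3) with an undefined @(Antlr3Input)), and that the override below appears after the Microsoft.CSharp.targets import so BuildDependsOn is already defined:

      <!-- Sketch only: chain the generator ahead of the normal build (MSBuild 3.5 style);
           on MSBuild 4.0, BeforeTargets="BeforeBuild" on the target is an alternative. -->
      <PropertyGroup>
        <BuildDependsOn>GenerateAntlrCode;$(BuildDependsOn)</BuildDependsOn>
      </PropertyGroup>
      <Target Name="GenerateAntlrCode" Inputs="@(Antlr3)" Outputs="%(Antlr3.OutputFiles)">
        <!-- Reference the Antlr3 item group consistently; Antlr3Input is never defined. -->
        <Exec Command="java -cp %22$(Antlr3ToolPath)\antlr-3.1.3.jar%22 org.antlr.Tool -message-format vs2005 @(Antlr3)" />
        <Exec Command="%22$(AntlrCleanupPath)\AntlrCleanup.exe%22 @(Antlr3) %(Antlr3.OutputFiles)" />
      </Target>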

    Read the article

  • RichTextBox formatting help in WPF using C#

    - by Ashwin
    I am making a project for bulk emailing. I have added a RichTextBox to it. I want users to enter the email message body in it, and I want to give them the ability to format it: the RichTextBox should be able to make the content bold, italic, or underlined, align it left, right, and center, increase or decrease the font size, change the color, and change the font family (for example, Times New Roman), depending on which button is clicked. How should I go about it? Also let me know which references I should add to my project. It's urgent; please help.
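
    A minimal sketch of wiring formatting buttons to a WPF RichTextBox, assuming a control named messageBody in the XAML (the control and handler names are hypothetical); the project needs references to PresentationFramework, PresentationCore and WindowsBase:

      // Sketch of button handlers in a Window code-behind; "messageBody" is an
      // assumed x:Name for the RichTextBox declared in the matching XAML.
      using System.Windows;
      using System.Windows.Controls;
      using System.Windows.Documents;
      using System.Windows.Media;

      public partial class ComposeWindow : Window
      {
          private void BoldButton_Click(object sender, RoutedEventArgs e)
          {
              // ToggleItalic, ToggleUnderline, IncreaseFontSize, etc. work the same way.
              EditingCommands.ToggleBold.Execute(null, messageBody);
          }

          private void AlignCenterButton_Click(object sender, RoutedEventArgs e)
          {
              EditingCommands.AlignCenter.Execute(null, messageBody);
          }

          private void FontButton_Click(object sender, RoutedEventArgs e)
          {
              // Apply an explicit font family, size and colour to the selected text.
              TextSelection selection = messageBody.Selection;
              selection.ApplyPropertyValue(TextElement.FontFamilyProperty, new FontFamily("Times New Roman"));
              selection.ApplyPropertyValue(TextElement.FontSizeProperty, 16.0);
              selection.ApplyPropertyValue(TextElement.ForegroundProperty, Brushes.DarkRed);
          }
      }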

    Read the article

  • Why not allow mutation of the this binding?

    - by gnucom
    Hi everyone, I'm building an interpreter/compiler for a school project (well, now it's turning into a hobby project) and an instructor warned me not to allow mutation of the 'this' binding (he said it was gross and made a huge deal about it), but I never learned why this is so dangerous or bad. I'm very curious about why this is so bad. I figured this sort of feature could be useful in some way or another. I'm wondering if anyone familiar with building languages can tell me what sort of problems mutation on the 'this' binding can cause, and if they know of any cool or useful tricks that one could do if it actually were allowed. Do any languages that you're aware of allow mutation of 'this'? Thanks,

    Read the article

  • Problem running a PHP script that uses MySQL on Tomcat

    - by Jack
    I am using Tomcat 6 with JavaBridge. I have stored my PHP script in the following location: C:\Program Files\apache-tomcat-6.0.26\webapps\JavaBridge\project\test.php In test.php I am using curl and MySQL. The php.ini in JavaBridge is stored in the following location: C:\Program Files\apache-tomcat-6.0.26\webapps\JavaBridge\WEB-INF\cgi\php.ini and its contents are - extension_dir="C:\Program Files\apache-tomcat-6.0.26\webapps\JavaBridge\WEB-INF\cgi\x86-windows\ext" include_path="C:\Program Files\apache-tomcat-6.0.26\webapps\JavaBridge\WEB-INF\pear;." There is also a config file called mysql.ini whose contents are - extension = php_mysql.dll I had also installed WAMP earlier, so I copied all the DLLs from C:\wamp\bin\php\php5.3.0\ext to C:\Program Files\apache-tomcat-6.0.26\webapps\JavaBridge\WEB-INF\cgi\x86-windows\ext When I start Tomcat and run my script I get the following error - Fatal error: Call to undefined function mysqli_connect() in C:\Program Files\apache-tomcat-6.0.26\webapps\JavaBridge\project\test.php on line 534 Please help.
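
    Given the config shown above, a likely cause is that only php_mysql.dll is being loaded while the script calls the mysqli_* API; a hedged fix is to enable the mysqli extension in the same ini file and restart Tomcat (the file name follows the excerpt, the extra line is the suggestion):

      ; mysql.ini (or php.ini): load both MySQL client extensions.
      ; extension_dir must point at the folder that actually contains these DLLs.
      extension = php_mysql.dll
      extension = php_mysqli.dll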

    Read the article

  • How to set an icon for a Qt application created with the Qt Visual Studio Add-in?

    - by mosg
    Hello. Here is what I have: Visual Studio 2008 (on 32-bit Windows XP), Qt libraries 4.6.2 for Windows (VS 2008, 194 MB), Visual Studio Add-in (44 MB). After I installed all the software, I created a simple Qt Application project with Visual Studio: menu File | New | Project... and Qt4 Projects | Qt Application. I built it, and here is the question: how do I set an application icon on my compiled exe file? I need to see the specified ICO in Explorer. I am not interested in the old method via MyProject.pro: create a .ico file with both 16x16 and 32x32-pixel versions of the icon (you can do this in Visual Studio); create a .rc file containing the following text: IDI_ICON1 ICON DISCARDABLE "myIcon.ico"; add the following to your .pro file: RC_FILE = myFile.rc; run qmake. Thanks.

    Read the article

  • VS2008 ASP.NET spits out gibberish, possibly a wrong-encoding issue

    - by Edward M Meshuris
    Hello, I have inherited a project that was originally written in VS2005. I have made a few changes, but all are design changes. Now when I run the project using Visual Studio's web server, the page shows up just fine in IE8; however, in Firefox 3.6.3 I get gibberish (a full page of this): ?I?%&/m?{J?J??t??$?@?????iG#)?*??eVe]f@?? ??{???{??;?N'????\fdl??J??!????~|?"~???????7????t?.???WO???? m???{'w?}?4??????x'}??W?{???G?G?]=?{???j|uo\?w???Pv??? My setup: Windows 7 Professional (all patches), VS2008 SP1 (I think all patches). Thank you for your help! -Edward

    Read the article

  • Automatically created NSManagedObject subclasses don't use ARC

    - by Jordan
    My project is ARC enabled (the build settings have Objective-C Reference Counting set to YES). There are no file exceptions to this, it is enabled project wide. (Latest stable version of Xcode). When I create an NSManagedObject subclass via File New for a Core Data entity, the generated header uses the following in its property declarations: @property (nonatomic, retain) But 'retain' is not ARC!! Is this a bug, or is there something I'm missing or not understanding? There are no build warnings - if this is a bug though, how can I remedy it?
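
    For reference, under ARC the retain attribute in a property declaration is treated as a synonym for strong, so the generated header is ARC-safe even though it uses the older spelling; a small sketch (the Person entity and name attribute are made up for illustration):

      // Under ARC these two attribute spellings give the same ownership semantics.
      #import <CoreData/CoreData.h>

      @interface Person : NSManagedObject
      @property (nonatomic, retain) NSString *name;    // what the generator emits
      // @property (nonatomic, strong) NSString *name; // equivalent ARC-native spelling
      @end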

    Read the article

  • Best practices for organizing .NET P/Invoke code to Win32 APIs

    - by Paul Sasik
    I am refactoring a large and complicated code base in .NET that makes heavy use of P/Invoke to Win32 APIs. The structure of the project is not the greatest and I am finding DllImport statements all over the place, very often duplicated for the same function, and also declared in a variety of ways: the import directives and methods are sometimes declared as public, sometimes private, sometimes as static and sometimes as instance methods. My worry is that refactoring may have unintended consequences, but this might be unavoidable. Are there documented best practices I can follow that can help me out? My instinct is to organize a static/shared Win32 P/Invoke API class that lists all of these methods and associated constants in one file... (The code base is made up of over 20 projects with a lot of windows message passing and cross-thread calls. It's also a VB.NET project upgraded from VB6 if that makes a difference.)
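
    One widely documented convention (the FxCop/Code Analysis rule CA1060) is to gather the declarations into internal static NativeMethods-style classes rather than scattering them as instance members; a sketch in C# (the same shape works as a Friend NotInheritable Class in VB.NET), with the specific imports chosen only as illustrations:

      using System;
      using System.Runtime.InteropServices;

      // Single, central home for Win32 declarations: internal, static, never instance methods.
      // (Code Analysis also recognizes SafeNativeMethods and UnsafeNativeMethods splits.)
      internal static class NativeMethods
      {
          internal const uint WM_SETREDRAW = 0x000B;

          [DllImport("user32.dll", CharSet = CharSet.Auto, SetLastError = true)]
          internal static extern IntPtr SendMessage(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam);

          [DllImport("kernel32.dll", SetLastError = true)]
          [return: MarshalAs(UnmanagedType.Bool)]
          internal static extern bool CloseHandle(IntPtr hObject);
      }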

    Read the article

  • The Data Scientist

    - by BuckWoody
    A new term - well, perhaps not that new - has come up and I’m actually very excited about it. The term is Data Scientist, and since it’s new, it’s fairly undefined. I’ll explain what I think it means, and why I’m excited about it.

    In general, I’ve found the term deals at its most basic with analyzing data. Of course, we all do that, and the term itself in that definition is redundant. There is no science that I know of that does not work with analyzing lots of data. But the term seems to refer to more than the common practices of looking at data visually, putting it in a spreadsheet or report, or even using simple coding to examine data sets. The term Data Scientist (as far as I can make out this early in its use) is someone who has a strong understanding of data sources, relevance (statistical and otherwise) and processing methods as well as front-end displays of large sets of complicated data. Some - but not all - Business Intelligence professionals have these skills. In other cases, senior developers, database architects or others fill these needs, but in my experience, many lack the strong mathematical skills needed to make these choices properly.

    I’ve divided the knowledge base for someone that would wear this title into three large segments. It remains to be seen if a given Data Scientist would be responsible for knowing all these areas or would specialize. There are pretty high requirements on the math side, specifically in graduate-degree level statistics, but in my experience a company will only have a few of these folks, so they are expected to know quite a bit in each of these areas.

    Persistence

    The first area is finding, cleaning and storing the data. In some cases, no cleaning is done prior to storage - it’s just identified and the cleansing is done in a later step. This area is where the professional would be able to tell if a particular data set should be stored in a Relational Database Management System (RDBMS), across a set of key/value pair storage (NoSQL) or in a file system like HDFS (part of the Hadoop landscape) or other methods. Or do you examine the stream of data without storing it in another system at all? This is an important decision - it’s a foundation choice that deals not only with a lot of expense of purchasing systems or even using Cloud Computing (PaaS, SaaS or IaaS) to source it, but also the skillsets and other resources needed to care and feed the system for a long time. The Data Scientist sets something into motion that will probably outlast his or her career at a company or organization.

    Often these choices are made by senior developers, database administrators or architects in a company. But sometimes each of these has a certain bias towards making a decision one way or another. The Data Scientist would examine these choices in light of the data itself, starting perhaps even before the business requirements are created. The business may not even be aware of all the strategic and tactical data sources that they have access to.

    Processing

    Once the decision is made to store the data, the next set of decisions are based around how to process the data. An RDBMS scales well to a certain level, and provides a high degree of ACID compliance as well as offering a well-known set-based language to work with this data. In other cases, scale should be spread among multiple nodes (as in the case of Hadoop landscapes or NoSQL offerings) or even across a Cloud provider like Windows Azure Table Storage.

    In fact, in many cases - most of the ones I’m dealing with lately - the data should be split among multiple types of processing environments. This is a newer idea. Many data professionals simply pick a methodology (RDBMS with Star Schemas, NoSQL, etc.) and put all data there, regardless of its shape, processing needs and so on. A Data Scientist is familiar not only with the various processing methods, but how they work, so that they can choose the right one for a given need. This is a huge time commitment, hence the need for a dedicated title like this one.

    Presentation

    This is where the need for a Data Scientist is most often already being filled, sometimes with more or less success. The latest Business Intelligence systems are quite good at allowing you to create amazing graphics - but it’s the data behind the graphics that are the most important component of truly effective displays. This is where the mathematics requirement of the Data Scientist title is the most unforgiving. In fact, someone without a good foundation in statistics is not a good candidate for creating reports. Even a basic level of statistics can be dangerous. Anyone who works in analyzing data will tell you that there are multiple errors possible when data just seems right - and basic statistics bears out that you’re on the right track - that are only solvable when you understand why the statistical formula works the way it does.

    And there are lots of ways of presenting data. Sometimes all you need is a “yes” or “no” answer that can only come after heavy analysis work. In that case, a simple e-mail might be all the reporting you need. In others, complex relationships and multiple components require a deep understanding of the various graphical methods of presenting data. Knowing which kind of chart, color, graphic or shape conveys a particular datum best is essential knowledge for the Data Scientist.

    Why I’m excited

    I love this area of study. I like math, stats, and computing technologies, but it goes beyond that. I love what data can do - how it can help an organization. I’ve been fortunate enough in my professional career these past two decades to work with lots of folks who perform this role at companies from aerospace to medical firms, from manufacturing to retail. Interestingly, the size of the company really isn’t germane here. I worked with one very small bio-tech (cryogenics) company that worked deeply with analysis of complex interrelated data.

    So watch this space. No, I’m not leaving Azure or distributed computing or Microsoft. In fact, I think I’m perfectly situated to investigate this role further. We have a huge set of tools, from RDBMS to Hadoop to allow me to explore. And I’m happy to share what I learn along the way.

    Read the article

  • Autotools automatic invocation of lcov after 'make check'

    - by disown
    I have successfully set up an autotools project where the tests compile with instrumentation so I can get a test coverage report. I can get the report by running lcov in the source dir after a successful 'make check'. I now face the problem that I want to automate this step. I would like to add this to 'make check' or to make it a separate goal 'make check-coverage'. Ideally I would like to parse the result and fail if the coverage falls below a certain percentage. The problem is that I cannot figure out how to add a custom target at all. The closest I got was finding this example autotools config, but I can't see where in that project the goal 'make lcov' is added. I can only see some configure flags in m4/auxdevel.m4. Any tips?
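
    A hedged sketch of what a hand-rolled goal in Makefile.am could look like, assuming gcov instrumentation is already enabled and lcov/genhtml are installed; the target and file names are made up for illustration, and recipe lines must be indented with a tab:

      # Illustrative Makefile.am fragment, not taken from the referenced project.
      check-coverage:
      	$(MAKE) $(AM_MAKEFLAGS) check
      	lcov --capture --directory . --output-file coverage.info
      	lcov --remove coverage.info '/usr/*' --output-file coverage.info
      	genhtml coverage.info --output-directory coverage-report

      .PHONY: check-coverage

    To enforce a minimum percentage, one option is a follow-on shell step that parses the output of lcov --summary coverage.info and exits non-zero below the threshold.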

    Read the article

  • SubWebFolder and multiple bin folders with the Website model?

    - by OutOFTouch
    Hi, I am looking for some advice on the best approach to sub-web folders and multiple bin folders in the Website Project model, for adding new pages at a later stage without recompiling the core files of a website and without building a full-fledged plug-in framework API. I am aware of being able to drop compiled DLLs into the main bin folder and to just copy over the new page files to a subfolder, but I am looking for a more organized file/folder approach. Here is how it was done with WAP: Moving the Code-Behind Assemblies/DLLs to a different folder than /BIN with ASP.NET 1.1 Multiple /bin folders in ASP.NET I should also mention that I see I can still do it the old way with the website project model by making the adjustment to the config section mentioned here, but I was wondering if that has any side effects. AssemblyBinding in Web Config and XMLNS
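
    For reference, the web.config adjustment alluded to above is the runtime's assembly-probing setting, which lets the AppDomain look in additional private folders besides bin; a hedged sketch (bin2 is an illustrative folder name):

      <!-- web.config sketch: probe extra private bin folders; "bin2" is illustrative. -->
      <configuration>
        <runtime>
          <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
            <probing privatePath="bin;bin2" />
          </assemblyBinding>
        </runtime>
      </configuration>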

    Read the article

  • Error on Windows using Session from appengine-utilities

    - by fredrik
    Hi, I ran across an odd problem while trying to transfer a project to a Windows machine. In my project I use a session handler (http://gaeutilities.appspot.com/session); it works fine on my Mac, but on Windows I get: Traceback (most recent call last): File "C:\Program Files (x86)\Google\google_appengine\google\appengine\ext\webapp\__init__.py", line 510, in __call__ handler.get(*groups) File "C:\Development\Byggmax.Affiliate\bmaffiliate\admin.py", line 29, in get session = Session() File "C:\Development\Byggmax.Affiliate\bmaffiliate\appengine_utilities\sessions.py", line 547, in __init__ self.cookie.load(string_cookie) File "C:\Python26\lib\Cookie.py", line 628, in load for k, v in rawdata.items(): AttributeError: 'unicode' object has no attribute 'items' Is anyone familiar with the session handler who knows anything about this? All help is welcome! ..fredrik
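
    The traceback points at Python 2's Cookie.BaseCookie.load, which only parses str objects; anything else (including a unicode cookie string) is treated as a dict of morsels, hence the call to .items(). A small self-contained sketch of the behaviour and of one workaround (coercing the cookie string to bytes before loading it):

      # -*- coding: utf-8 -*-
      # Python 2 demo: Cookie.load() parses str, but calls .items() on anything else.
      import Cookie

      jar = Cookie.SimpleCookie()
      jar.load('session=abc123')           # str: parsed fine
      try:
          jar.load(u'session=abc123')      # unicode: AttributeError, as in the traceback
      except AttributeError as exc:
          print 'unicode cookie string fails:', exc

      # Workaround sketch: encode to bytes before handing the value to load().
      jar.load(u'session=abc123'.encode('utf-8'))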

    Read the article

  • OCUnit testing an embedded framework

    - by d11wtq
    I've added a Unit Test target to my Xcode project but it fails to find my framework when it builds, saying: Test.octest could not be loaded because a link error occurred. It is likely that dyld cannot locate a framework or library that the test bundle was linked against, possibly because the framework or library had an incorrect install path at link time. My framework (the main project target) is designed to be embedded and so has an install path of @executable_path/../Frameworks. I've marked the framework as a direct dependency of the test target and I've added it to the "Link Binary with Libraries" build phase. Additionally I've added a first step (after it's built the dependency) of "Copy Files" which simply copies the framework to the unit test bundle's Frameworks directory. Does anyone have any experience with this? I'm not sure what I've missed.

    Read the article

  • CEN/CENELEC Lacks Perspective

    - by trond-arne.undheim
    Over the last few months, two of the European Standardization Organizations (ESOs), CEN and CENELEC, have circulated an unfortunate position statement distorting the facts around fora and consortia. For the benefit of outsiders to this debate, let's just say that this debate regards whether and how the EU should recognize standards and specifications from certain fora and consortia based on a process evaluating the openness and transparency of such deliverables. The topic is complex, and somewhat confusing even to insiders, but nevertheless crucial to the European economy. As far as I can judge, their positions are not based on facts. This is unfortunate. For the benefit of clarity, here are some of the observations they make:

    a) "Most consortia are in essence driven by technology companies making hardware and software solutions, by definition very few of the largest ones are European-based".
    b) "Most consortia lack a European presence, relevant Committees, even those that are often cited as having stronger links with Europe, seem to lack an overall, inclusive set of participants".
    c) "Recognising specific consortia specifications will not resolve any concrete problems of interoperability for public authorities; interoperability depends on stringing together a range of specifications (from formal global bodies or consortia alike)".
    d) "Consortia already have the option to have their specifications adopted by the international formal standards bodies and many more exercise this than the two that seem to be campaigning for European recognition. Such specifications can then also be adopted as European standards."
    e) "Consortium specifications completely lack any process to take due and balanced account of requirements at national level - this is not important for technologies but can be a critical issue when discussing cross-border issues within the EU such as eGovernment, eHealth and so on".
    f) "The proposed recognition will not lead to standstill on national or European activities, nor to the adoption of the specifications as national standards in the CEN and CENELEC members (usually in their official national languages), nor to withdrawal of conflicting national standards. A big asset of the European standardization system is its coherence and lack of fragmentation."
    g) "We always miss concrete and specific examples of where consortia referencing are supposed to be helpful."

    First of all, note that ETSI, the third ESO, did not join the position. The reason is, of course, that ETSI, beyond being an ESO, also has a global perspective and, moreover, does consider reality. Secondly, having produced arguments a) to g), CEN/CENELEC has the audacity to call a meeting on Friday 25 February entitled "ICT standardization - improving collaboration in Europe". This sounds very nice, but they have not set the stage for constructive debate. Rather, they demonstrate a striking lack of vision and lack of perspective. I will back this up by three facts, and leave it there.

    1. Since the 1980s, global industry fora and consortia, such as IETF, W3C and OASIS, have emerged as world-leading ICT standards development organizations with excellent procedures for openness and transparency in all phases of standards development, ex post and ex ante.
       - Practically no ICT system can be built without using fora and consortia standards (FCS).
       - Without using FCS, neither the Internet, upon which the EU economy depends, nor EU institutions would operate.
       - FCS are of high relevance for achieving and promoting interoperability and driving innovation.

    2. FCS are complementary to the formally recognized standards organizations including the ESOs.
       - No work will be taken away from the ESOs should the EU recognize certain FCS.
       - Each FCS would be evaluated on its merit and on the openness of the process that produced it. ESOs would, with other stakeholders, have a say.
       - ESOs could potentially educate and assist European stakeholders to engage more actively and constructively with FCS.
       - ETSI, also an ESO, seems to clearly recognize these facts.

    3. Europe and its Member States have a strong voice in several of the most relevant global industry fora and consortia.
       - W3C: W3C was founded in 1994 by an Englishman, Sir Tim Berners-Lee, in collaboration with CERN, the European research lab. In April 1995, INRIA (Institut National de Recherche en Informatique et Automatique) in France became the first European W3C host and in 2003, ERCIM (European Research Consortium in Informatics and Mathematics), also based in France, took over the role of European W3C host from INRIA. Today, W3C has 326 Members, 40% of which are European. Government participation is also strong, and it could be increased - a development that is very much desired by W3C. Current members of the W3C Advisory Board include Ora Lassila (Nokia) and Charles McCathie Nevile (Opera). Nokia is a Finnish company, Opera is a Norwegian company. SAP's Claus von Riegen is an alumnus of the same Advisory Board.
       - OASIS: its membership - 30% of which is European - represents the marketplace, reflecting a balance of providers, user companies, government agencies, and non-profit organizations. In particular, about 15% of OASIS members are governments or universities. Frederick Hirsch from Nokia, Claus von Riegen from SAP AG and Charles-H. Schulz from Ars Aperta are on the Board of Directors. Nokia is a Finnish company, SAP is a German company and Ars Aperta is a French company. The Chairman of the Board is Peter Brown, who is an Independent Consultant, an Austrian citizen AND an official of the European Parliament currently on long-term leave.
       - IETF: The oversight of its activities is by the Internet Architecture Board (IAB), since 2007 chaired by Olaf Kolkman, a Dutch national who lives in Uithoorn, NL. Kolkman is director of NLnet Labs, a foundation chartered to develop open source software and open source standards for the Internet. Other IAB members include Marcelo Bagnulo, whose affiliation is the University Carlos III of Madrid, Spain, as well as Hannes Tschofenig from Nokia Siemens Networks. Nokia is a Finnish company. Siemens is a German company. Nokia Siemens is a European joint venture.
       - Member States: At least 17 European Member States have developed Interoperability Frameworks that include FCS, according to the EU-funded National Interoperability Framework Observatory (see list and NIFO web site on IDABC). This also means they actively procure solutions using FCS, reference FCS in their policies and even in laws. Member State reps are free to engage in FCS, and many do. It would be nice if the EU adjusted to this reality.
       - A huge number of European nationals work in the global IT industry, on European soil or elsewhere, whether in EU registered companies or not.

    CEN/CENELEC lacks perspective and has engaged in an effort to twist facts that is quite striking from a publicly funded organization. I wish them all possible success with Friday's meeting, but I fear all of the most important stakeholders will not be at the table. Not because they do not wish to collaborate, but because they just have been insulted. If they do show up, it would be a gracious move, almost beyond comprehension. While I do not expect CEN/CENELEC to line up perfectly in favor of fora and consortia, I think it would be to their benefit to stick to more palatable observations. Actually, I would suggest an apology, straightening out the facts. This works among friends and it works in an organizational context. Then, we can all move on.

    Standardization is important. Too important to ignore. Too important to distort. The European economy depends on it. We need CEN/CENELEC. It is an important organization. But CEN/CENELEC needs fora and consortia, too.

    Read the article

  • JSF f:event preRenderView is triggered by f:ajax calls and partial renders; should I use something else?

    - by Andrew
    So we have an f:event: <f:metadata> <f:event type="preRenderView" listener="#{dashboardBacking.loadProjectListFromDB}"/> </f:metadata> Which is triggered as desired on initial page load (render). However this preRenderView event is also triggered by an ajax partial page render, which re-renders an h:panelgroup with the id projectListing, as below. <h:commandButton action="#{mrBean.addProject}" value="Create Project" title="Start a new project"> <f:ajax render="projectListing" /> </h:commandButton> I only want the dashboardBacking.loadProjectListFromDB to be called for the initial page render, but not when there is an ajax partial render. Is there a more appropriate event or method I could be using?
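
    One common guard, assuming the JSF 2.0 APIs, is to keep the preRenderView event but have the listener bail out on postbacks and ajax partial requests; a sketch (the bean skeleton is illustrative, only the guard inside the method is the point):

      // Sketch of a guarded listener in the backing bean; bean and method names
      // follow the excerpt, the FacesContext check is the suggestion.
      import javax.faces.context.FacesContext;

      public class DashboardBacking {

          public void loadProjectListFromDB() {
              FacesContext ctx = FacesContext.getCurrentInstance();
              // Skip reloads triggered by postbacks or ajax partial renders.
              if (ctx.isPostback() || ctx.getPartialViewContext().isAjaxRequest()) {
                  return;
              }
              // ... load the project list from the database for the initial GET render.
          }
      }

    On JSF 2.2 and later, an f:viewAction in the view metadata (which by default does not run on postbacks) would be another option.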

    Read the article

  • Where did System.Design go?

    - by Nilbert
    I am making a C# project in which I am using ScintillaNet, and it says: The referenced assembly "ScintillaNet" could not be resolved because it has a dependency on "System.Design, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which is not in the currently targeted framework ".NETFramework,Version=v4.0,Profile=Client". Please remove references to assemblies not in the targeted framework or consider retargeting your project. I tried adding a reference to System.Design, but it doesn't exist in my list. Do I need to download it somewhere? I have MS Visual Studio 10.
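
    System.Design is one of the assemblies excluded from the .NET Framework 4 Client Profile, which is what "Profile=Client" in the error refers to; a hedged fix is to retarget the project to the full .NET Framework 4, either via Project Properties > Application > Target framework or by removing the profile from the .csproj (sketch):

      <!-- .csproj sketch: drop the Client Profile so profile-excluded assemblies
           such as System.Design can be referenced. -->
      <PropertyGroup>
        <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
        <!-- remove: <TargetFrameworkProfile>Client</TargetFrameworkProfile> -->
      </PropertyGroup>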

    Read the article

  • Can't debug code using VS 2010 beta 2

    - by Nathan W
    This is really strange and I can't seem to figure out why it won't work. I have a C# DLL that is an add-on for another program. The main program is not mine and is not a .NET app, so I am starting it with "Start external program" on the Debug tab and passing my program on the command line. The program starts and loads my add-on, however the Visual Studio debugger doesn't step into my code and won't hit my breakpoints. I checked the Modules window and my DLL is not even loaded in there, but I used Process Explorer to have a look at the main program and my DLL was loaded into the main app. The project is set to Debug, symbols to full, and still nothing. I created the project in VS 2008 and it worked fine; now I am trying to get this to work in VS 2010 and it's a no-go. Does anyone know what could be causing this?

    Read the article

  • ILMerge causing DLLs to open during build

    - by Niall Collins
    I am using ILMerge as a post-build event to combine some DLLs into a single DLL. It is working and combining the DLLs, but I have this weird issue: as the project builds, the DLLs are opened (only external DLLs, not project DLLs)! And the build will only progress when I close the application that opens the DLL; in this case I have Reflector set as the default application for opening DLLs. The post-build event command I am using is: "..............\External\Tools\ILMerge\2.10.0\ILMerge" /out:"$(ProjectDir)$(OutDir)Combined.dll" "$(TargetPath)" "$(ProjectDir)$(OutDir)Core.dll" "$(ProjectDir)$(OutDir)Resolver.dll" "$(ProjectDir)$(OutDir)AjaxMin.dll" "$(ProjectDir)$(OutDir)Yahoo.Yui.Compressor.dll" "$(ProjectDir)$(OutDir)EcmaScript.NET.modified.dll" Has anyone else run into this?

    Read the article
