Search Results

Search found 18409 results on 737 pages for 'large projects'.

  • How do I make TODO comments show up in the task list for C++ projects in Visual Studio 2010?

    - by Chris Simmons
    I'm trying to get my TODO comments to show up in the task list in Visual Studio 2010 for a C++ project, but they don't. I looked at this, but see no caveats other than that the TODO comments need to be in the currently open file. For example, creating a new Win32 console app places this comment in the new file stdafx.h: "// TODO: reference additional headers your program requires here". However, there's nothing in the task list. I have "Comments" chosen from the drop-down in the task list, but it's always empty. And it's not this problem; I can open the file and be looking at the TODO comment in the code editor, and no task is shown. This is not a problem for C# projects, where TODO comments show up as designed; this appears to be an issue specific to C++ projects. What else can I check?

    Read the article

  • Can't find compiled resource bundles

    - by user351032
    I am using Adobe Flash Builder 4. I've run into this issue with my latest project, but I was able to re-create it with an almost empty project. Here is what I've done:
    1. Created a new Flex Project.
    2. Created a locale/en_US folder within this project.
    3. Added a class that extends SparkDownloadProgressBar. All this class does is attempt to create a Label.
    When I try to debug this application, I get the following error:
    Error: Could not find compiled resource bundle 'components' for locale 'en_US'.
    at mx.resources::ResourceManagerImpl/installCompiledResourceBundle()[E:\dev\4.0.0\frameworks\projects\framework\src\mx\resources\ResourceManagerImpl.as:340]
    at mx.resources::ResourceManagerImpl/installCompiledResourceBundles()[E:\dev\4.0.0\frameworks\projects\framework\src\mx\resources\ResourceManagerImpl.as:269]
    at mx.resources::ResourceManagerImpl/processInfo()[E:\dev\4.0.0\frameworks\projects\framework\src\mx\resources\ResourceManagerImpl.as:387]
    at mx.resources::ResourceManagerImpl()[E:\dev\4.0.0\frameworks\projects\framework\src\mx\resources\ResourceManagerImpl.as:122]
    at mx.resources::ResourceManager$/getInstance()[E:\dev\4.0.0\frameworks\projects\framework\src\mx\resources\ResourceManager.as:111]
    at mx.core::UIComponent()[E:\dev\4.0.0\frameworks\projects\framework\src\mx\core\UIComponent.as:3728]
    at spark.components.supportClasses::TextBase()[E:\dev\4.0.0\frameworks\projects\spark\src\spark\components\supportClasses\TextBase.as:154]
    at spark.components::Label()[E:\dev\4.0.0\frameworks\projects\spark\src\spark\components\Label.as:384]
    at Preloader()[C:\SVN\Games\Social\Test\src\Preloader.as:21]
    at mx.preloaders::Preloader/initialize()[E:\dev\4.0.0\frameworks\projects\framework\src\mx\preloaders\Preloader.as:253]
    at mx.managers::SystemManager/http://www.adobe.com/2006/flex/mx/internal::initialize()[E:\dev\4.0.0\frameworks\projects\framework\src\mx\managers\SystemManager.as:1925]
    at mx.managers::SystemManager/initHandler()[E:\dev\4.0.0\frameworks\projects\framework\src\mx\managers\SystemManager.as:2419]
    The Flex Compiler/Additional Compiler Arguments section does contain "-locale en_US", but I do not want to just remove this, as I am planning to have the application load different property files based on the localization region at run-time, and as I understand it, I will need to add each locale that I am planning to use to the compile argument line. I am at a loss as to how to attack this problem. If you need any more information from me to help with this, I will be more than happy to provide it. Thanks ahead of time for the help!

    Read the article

  • How to express inter project dependencies in Eclipse PDE

    - by Roland Tepp
    I am looking for the best practice for handling inter-project dependencies between mixed project types, where some of the projects are Eclipse plug-in/OSGi bundle projects (an RCP application) and others are plain old Java projects (web services modules). A few of the Eclipse plug-ins have dependencies on the Java projects.
    My problem is that, at least as far as I've looked, there is no way of cleanly expressing such a dependency in an Eclipse PDE environment. I can have plug-in projects depend on other plug-in projects (via Import-Package or Require-Bundle manifest headers), but not on the plain Java projects. I seem to be able to have a project declare a dependency on a jar from another project in the workspace, but these jar files do not get picked up by either the export or the launch configuration (although Java code editing sees the libraries just fine).
    The "Java projects" are used for building services to be deployed on a J2EE container (JBoss 4.2.2 for the moment) and in some cases produce multiple jars - one for deploying to the JBoss ear and another for use by client code (an RCP application).
    The way we've "solved" this problem for now is that we have two more external tools launcher configurations - one for building all the jars and another for copying these jars to the plug-in projects. This works (sort of), but the "whole build" and "copy jars" targets incur quite a large build step and bypass the whole Eclipse incremental build feature. By copying the jars instead of just referencing the projects, I am decoupling the dependency information and requesting quite a massive workspace refresh that eats up development time like it was candy.
    What I would like to have is a much more "natural" workspace setup that would manage dependencies between projects and request incremental rebuilds only as they are needed, let me use client code from the service libraries in the RCP application plug-ins, and let me launch the RCP application with all the necessary classes where they are needed. So can I have my cake and eat it too? ;)
    NOTE: To be clear, this is not so much about dependency management and module management at the moment as it is about Eclipse PDE configuration. I am well aware of products like Maven, Ivy and Buckminster, and they solve a quite different problem (once I've solved the workspace configuration issue, these products can actually come in handy for materializing the workspace and building the product).

    Read the article

  • TFS Solution build cascading to several other builds even when common components were not modified

    - by Bob Palmer
    Hey all, here is the issue I am currently trying to work through. We are using Team Foundation Server 2008 and the automated build support it provides out of the box. We have one very large project that encompasses a number of interrelated components and web sites, each of which is set up as a Visual Studio solution file. Many of these solutions are highly interrelated, since they may contain applications, or contain common libraries or shared components. We have roughly 20 or so applications, three large web sites, and about 20 components.
    Each solution may include projects from other solutions. For example, a solution for a console app would also include the project files for all of the components it utilizes, since we need to ensure that when someone changes a component and rebuilds it, the change is reflected in all of the projects that consume that component, and we can make sure nothing was broken.
    We have build projects for each solution, whether that's an application, component, or web site. For this example, we will call them solutions 01, 02, and 03. These reference multiple projects (both their own core project and test projects, plus the projects relating to various components). Solution 01 has projects A, B, and C. Solution 02 has projects C, D, and E. Solution 03 has projects E, F, and G.
    Now, for the problem. If I modify project A, the system will end up rebuilding all three solutions. Worse, all thirty solutions reference common projects used for data access (let's call it project H). Because they all share one project in common, if I modify any solution in my stack, even if it does not touch project H, I still end up kicking off every single build script.
    Any thoughts on how to address this? Ideally I would only want to kick off builds whose constituent projects were directly modified - i.e. in the example above, if I modified project C, I would only rebuild solutions 01 and 02. Thanks!

    Read the article

  • Any Recommendations for a Web Based Large File Transfer System?

    - by Glen Richards
    I'm looking for a server software product that:
    - Allows my users to share large files with:
      - the general public
      - securely, with 1 or more people (notification via email, optionally with a token that gives them x period of time to download)
    - Allows anyone in the general public to share files with my users, perhaps by invitation.
    - Is user friendly enough that my users can use it without having to bug me as the admin.
    - Is a system that we can install on our own server (we don't want shared data sitting on anyone else's server).
    - Is a web based solution. Using some kind of secure comms channel would be good too, e.g. ssh.
    - Handles files over 1 GB.
    I found the question below. WebDAV does not sound user friendly enough: http://serverfault.com/questions/86878/recommendations-for-a-secure-and-simple-dropbox-system
    I've done a lot of searching, but I can't get the search terms right. There are too many services that provide this, but I want something we can install on our own server. A last resort would be to roll my own. Any ideas appreciated. Glen
    EDIT: Sorry Tom and Jeff, but Glen specifically says that he's looking for a 'product', so given that I specialise in this field I thought that my expertise in this area may have been of use to him. I don't see how him writing services is going to be easy for him to maintain going forward (large IT admin overhead) or simple for his users and the general public to work with.

    Read the article

  • What is a good client for handling large amounts of mail?

    - by ldigas
    Although the title sums it up nicely, I'll repeat and explain. What would be a good email client for handling large amounts of mail? A large portion of the mail I receive comes with attachments (zip, rar, pdf, dwg, etc.), and within a month I usually have another 1.5-2 GB of new mail. I've noticed that the 'standard' Outlook Express (with whose interface I've been very happy) gets awfully slow after a while. Archiving helps, but not much. Then I usually take the files, move them onto a DVD, delete all messages I can do without, and start anew. The thing is, I would love to have them all in the email client, since I often go back to old mails (slow projects). So, what would be good alternatives? If it is portable, that would also be nice, but I can also live without that.
    post scriptum: I love @gmail, but cannot use it for work. I know I could theoretically forward all of it there, and back, but that approach doesn't make my boss very happy (email handling policies and similar).

    Read the article

  • Is it possible to download extremely large files intelligently or in parts via SSH from Linux to Windows?

    - by Andrew
    I have a ~35 GB file on a remote Linux Ubuntu server. Locally, I am running Windows XP, so I am connecting to the remote Linux server using SSH (specifically, I am using a Windows program called SSH Secure Shell Client, version 3.3.2).
    Although my broadband internet connection is quite good, my download of the large file often fails with a Connection Lost error message. I am not sure, but I think that it fails because my internet connection goes out for a second or two every several hours. Since the file is so large, downloading it may take 4.5 to 5 hours, and perhaps the internet connection goes out for a second or two during that long time. I think this because I have successfully downloaded files of this size using the same internet connection and the same SSH software on the same computer. In other words, sometimes I get lucky and the download finishes before the internet connection drops for a second.
    Is there any way that I can download the file in an intelligent way -- whereby the operating system or software "knows" where it left off and can resume from the last point if a break in the internet connection occurs? Perhaps it is possible to download the file in sections, although I do not know whether I can conveniently split my file into multiple files -- I think this would be very difficult, since the file is binary and not human-readable.
    As it is now, if the entire ~35 GB download doesn't finish before the break in the connection, then I have to start the download over and overwrite the ~5-20 GB chunk that was downloaded locally so far. Do you have any advice? Thanks.
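    The usual out-of-the-box answer here is a tool with built-in resume support (for example rsync with its --partial flag), but the underlying idea is simple enough to sketch. Below is a minimal, hedged example in Python using the paramiko SFTP library: it checks how many bytes of the file already exist locally, seeks the remote file to that offset, and appends the rest. The host name, credentials and paths are placeholders, not values from the question.

    ```python
    # Resumable SFTP download sketch (assumes the third-party paramiko package).
    # Host, credentials and file paths below are placeholders.
    import os
    import paramiko

    HOST, USER, PASSWORD = "remote.example.com", "andrew", "secret"
    REMOTE_PATH = "/data/huge-file.bin"
    LOCAL_PATH = "huge-file.bin"
    CHUNK = 1024 * 1024  # read 1 MiB at a time

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(HOST, username=USER, password=PASSWORD)
    sftp = client.open_sftp()

    remote_size = sftp.stat(REMOTE_PATH).st_size
    local_size = os.path.getsize(LOCAL_PATH) if os.path.exists(LOCAL_PATH) else 0

    remote = sftp.open(REMOTE_PATH, "rb")
    remote.seek(local_size)                    # skip the bytes we already have
    done = local_size
    with open(LOCAL_PATH, "ab") as local:      # append to the partial local copy
        while done < remote_size:
            data = remote.read(CHUNK)
            if not data:
                break
            local.write(data)
            done += len(data)

    remote.close()
    sftp.close()
    client.close()
    print(f"have {done} of {remote_size} bytes")
    ```

    If the connection drops mid-transfer, re-running the script resumes from the size of the partial local file; wrapping the loop in a retry would automate that.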

    Read the article

  • Help to bypass the password for sample Excel projects

    - by Munna
    Hello friends. I am doing a project in Excel VBA, and I am taking reference from some sample projects made in Excel VBA. But when I tried to open those sample Excel projects, they asked me for a password. How can I bypass the password for those Excel projects so that I can use the samples as a reference? Please help.

    Read the article

  • How to download large files when the download size is restricted?

    - by Rahul
    In my office, the network admin has restricted downloads to a maximum size of 1.8 MB for any file. This applies to subordinates' accounts only; there are no restrictions on my manager's PC. Is there any way to download files on my PC by using my manager's IP address? I tried using his IP on my PC, but had the same problem.
    Earlier, I was given access to our Linux server from my PC using PuTTY. I used to download large files onto the server and then transfer them from the server to my machine using FireFTP, and that transfer worked perfectly fine. But now I don't have any access to the server. Is there a way I can download large files using FireFTP from my own PC?
    I'm using a Windows XP machine. Please suggest a solution by any possible combination. Thanks.

    Read the article

  • How to build many projects with the same svn revision number with hudson?

    - by tangens
    I'm just starting with Hudson and I'd like to build my projects the way my handmade solution did before:
    1. Retrieve the current svn revision number, rev.
    2. Build all projects (each with an individual result) with the revision number rev.
    3. Start again with step 1, regardless of whether there were any changes in the meantime (to detect nondeterministic errors that don't occur on every test run).
    How can I configure this with Hudson?
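    Independent of any particular Hudson plugin, the core of what is being asked for is: resolve the head revision once, then check out and build every project at exactly that revision, and loop. A rough sketch of that idea as a standalone Python script is below; the repository URL, project list and build command are hypothetical placeholders, not part of the original question.

    ```python
    # Sketch: pin a single SVN revision across several project builds.
    # Repository URL, project names and the build command are hypothetical.
    import re
    import subprocess

    REPO = "https://svn.example.com/repo"
    PROJECTS = ["projectA", "projectB", "projectC"]

    def head_revision(url):
        """Read the 'Revision:' line from `svn info` for the repository root."""
        info = subprocess.run(["svn", "info", url], check=True,
                              capture_output=True, text=True).stdout
        return re.search(r"^Revision:\s*(\d+)", info, re.MULTILINE).group(1)

    def build_all():
        rev = head_revision(REPO)              # step 1: one revision for the whole run
        for project in PROJECTS:               # step 2: every project built at rev
            subprocess.run(["svn", "checkout", "-r", rev,
                            f"{REPO}/{project}/trunk", project], check=True)
            subprocess.run(["ant", f"-Dsvn.revision={rev}", "build"],
                           cwd=project, check=True)

    if __name__ == "__main__":
        while True:                            # step 3: repeat even without changes
            build_all()
    ```

    One way to map this onto Hudson is a single job whose build script does exactly this, so the revision pinning stays in your own hands rather than in per-project SCM polling.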

    Read the article

  • How to read a large number of rows efficiently using Zend_Db?

    - by Alex N.
    Is there a simple :) and efficient way of reading a very large number of rows sequentially using Zend_Db? Basically I need to process an entire table, row by row. The table is large, and the primary key sequence is not guaranteed (i.e. it is not an autoincrement, but it is an UNSIGNED INT). What's the best way to approach this? Environment: PHP 5.2, Zend Framework 1.10, MySQL 5.1
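    Zend_Db itself is PHP, but the pattern that keeps memory flat here is language-agnostic: page through the table in primary-key order with a `WHERE id > last_seen ORDER BY id LIMIT n` query per batch (keyset pagination), which works even when the key values are scattered, as long as they are unique. Below is a minimal sketch of that pattern in Python, with SQLite standing in for MySQL so it runs self-contained; the table and column names are made up.

    ```python
    # Keyset pagination sketch: process a big table in fixed-size batches,
    # ordered by an integer primary key whose values are not contiguous.
    # SQLite stands in for MySQL; the table and column names are invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, payload TEXT)")
    conn.executemany("INSERT INTO items (id, payload) VALUES (?, ?)",
                     [((i * 7) % 1009, f"row-{i}") for i in range(1, 501)])

    BATCH = 100
    last_id = 0
    processed = 0
    while True:
        rows = conn.execute(
            "SELECT id, payload FROM items WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, BATCH)).fetchall()
        if not rows:
            break
        for row_id, payload in rows:
            processed += 1                # real per-row work goes here
        last_id = rows[-1][0]             # resume after the last key seen
    print(f"processed {processed} rows")
    ```

    In Zend_Db the same per-batch query can be issued through the adapter (e.g. $db->fetchAll($sql, array($lastId, $batch))), carrying the last id forward, rather than pulling the whole table into memory at once.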

    Read the article

  • How do I create statistics to make ‘small’ objects appear ‘large’ to the Optimizer?

    - by Maria Colgan
    I recently spoke with a customer who has a development environment that is a tiny fraction of the size of their production environment. His team has been tasked with identifying problem SQL statements in this development environment before new code is released into production. The problem is that the objects in the development environment are so small, the execution plans selected in the development environment rarely reflect what actually happens in production.
    To ensure the development environment accurately reflects production, in the eyes of the Optimizer, the statistics used in the development environment must be the same as the statistics used in production. This can be achieved by exporting the statistics from production and importing them into the development environment. Even though the underlying objects are a fraction of the size of production, the Optimizer will see them as the same size and treat them the same way as it would in production. Below are the necessary steps to achieve this in their environment. I am using the SH sample schema as the application schema whose statistics we want to move from production to development.
    Step 1. Create a staging table, in the production environment, where the statistics can be stored.
    Step 2. Export the statistics for the application schema, from the data dictionary in production, into the staging table.
    Step 3. Create an Oracle directory on the production system where the export of the staging table will reside and grant the SH user the necessary privileges on it.
    Step 4. Export the staging table from production using data pump export.
    Step 5. Copy the dump file containing the staging table from production to development.
    Step 6. Create an Oracle directory on the development system where the export of the staging table resides and grant the SH user the necessary privileges on it.
    Step 7. Import the staging table into the development environment using data pump import.
    Step 8. Import the statistics from the staging table into the dictionary in the development environment.
    You can get a copy of the script I used to generate this post here. +Maria Colgan
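    For readers who want to script steps 1, 2 and 8 rather than run them interactively, a rough sketch using Python's oracledb driver is shown below. It is an illustration under assumptions, not the author's script: the connection strings, passwords and staging table name are invented placeholders, and the data pump transfer (steps 4 to 7) still happens with expdp/impdp and a file copy between the hosts.

    ```python
    # Hedged sketch of the DBMS_STATS part of the workflow (steps 1, 2 and 8).
    # Connection details, passwords and the staging table name are placeholders.
    import oracledb

    SCHEMA = "SH"
    STAGING_TABLE = "STATS_STAGE"

    # Steps 1 and 2, on production: create the staging table and export the
    # schema statistics from the data dictionary into it.
    with oracledb.connect(user="sh", password="***", dsn="prod-host/PROD") as prod:
        cur = prod.cursor()
        cur.callproc("DBMS_STATS.CREATE_STAT_TABLE", [SCHEMA, STAGING_TABLE])
        cur.callproc("DBMS_STATS.EXPORT_SCHEMA_STATS", [SCHEMA, STAGING_TABLE])

    # Steps 3-7 (not shown): create the Oracle directories, export the staging
    # table with data pump on production, copy the dump file to development,
    # and import it there with data pump.

    # Step 8, on development: load the exported statistics into the dictionary.
    with oracledb.connect(user="sh", password="***", dsn="dev-host/DEV") as dev:
        cur = dev.cursor()
        cur.callproc("DBMS_STATS.IMPORT_SCHEMA_STATS", [SCHEMA, STAGING_TABLE])
    ```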

    Read the article

  • How to produce assets effectively on large Flash game projects?

    - by Antoine Lassauzay
    I have been working on Flash games professionally for two years now, and somehow, having our artists produce assets the right way is one of our biggest challenges. More precisely, it is very hard to have them follow any kind of structure and/or standards, or take performance into consideration. I would also say that most of our issues concern UI and related animations.
    Our current workflow (on a Facebook hidden object game) is:
    1. Artists produce PSDs and animate prototypes in Flash.
    2. Artists re-organize their FLA files to be a bit more "programmer friendly".
    3. Programmers retouch assets until they have the right structure and export classes inside a SWC, from Flash.
    4. Programmers try to improve performance, sometimes degrading the quality of the game graphics.
    Our main idea is to hire somebody dedicated to preparing assets for programmers, but I am really looking forward to improving the pipeline. I was wondering if you guys have tips of any kind to improve this workflow, whether it be team organization, training, tools or tips with Flash. Any explanation of your asset pipeline is well appreciated too.

    Read the article

  • Why do some large companies use a different domain for sending emails?

    - by Andrei Rinea
    I've received notifications and newsletters from Microsoft and Facebook in the past and noticed that neither email came from an address such as [email protected] or [email protected]. Not even [email protected]; both came from different domains, such as [email protected] and [email protected]. Why is this? Is there any particular advantage in doing so? Other than not polluting the employees' email software, I can't see one.

    Read the article

  • Should I group all of my .js files into one large bundle?

    - by Scottie
    One of the difficulties I'm running into with my current project is that the previous developer spaghetti'd the JavaScript code across lots of different files. We have modal dialogs that are reused in different places, and I find that the same .js file is often loaded twice. My thinking is that I'd like to just load all of the .js files in _Layout.cshtml; that way I know each is loaded once and only once. The client should also only have to download this file once. It should be cached and therefore shouldn't really be a performance hit, except for the first page load. I should probably note that I am using ASP.NET bundling as well and loading most of the jQuery/Bootstrap/etc. from CDNs. Is there anything else that I'm not thinking of that would cause problems here? Should I bundle everything into a single file?

    Read the article
