Search Results

Search found 88179 results on 3528 pages for 'project file'.


  • Project 2003 SP2 failing to install SP3

    - by Unsliced
    I have Microsoft Project Professional 2003 installed (11.2.2005.1801.15, SP2). I have been trying to open an MPP file created in a newer version, so I need the converter, which is part of SP3. But when I try to install the SP3 package (as downloaded from Microsoft's site) I get an error message box titled "Project 2003 Service Pack 3 (SP3)" that says: "The expected version of the product was not found on your system." Project (and Office) are licensed and otherwise work correctly. Any advice?

    Read the article

  • Project not publishing start/finish date for custom TFS work items

    - by pete the pagan-gerbil
    I've got a TFS 2012 project set up with custom work items, that include Start and Finish date read-only fields (Microsoft.VSTS.Scheduling.StartDate and FinishDate). When I publish one of those custom work items from within Office Project, it does not populate those fields the same way as when I publish a Task work item (builtin TFS work item). I've looked at the transitions in the work items, and also the TFS project field mapping XML file but can't find anything that explains the difference in behaviour. What am I missing?

    Read the article

  • Android: writing a file to sdcard

    - by Sumit M Asok
    I'm trying to write a file from an HTTP POST reply to a file on the sdcard. Everything works fine until the byte array of data is retrieved. I've tried setting the WRITE_EXTERNAL_STORAGE permission in the manifest and tried many different combinations from tutorials I found on the net. All I could find was using the activity's openFileOutput("", MODE_WORLD_READABLE) method, but my app writes the file from a thread — specifically, a thread invoked from another thread when a file has to be written — so passing an activity object didn't work even though I tried it. The app has come a long way and I cannot change how it is currently written. Can someone please help me? Code:

        File file = new File(bgdmanip.savLocation);   // bgdmanip.savLocation holds the whole file path
        FileOutputStream filecon = null;
        filecon = new FileOutputStream(file);
        byte[] myByte;
        myByte = Base64Coder.decode(seReply);          // seReply is the string reply from the HttpPost response
        Log.d("myBytes", String.valueOf(myByte));
        bos.write(myByte);
        filecon.write(myByte);
        myvals = x * 11024;

    The second block of code is looped with reference to x. The file is created but remains 0 bytes.

    Read the article

  • What is the best way to track / record the current programming project you work on? [duplicate]

    - by user2424160
    This question already has answers here: "Methodology for Documenting Existing Code Base" (6 answers), "When do you start documenting the code?" (13 answers), "Where should a programmer explain the extended logic behind the code?" (5 answers).
    I have been wondering about this problem for a long time and I want to know how it's done in real, large company projects. Suppose I have a project to build a website. I divide the project into subtasks and work through them. Now suppose I have task1 in hand, like exporting a page to PDF. I spend 3 days on it, run into various problems and many Stack Overflow questions, and in the end I solve it. Four months later someone tells me there is an error in the code. By then I have forgotten most (60%) of how I did it and why I did it that way. I document the code, but I can't write the whole story in the code, so I have to spend a lot of time digging through it to find out what the problem was and why I added a particular line. I want to know whether there is any way to log the steps taken while completing the project, so that I can see how I ended up with the code, what errors I got, what questions I asked on SO, and so on. How do people do this in practice? Which software do they use? I know our project management software, JIRA, has tasks, but that does not cover the steps I took to solve those tasks. What is the best way to make sure that when I look back at my 2-year-old project, I know how I solved a particular task?

    Read the article

  • Start a new project. Where is the inspiration?

    - by Kakawait
    The question may seem strange at first glance, and I may be alone in this situation. I currently want to improve my skills by learning new frameworks; however, writing a simple "Hello World" application does not satisfy me. I think a practical project is required to grasp the basics of a framework. I know there are many textbook-style projects, like designing a library manager or a CMS, but I have recently planned to learn about three different frameworks and I can't find any motivation in "unexciting" projects. As a result I have not started on any framework at all. I'm aware I'm not going to develop the project of the year; I'm just looking for something a bit exciting. I don't expect specific project ideas — the stimulation a project provides is subjective anyway — but I want to know what inspires you (or your method for learning a framework). I'm talking about web development frameworks, but I'm open to any kind of information. Kind regards.

    Read the article

  • PHP file_put_contents File Locking

    - by hozza
    The scenario: you have a file with a string (roughly a sentence's worth) on each line. For argument's sake, let's say this file is 1 MB in size (thousands of lines). You have a script that reads the file, changes some of the strings within the document (not just appending but also removing and modifying some lines), and then overwrites all the data with the new data.
    The questions: does 'the server' (PHP, the OS, httpd, etc.) already have systems in place to prevent issues like this (reading or writing halfway through another write)? (i) If it does, please explain how it works and give examples or links to relevant documentation. (ii) If not, are there things I can enable or set up, such as locking the file until a write is completed and making all other reads and/or writes fail until the previous script has finished writing?
    My assumptions and other information: the server in question is running PHP with Apache or lighttpd. If the script is called by one user and is halfway through writing to the file when another user reads the file at that exact moment, the reader will not get the full document, as it hasn't been fully written yet. (If this assumption is wrong, please correct me.) I'm only concerned with PHP reading and writing a text file, in particular the functions fopen/fwrite and mainly file_put_contents. I have looked at the file_put_contents documentation but have not found much detail or a good explanation of what the LOCK_EX flag does. The scenario is an example of a worst case where I assume these issues are more likely to occur, due to the size of the file and the way the data is edited. I want to learn more about these issues; I don't need answers like "use MySQL" or "why are you doing that", because I just want to learn about file reading/writing with PHP — and yes, I understand PHP is not the perfect language for working with files in this way.
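
    For illustration, here is a minimal sketch of the same read-modify-write pattern guarded by an exclusive advisory lock — conceptually the same exclusive lock that PHP's LOCK_EX flag requests. It is written in Python rather than PHP purely for brevity, assumes a POSIX system, and the file name is hypothetical:

        import fcntl

        def rewrite_lines(path, transform):
            # Open read+write without truncating, so the lock is held for
            # the whole read-modify-write cycle.
            with open(path, "r+") as f:
                fcntl.flock(f, fcntl.LOCK_EX)   # block until we own the exclusive lock
                try:
                    lines = [transform(line) for line in f.readlines()]
                    f.seek(0)
                    f.writelines(lines)
                    f.truncate()                # the new content may be shorter
                finally:
                    fcntl.flock(f, fcntl.LOCK_UN)

        # Hypothetical usage: upper-case every line of the 1 MB file.
        rewrite_lines("data.txt", str.upper)

    Note that advisory locks only help when every participant takes them: a reader that skips the lock can still observe a half-written file.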

    Read the article

  • Multiple file descriptors to the same file, C

    - by Gigi
    I have a multithreaded application that is opening and reading the same file (not writing). I am opening a different file descriptor for each thread (but they all point to the same file). Each thread then reads the file and may close it and open it again if EOF is reached. Is this OK? If I perform fclose() on a file descriptor, does it affect the other file descriptors that point to the same file?
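
    As a quick illustration of why separate handles are independent (sketched in Python rather than C, with a made-up file name): each open gets its own descriptor and its own offset, and closing one handle does not invalidate the others.

        import threading

        PATH = "shared.txt"                      # hypothetical shared input file

        with open(PATH, "wb") as f:              # create something to read
            f.write(b"some shared data\n" * 100)

        def worker(n):
            # Each thread opens its own handle: its own descriptor, its own offset.
            with open(PATH, "rb") as f:
                data = f.read()
            # Closing this handle (leaving the 'with' block) has no effect on
            # the handles still open in the other threads.
            print("thread %d read %d bytes" % (n, len(data)))

        threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()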

    Read the article

  • How to create a new Team Project Collection in TFS2010:

    - by jehan
    TFS 2010 has introduced the notion of a Team Project Collection (TPC). I have already written about TPCs in an earlier post, which you can check out here. In this post, I will demonstrate how to create a new Team Project Collection in TFS 2010. First, open the Team Foundation Server Administration Console (Start → All Programs → Microsoft Team Foundation Server 2010 → Team Foundation Server Administration Console), expand the Application Tier node and click on Team Project Collections. Here you will see the TPCs that already exist; I have only one TPC, named New Collection, and I am going to create a new TPC called Demo Collection. To create a new Team Project Collection, click on Create Collection; this opens the Create New Team Project Collection wizard. Under the Name tab, enter the name you want to give your new TPC (I am naming it Demo Collection). You can also provide a description for your TPC in the Description tab, which is optional, and then click Next. Next, enter the name of the SQL Server instance where you want your new TPC's data to reside. You have the option either to create a new database for this TPC or to use an existing empty database, then click Next. On the next screen, you choose the SharePoint configuration. Here you can configure a SharePoint site for the TPC at the default location, specify an existing SharePoint site, or choose not to configure SharePoint for this collection at all; if you choose the last option, you cannot configure SharePoint sites for any of the team projects under this project collection. You still have the flexibility to create a SharePoint site for this TPC later on, but you will then have to configure SharePoint sites for any existing team projects manually. The next screen covers the Reports configuration. Here you can configure reports for the TPC at the default path, specify an existing reports folder, or choose not to configure reporting for this collection; as with SharePoint, choosing the last option means you cannot create reports for the team projects under this project collection, although reporting can also be enabled for the TPC later on. The next screen is the Lab Management configuration. Lab Management is a new feature in TFS 2010 that lets users create and manage virtual test environments where they can deploy and test their applications. There are no options available here because I don't have Lab Management configured on my Team Foundation Server. The next screen is the Review Configuration window, which shows all the configuration settings you have specified so that you can review them before creating the Team Project Collection. If you want to change anything, you can go back to the previous windows and make the changes. After reviewing the configuration settings, click the Verify button, which checks whether your Team Project Collection is ready to be created and reports any errors and warnings that could make the creation fail. You can then choose to create the Team Project Collection if the verify step doesn't throw any warnings or errors.
    If the verify step does throw errors, it is strongly suggested that you rectify the issues first and only then go ahead with the TPC creation; this applies especially to warnings, as it is common practice to overlook them. If you choose the Create option, the process of creating the Team Project Collection starts, and once it completes you can check the configuration status of the different components. You can see in the screen below that all the components were configured successfully. On the next screen, you can find the location of the log file created for this Team Project Collection creation; this log file is really important if the creation fails, because it helps you find the root cause of the failure. Now you can see that the newly created Team Project Collection (Demo Collection) is available in the Team Project Collections tab and its status is Online. You can now try to connect to this Team Project Collection from Team Explorer: choose the newly created Team Project Collection and click Connect. This Team Project Collection is empty because no team projects have been created yet. Now you can create new team projects and start working.

    Read the article

  • File system implementation in MongoDB with GridFS

    - by Ralph
    I am working on two projects that will both implement a Webdav server backed by a MongoDB GridFS. In each case, there is the potential for the system to store tens of millions of files spread across thousands of hierarchical directories. I can come up with two different ways of storing the directory structure: As a "true" hierarchical file system, with directories containing the IDs (_id) of subdirectories and regular files. The paths will be separated by slashes (/) as in a POSIX-compliant file system. The path /a/b/c will be represented as a directory a containing a directory b containing a file c. As a flat file system, where file names include the slashes. The path /a/b/c will be stored as a single file with the name /a/b/c What are the advantages and disadvantages of each, with respect to a "real" folder-based file system?
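
    As a rough sketch of the second (flat) layout on top of GridFS — written with pymongo, with made-up database and path names — each file is stored once under its full slash-separated path, and a "directory listing" becomes a prefix query against the filename metadata:

        import re
        import gridfs
        from pymongo import MongoClient

        db = MongoClient()["webdav"]             # hypothetical database name
        fs = gridfs.GridFS(db)

        # Flat layout: the full POSIX-style path is the GridFS filename.
        fs.put(b"file contents", filename="/a/b/c")

        # "List the directory /a/b": direct children are filenames that match
        # the prefix and contain no further slash.
        children = db["fs.files"].distinct(
            "filename", {"filename": re.compile(r"^/a/b/[^/]+$")})
        print(children)

    The trade-off this makes visible: the flat layout keeps lookups to a single indexed query, but renaming or moving a directory means rewriting the filename of every descendant, whereas the hierarchical layout turns that into a single parent-document update at the cost of one lookup per path component.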

    Read the article

  • How to install PHP-FPM and PHP on Ubuntu?

    - by Sanoj
    I have problems with installing PHP and in Ubuntu. I followed the instructions on the PHP-FPM site, PHP FastCGI Process Manager but when doing ../configure && make to compile PHP I got a lot of not found messages (listed below), and I don't know how to fix them. I tried both the Integrated compilation and Separate compilation but both compilations ends up with the same messages. Is there a solution or workaround? An alternativ way to install PHP with PHP-FPM? ../configure: 11986: ac_fn_c_check_func: not found ../configure: 11997: ac_fn_c_check_func: not found ../configure: 12147: 5: Bad file descriptor ../configure: 12147: :: checking for socket in -lsocket: not found ../configure: 12147: 6: Bad file descriptor ../configure: 12147: checking for socket in -lsocket... : not found cat: confdefs.h: No such file or directory ../configure: 12147: ac_fn_c_try_link: not found ../configure: 12147: 5: Bad file descriptor ../configure: 12147: :: result: no: not found ../configure: 12147: 6: Bad file descriptor ../configure: 12147: no: not found ../configure: 12147: 5: Bad file descriptor ../configure: 12147: :: checking for __socket in -lsocket: not found ../configure: 12147: 6: Bad file descriptor ../configure: 12147: checking for __socket in -lsocket... : not found cat: confdefs.h: No such file or directory ../configure: 12147: ac_fn_c_try_link: not found ../configure: 12147: 5: Bad file descriptor ../configure: 12147: :: result: no: not found ../configure: 12147: 6: Bad file descriptor ../configure: 12147: no: not found ../configure: 12154: ac_fn_c_check_func: not found ../configure: 12165: ac_fn_c_check_func: not found ../configure: 12315: 5: Bad file descriptor ../configure: 12315: :: checking for socketpair in -lsocket: not found ../configure: 12315: 6: Bad file descriptor ../configure: 12315: checking for socketpair in -lsocket... : not found cat: confdefs.h: No such file or directory ../configure: 12315: ac_fn_c_try_link: not found ../configure: 12315: 5: Bad file descriptor ../configure: 12315: :: result: no: not found ../configure: 12315: 6: Bad file descriptor ../configure: 12315: no: not found ../configure: 12315: 5: Bad file descriptor ../configure: 12315: :: checking for __socketpair in -lsocket: not found ../configure: 12315: 6: Bad file descriptor ../configure: 12315: checking for __socketpair in -lsocket... : not found cat: confdefs.h: No such file or directory ../configure: 12315: ac_fn_c_try_link: not found ../configure: 12315: 5: Bad file descriptor ../configure: 12315: :: result: no: not found ../configure: 12315: 6: Bad file descriptor ../configure: 12315: no: not found ../configure: 12322: ac_fn_c_check_func: not found ../configure: 12333: ac_fn_c_check_func: not found ../configure: 12483: 5: Bad file descriptor ../configure: 12483: :: checking for htonl in -lsocket: not found ../configure: 12483: 6: Bad file descriptor ../configure: 12483: checking for htonl in -lsocket... : not found cat: confdefs.h: No such file or directory ../configure: 12483: ac_fn_c_try_link: not found ../configure: 12483: 5: Bad file descriptor ../configure: 12483: :: result: no: not found ../configure: 12483: 6: Bad file descriptor ../configure: 12483: no: not found ../configure: 12483: 5: Bad file descriptor ../configure: 12483: :: checking for __htonl in -lsocket: not found ../configure: 12483: 6: Bad file descriptor ../configure: 12483: checking for __htonl in -lsocket... 
: not found cat: confdefs.h: No such file or directory ../configure: 12483: ac_fn_c_try_link: not found ../configure: 12483: 5: Bad file descriptor ../configure: 12483: :: result: no: not found ../configure: 12483: 6: Bad file descriptor ../configure: 12483: no: not found ../configure: 12490: ac_fn_c_check_func: not found ../configure: 12501: ac_fn_c_check_func: not found ../configure: 12651: 5: Bad file descriptor ../configure: 12651: :: checking for gethostname in -lnsl: not found ../configure: 12651: 6: Bad file descriptor ../configure: 12651: checking for gethostname in -lnsl... : not found cat: confdefs.h: No such file or directory ../configure: 12651: ac_fn_c_try_link: not found ../configure: 12651: 5: Bad file descriptor ../configure: 12651: :: result: no: not found ../configure: 12651: 6: Bad file descriptor ../configure: 12651: no: not found ../configure: 12651: 5: Bad file descriptor ../configure: 12651: :: checking for __gethostname in -lnsl: not found ../configure: 12651: 6: Bad file descriptor ../configure: 12651: checking for __gethostname in -lnsl... : not found cat: confdefs.h: No such file or directory ../configure: 12651: ac_fn_c_try_link: not found ../configure: 12651: 5: Bad file descriptor ../configure: 12651: :: result: no: not found ../configure: 12651: 6: Bad file descriptor ../configure: 12651: no: not found ../configure: 12658: ac_fn_c_check_func: not found ../configure: 12669: ac_fn_c_check_func: not found ../configure: 12819: 5: Bad file descriptor ../configure: 12819: :: checking for gethostbyaddr in -lnsl: not found ../configure: 12819: 6: Bad file descriptor ../configure: 12819: checking for gethostbyaddr in -lnsl... : not found cat: confdefs.h: No such file or directory ../configure: 12819: ac_fn_c_try_link: not found ../configure: 12819: 5: Bad file descriptor ../configure: 12819: :: result: no: not found ../configure: 12819: 6: Bad file descriptor ../configure: 12819: no: not found ../configure: 12819: 5: Bad file descriptor ../configure: 12819: :: checking for __gethostbyaddr in -lnsl: not found ../configure: 12819: 6: Bad file descriptor ../configure: 12819: checking for __gethostbyaddr in -lnsl... : not found cat: confdefs.h: No such file or directory ../configure: 12819: ac_fn_c_try_link: not found ../configure: 12819: 5: Bad file descriptor ../configure: 12819: :: result: no: not found ../configure: 12819: 6: Bad file descriptor ../configure: 12819: no: not found ../configure: 12826: ac_fn_c_check_func: not found ../configure: 12837: ac_fn_c_check_func: not found ../configure: 12987: 5: Bad file descriptor ../configure: 12987: :: checking for yp_get_default_domain in -lnsl: not found ../configure: 12987: 6: Bad file descriptor ../configure: 12987: checking for yp_get_default_domain in -lnsl... : not found cat: confdefs.h: No such file or directory ../configure: 12987: ac_fn_c_try_link: not found ../configure: 12987: 5: Bad file descriptor ../configure: 12987: :: result: no: not found ../configure: 12987: 6: Bad file descriptor ../configure: 12987: no: not found ../configure: 12987: 5: Bad file descriptor ../configure: 12987: :: checking for __yp_get_default_domain in -lnsl: not found ../configure: 12987: 6: Bad file descriptor ../configure: 12987: checking for __yp_get_default_domain in -lnsl... 
: not found cat: confdefs.h: No such file or directory ../configure: 12987: ac_fn_c_try_link: not found ../configure: 12987: 5: Bad file descriptor ../configure: 12987: :: result: no: not found ../configure: 12987: 6: Bad file descriptor ../configure: 12987: no: not found ../configure: 12995: ac_fn_c_check_func: not found ../configure: 13006: ac_fn_c_check_func: not found ../configure: 13156: 5: Bad file descriptor ../configure: 13156: :: checking for dlopen in -ldl: not found ../configure: 13156: 6: Bad file descriptor ../configure: 13156: checking for dlopen in -ldl... : not found cat: confdefs.h: No such file or directory ../configure: 13156: ac_fn_c_try_link: not found ../configure: 13156: 5: Bad file descriptor ../configure: 13156: :: result: no: not found ../configure: 13156: 6: Bad file descriptor ../configure: 13156: no: not found ../configure: 13156: 5: Bad file descriptor ../configure: 13156: :: checking for __dlopen in -ldl: not found ../configure: 13156: 6: Bad file descriptor ../configure: 13156: checking for __dlopen in -ldl... : not found cat: confdefs.h: No such file or directory ../configure: 13156: ac_fn_c_try_link: not found ../configure: 13156: 5: Bad file descriptor ../configure: 13156: :: result: no: not found ../configure: 13156: 6: Bad file descriptor ../configure: 13156: no: not found ../configure: 13164: 5: Bad file descriptor ../configure: 13164: :: checking for sin in -lm: not found ../configure: 13164: 6: Bad file descriptor ../configure: 13164: checking for sin in -lm... : not found cat: confdefs.h: No such file or directory ../configure: 13196: ac_fn_c_try_link: not found ../configure: 13198: 5: Bad file descriptor ../configure: 13198: :: result: no: not found ../configure: 13198: 6: Bad file descriptor ../configure: 13198: no: not found ../configure: 13214: ac_fn_c_check_func: not found ../configure: 13225: ac_fn_c_check_func: not found ../configure: 13510: 5: Bad file descriptor ../configure: 13510: :: checking for inet_aton in -lresolv: not found ../configure: 13510: 6: Bad file descriptor ../configure: 13510: checking for inet_aton in -lresolv... : not found cat: confdefs.h: No such file or directory ../configure: 13510: ac_fn_c_try_link: not found ../configure: 13510: 5: Bad file descriptor ../configure: 13510: :: result: no: not found ../configure: 13510: 6: Bad file descriptor ../configure: 13510: no: not found ../configure: 13510: 5: Bad file descriptor ../configure: 13510: :: checking for __inet_aton in -lresolv: not found ../configure: 13510: 6: Bad file descriptor ../configure: 13510: checking for __inet_aton in -lresolv... : not found cat: confdefs.h: No such file or directory ../configure: 13510: ac_fn_c_try_link: not found ../configure: 13510: 5: Bad file descriptor ../configure: 13510: :: result: no: not found ../configure: 13510: 6: Bad file descriptor ../configure: 13510: no: not found ../configure: 13510: 5: Bad file descriptor ../configure: 13510: :: checking for inet_aton in -lbind: not found ../configure: 13510: 6: Bad file descriptor ../configure: 13510: checking for inet_aton in -lbind... 
: not found cat: confdefs.h: No such file or directory ../configure: 13510: ac_fn_c_try_link: not found ../configure: 13510: 5: Bad file descriptor ../configure: 13510: :: result: no: not found ../configure: 13510: 6: Bad file descriptor ../configure: 13510: no: not found ../configure: 13510: 5: Bad file descriptor ../configure: 13510: :: checking for __inet_aton in -lbind: not found ../configure: 13510: 6: Bad file descriptor ../configure: 13510: checking for __inet_aton in -lbind... : not found cat: confdefs.h: No such file or directory ../configure: 13510: ac_fn_c_try_link: not found ../configure: 13510: 5: Bad file descriptor ../configure: 13510: :: result: no: not found ../configure: 13510: 6: Bad file descriptor ../configure: 13510: no: not found ../configure: 13516: 5: Bad file descriptor ../configure: 13516: :: checking for ANSI C header files: not found ../configure: 13516: 6: Bad file descriptor ../configure: 13516: checking for ANSI C header files... : not found cat: confdefs.h: No such file or directory ../configure: 13615: ac_fn_c_try_compile: not found ../configure: 13617: 5: Bad file descriptor ../configure: 13617: :: result: no: not found ../configure: 13617: 6: Bad file descriptor ../configure: 13617: no: not found ../configure: 13665: ac_cv_header_dirent_dirent.h: not found ../configure: 13665: 5: Bad file descriptor ../configure: 13665: :: checking for dirent.h that defines DIR: not found ../configure: 13665: 6: Bad file descriptor ../configure: 13665: checking for dirent.h that defines DIR... : not found eval: 1: Bad substitution

    Read the article

  • File Sharing: User-created folders are read-only to others on Mac 10.6 Server

    - by Anriëtte Combrink
    Hi there. We recently got a new Mac mini server with Mac OS X Server 10.6 on it. It has two 500 GB volumes, and we are using the second one (Macintosh HD2, the one other than the boot disk) to share our work files. I have added an account for each user in the Users pane of Server Preferences, and all our staff are added to a new group called toolboxstaff. Now, when a user creates a new folder on this volume, the folder is created with read-only access for everyone besides the owner. How do I set things up so that when a user creates a folder, it is created with read-write access for the toolboxstaff group? Thanks in advance.

    Read the article

  • Convert Existing Eclipse Project to Maven Project

    - by Tom G
    For a project at work, we're considering using the Maven plugin for Eclipse to automate our builds. Right now the procedure is far more complicated than it ought to be, and we're hoping that Maven will simplify things to a one-click build. My question is, is there a wizard or automatic importer for converting an existing Eclipse Java project to a Maven project, using the Maven plugin? Or should I create a new Maven project and manually copy over all source files, libs, etc.

    Read the article

  • Project Navigation and File Nesting in ASP.NET MVC Projects

    - by Rick Strahl
    More and more I'm finding myself getting lost in the files in some of my larger Web projects. There's so much freaking content to deal with – HTML Views, several derived CSS pages, page level CSS, script libraries, application wide scripts and page specific script files, etc. Thankfully I use Resharper and its Ctrl-T Go to Anything, which autocompletes you to any file, type or member rapidly. Awesome – except when I forget, or when I'm not quite sure of the name of what I'm looking for. Project navigation is still important. Sometimes while working on a project I seem to have 30 or more files open, and trying to locate another file to open in the solution often ends up being a mental exercise – "where did I put that thing?" It's those little hesitations that tend to get in the way of workflow frequently. To make things worse, most NuGet packages for client side frameworks and scripts dump stuff into folders that I generally don't use. I've never been a fan of the 'Content' folder in MVC, which is just an empty layer that doesn't serve much of a purpose. It's usually the first thing I nuke in every MVC project. To me the project root is where the actual content for a site goes – is there really a need to add another folder to force another path into every resource you use? It's ugly and also inefficient, as it adds additional bytes to every resource link you embed into a page.
    Alternatives
    I've been playing around with different folder layouts recently and found that moving my cheese around has actually made project navigation much easier. In this post I show a couple of things I've found useful; maybe you'll find some of them useful as well, or at least get some ideas about what can be changed to provide better project flow. The first thing I've been doing is adding a root Code folder and putting all server code into it. I'm a big fan of treating the Web project root folder as my Web root folder, so all content comes from the root without unneeded nesting like the Content folder. By moving all server code out of the root tree (except for Code), the root tree becomes a lot cleaner immediately, as you remove Controllers, App_Start, Models etc. and move them underneath Code. Yes, this adds another folder level for server code, but it leaves only code-related things in one place that's easier to jump back and forth in. Additionally, I find myself doing a lot less with server side code these days and more with client side code, so I want the server code separated from that. The root folder itself then serves as the root content folder. Specifically, I have the Views folder below it, as well as the Css and Scripts folders, which hold only common libraries and global CSS and script code. In these days of building SPA style applications, I also tend to have an App folder there, where I keep my application specific JavaScript files as well as HTML view templates for client SPA apps like Angular. Here's an example of what this looks like in a relatively small project: The goal is to keep things that are related together, so I don't end up jumping around so much in the solution to get to specific project items. The Code folder may irk some of you and hark back to the days of the App_Code folder in non Web-Application projects, but these days I find myself messing with a lot less server side code and much more with client side files – HTML, CSS and JavaScript. Generally I work on a single controller at a time – once that's open, that's typically the only server code I work with regularly.
    Business logic lives in another project altogether, so other than the controller and maybe ViewModels there's not a lot of code being accessed in the Code folder. So throwing that off the root and isolating it seems like an easy win.
    Nesting Page-Specific Content
    In a lot of my existing applications that are pure server side MVC applications, perhaps with some JavaScript associated with them, I tend to have page-level JavaScript and CSS files. For these types of pages I actually prefer the local files stored in the same folder as the parent view. So typically I have a .css and a .js file with the same name as the view, in the same folder. This looks something like this: In order for this to work you also have to make a configuration change inside the /Views/web.config file, as the Views folder is protected by the BlockViewHandler, which prohibits access to content from that folder. It's easy to fix by changing the path from * to *.cshtml or *.vbhtml, so that only the views themselves are blocked and static files can be served:

        <system.webServer>
          <handlers>
            <remove name="BlockViewHandler"/>
            <add name="BlockViewHandler" path="*.cshtml" verb="*" preCondition="integratedMode" type="System.Web.HttpNotFoundHandler" />
          </handlers>
        </system.webServer>

    With this in place, from inside your views you can then reference those same resources like this:

        <link href="~/Views/Admin/QuizPrognosisItems.css" rel="stylesheet" />

    and

        <script src="~/Views/Admin/QuizPrognosisItems.js"></script>

    which works fine. JavaScript and CSS files in the Views folder deploy just like the .cshtml files do and can be referenced from this folder as well. Making this happen is not really as straightforward as it should be with just Visual Studio, unfortunately, as there's no easy way to get the file nesting from the VS IDE directly (you have to modify the .csproj file). However, Mads Kristensen has a nice Visual Studio add-in that provides file nesting via a shortcut menu option. Using this you can select each of the 'child' files and then nest them under a parent file. In the case above I select the .js and .css files and nest them underneath the .cshtml view. I was even toying with the idea of throwing the controller.cs files into the Views folder, but that's maybe going a little too far :-) It would work, however, as Visual Studio doesn't publish .cs files and the compiler doesn't care where the files live. There are lots of options, and if you think that would make life easier it's another way to help group related things together. Are there any downsides to this? Possibly – if you're using automated minification/packaging tools like ASP.NET Bundling or Grunt/Gulp with Uglify, it becomes a little harder to group script and CSS files for minification, as you may end up looking in multiple folders instead of a single folder. But again, that's a one-time configuration step that's easily handled, and much less intrusive than constantly having to search for files in your project.
    Client Side Folders
    The particular project shown in the screen shots above is a traditional server side ASP.NET MVC application with most content rendered into server side Razor pages. There's a fair amount of client side stuff happening on these pages as well – specifically, several of these pages are self contained single page Angular applications that deal with one or maybe two separate views – and the layout I've shown above really focuses on the server side aspect, where there are Razor views with related script and CSS resources.
    For applications that are more client-centric and have a lot more script and HTML template based content, I tend to use the same layout for the server components, but the client side code can often be broken out differently. In SPA type applications I tend to follow the App folder approach, where all the pieces that make up the SPA end up below the App folder. Here's what that looks like for me – in this case an AngularJS project: Here the App folder holds both the application-specific js files and the partial HTML views that get loaded into this single page application. In this particular Angular application, which has controllers linked to particular partial views, I prefer to keep the script files that are associated with the views – AngularJS controllers in this case – with the actual partials. Again, I like the proximity of the view to the main code associated with the view, because 90% of the UI application code that gets written is handled between those two files. This approach works well, but only if controllers are fairly closely aligned with the partials. If you have many smaller sub-controllers or lots of directives, where the alignment between views and code is more segmented, this approach starts falling apart and you'll probably be better off with separate folders under a js folder; following Angular conventions you'd have controllers/directives/services etc. folders. Please note that I'm not saying any of these ways are right or wrong – this is just what has worked for me, and why!
    Skipping Project Navigation altogether with Resharper
    I've talked a bit about project navigation in the project tree, which is a common way to navigate and which we all use at least some of the time, but if you use a tool like Resharper – which has Ctrl-T to jump to anything – you can quickly navigate with a shortcut key and autocomplete search. Here's what Resharper's jump-to-anything looks like: Resharper's Goto Anything box lets you type and quickly search over files, classes and members of the entire solution, which is a very fast and powerful way to find what you're looking for in your project, bypassing the Solution Explorer altogether. As long as you remember to use it (which I sometimes don't) and you know what you're looking for, it's by far the quickest way to find things in a project. It's a shame that this sort of simple search interface isn't part of the native Visual Studio IDE.
    Work how you like to work
    Ultimately it all comes down to workflow, how you like to work, and what makes *you* more productive. Following pre-defined patterns is great for consistency, as long as they don't get in the way of how you work. A lot of the default folder structures in Visual Studio for ASP.NET MVC were defined when things were done differently. These days we're dealing with much more diverse project content than when ASP.NET MVC was originally introduced, and project organization definitely is something that can get in the way if it doesn't fit your workflow. So take a look and see what works well and what might benefit from organizing files differently. As with so many things in ASP.NET, as things evolve and get more complex I've found that I end up fighting some of the conventions. The good news is that you don't have to follow the conventions, and you have the freedom to do just about anything that works for you.
    Even though what I've shown here diverges from conventions, I don't think anybody would stumble over these relatively minor changes and not immediately figure out where things live, even in larger projects. But nevertheless, think long and hard before breaking those conventions – if there isn't a good reason to break them, or the changes don't provide improved workflow, then it's not worth it. Break the rules, but only if there's a quantifiable benefit. You may not agree with how I've chosen to diverge from the standard project structures in this article, but maybe it gives you some ideas of how you can mix things up to make your existing project flow a little nicer and make it easier to navigate for your environment. © Rick Strahl, West Wind Technologies, 2005-2014. Posted in ASP.NET, MVC.

    Read the article

  • Which software development methodologies can be seen as foundations

    - by Bas
    I'm writing a small research paper that involves software development methodologies. I was looking into all the available methodologies and I was wondering: of all of them, are there any that have provided the foundations for the others? For example, looking at the following methodologies: Agile, Prototyping, Cleanroom, Iterative, RAD, RUP, Spiral, Waterfall, XP, Lean, Scrum, V-Model, TDD. Can we say that Prototyping, Iterative, Spiral and Waterfall are the "foundation" for the others? Or is there no such thing as a "foundation", and does each methodology have its own unique history? I would of course like to describe all the methodologies in my research paper, but I simply don't have the time to do so, which is why I would like to know which methodologies can be seen as representatives.

    Read the article

  • Good practices for large scale development/delivery of software

    - by centic
    What practices do you apply when working with large teams on multiple versions of a piece of software, or on multiple competing projects? What are best practices for still getting the right things done first? Is there information available on how big IT companies handle the development and management of some of their large projects, e.g. Oracle Database, WebSphere Application Server, Microsoft Windows, ...?

    Read the article

  • fcgiwrap listening to a unix socket file: how to change file permissions

    - by user36520
    I have a web server (nginx) and a CGI application (gitweb) that is run with fcgiwrap to enable FastCGI access to it. I want the FastCGI protocol to take place over a Unix socket file. To start the fcgiwrap daemon, I run (as a daemontools daemon):

        setuidgid git fcgiwrap -s "unix:$PWD/fastcgi.sock"

    The problem is that my web server runs as the user www-data and not the user git, and fcgiwrap creates the socket fastcgi.sock owned by user git, group git, and read-only for everyone but the owner. Thus nginx, running as www-data, can't access the socket. Apparently fcgiwrap is not able to select the permissions of Unix socket files, which is quite annoying. Moreover, if I manage to make the socket file exist before I run fcgiwrap (which is quite difficult, given that I did not find any shell command to create a socket file), it quits with the following error:

        Failed to bind: Address already in use

    The only solution I found is to start the server the following way:

        rm -f fastcgi.sock   # ensure that the socket doesn't already exist
        (sleep 5; chgrp www-data fastcgi.sock; chmod g+w fastcgi.sock) &
        exec setuidgid git fcgiwrap -s "unix:$PWD/fastcgi.sock"

    which is far from the most elegant solution. Can you think of anything better? Thanks.

    Read the article

  • Agile development challenges

    - by Bob
    With Scrum / user-story-based / agile development, how does one handle scheduling out-of-sync tasks that are part of a user story? We are a small gaming company working with a few remote consultants who do graphics and audio work. Typically, graphics work should be done at least a week (sometimes two weeks) in advance of the code so that it's ready for integration. However, since Scrum is supposed to focus on user stories, how should I split the stories across iterations so that they still follow the user story model? Ideally, a user story should be completed by all the team members in the same iteration; I feel that splitting them in any way violates the core principle of user-story-driven development. Also, one front-end developer can work at twice the pace of the back-end developers. However, that throws the scheduling out of sync as well, because he is either constantly ahead of them, or what we have done is have him work on tasks that are not specific to the current iteration just to keep busy. Either way, it's the same issue as above: splitting up user story tasks. If someone can recommend an active Google group on agile development that discusses these and other issues, that'll be great. Also, if you know of a free alternative to Pivotal Labs, let me know as well. I'm looking now at Agilo.

    Read the article

  • Nautilus and file command in 11.04 don't show metadata for WebM files

    - by Pili
    The file-name extension .webm is used for media files in the WebM multimedia format, which consists of the WebM container (a subset of the Matroska container) and audio and video streams with independent encoding and quality settings. Description of the issue: for files in the WebM format, the file program says the files are raw data, instead of determining and displaying the real file format, which is WebM. In addition, Nautilus doesn't display the technical metadata of files in this format. Why is the file program not displaying the file format for WebM files?
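
    For what it's worth, the detection itself is simple: WebM files begin with the 4-byte EBML magic shared with Matroska, with a DocType of "webm" a few bytes further in. A rough sketch of that check (the file name is hypothetical and the 64-byte window is just a heuristic), which is essentially what an up-to-date magic database for file(1) encodes:

        EBML_MAGIC = b"\x1a\x45\xdf\xa3"        # EBML header, shared with Matroska

        def looks_like_webm(path):
            with open(path, "rb") as f:
                head = f.read(4096)
            # The DocType element ("webm" vs "matroska") sits near the start of the header.
            return head.startswith(EBML_MAGIC) and b"webm" in head[:64]

        print(looks_like_webm("video.webm"))     # hypothetical file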

    Read the article

  • Releasing Mobile Application under multiple platforms, same time or?

    - by SAFAD
    I've always had this idea circling around, and now I am facing the issue. We made an Android app and it is ready, but we are planning to release the same app on iOS and possibly Windows Phone. Now, should we just release the Android app and promise clients that the iOS version is coming soon (creating anticipation before release), or delay the release until the iOS version is ready? The same applies if we have a premium and a free version: should we release the free version and promise that the better premium one is coming soon, or release them both at the same time? Best regards

    Read the article

  • How to add extensions to a lot of files using content of each file?

    - by v8media
    I've got over 10,000 files from older versions of the Mac OS that don't have extensions. They're deeply nested, and they also have all sorts of strange formatting and characters in their names. They no longer have file types or creator codes attached to them. A great many of these files contain text that would let me determine the extension (for example, Word.Document.8 appears in every file created by that version of Word, and Excel.Sheet.8 in every file created with that version of Excel). I found a script that looks like it would work for one of these file types at a time, but it erases parts of filenames after problem characters, which is not good:

        find . -type f -not -name "." -print0 |\
        xargs -0 file |\
        grep 'Word.Document.8' |\
        sed 's/:.*//' |\
        xargs -I % echo mv % %.doc

    So, two questions from that. One: should I clean the characters in the filenames first, or deal with them programmatically in the script and leave the names as they are? As long as I lose no information from the filenames, I don't see a problem cleaning out slashes and other problem characters. Also, if I clean the filenames, there are likely to be duplicates, so any cleaning script would have to add something like "-1" before the extension to make sure nothing gets lost. Two: how do I change the script so that it will look for more than one file type at the same time and give each the proper extension? I'm not tied to this script, but it is understandable, which is a pro. Mac OS X 10.6 is installed on this file server, but I've got access to any recent version of OS X. Thanks, Ian
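
    As one possible sketch (in Python, not an adaptation of the shell script above): it scans for several content markers in one pass, leaves the existing names — problem characters and all — untouched, and appends "-1", "-2", … when adding an extension would collide with an existing file. The marker strings come from the question; the table is meant to be extended as needed.

        import os

        MARKERS = {                       # content marker -> extension to append
            b"Word.Document.8": ".doc",
            b"Excel.Sheet.8":   ".xls",
        }

        def unique_name(path):
            # Return path unchanged, or with -1, -2, ... inserted before the extension.
            if not os.path.exists(path):
                return path
            base, ext = os.path.splitext(path)
            n = 1
            while os.path.exists("%s-%d%s" % (base, n, ext)):
                n += 1
            return "%s-%d%s" % (base, n, ext)

        for root, _dirs, files in os.walk("."):
            for name in files:
                if os.path.splitext(name)[1]:        # already has an extension
                    continue
                path = os.path.join(root, name)
                with open(path, "rb") as f:
                    blob = f.read()                  # old Office files are small enough to slurp
                for marker, ext in MARKERS.items():
                    if marker in blob:
                        os.rename(path, unique_name(path + ext))
                        break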

    Read the article

  • What are the memory-management capabilities of MySQL + JDBC (in light of autonomic computing)?

    - by Adel
    I'm interested in implementing some kind of autonomic-computing functionality using MySQL. By autonomic computing I mean, roughly, some failsafe abilities whereby the application appears to be at least slightly "intelligent". For reference, the main parts of autonomic computing we'd like are the "self-configuring" and "self-healing" features (the other two, "self-optimizing" and "self-protecting", are too abstract/futuristic for us at this time). So, for example, if we have a sample Java application that uses a MySQL database, we might want to automatically restart the MySQL database if it takes up too much memory. Or maybe we want the ability to dynamically adjust the database memory as needed. So, for example, when we start the application the database begins with a 56 MB buffer; but then, as we insert so many rows, we want it to automatically jump up to 512 MB, then to 1024, up to a maximum of 4096 MB. Does all of the above suggest that MySQL is too "weak" for the task? Do you suggest using Oracle Database? My professor believes that by using Java we can basically make up for any memory-management deficiencies that MySQL has relative to Oracle DB. I'm new to MySQL, but have experience with Oracle. If all of the above sounds wishy-washy, it is because I'm still fleshing it out. Thanks
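
    As a concrete illustration of the "grow the buffer as needed" idea: the InnoDB buffer pool size has been a dynamic variable since MySQL 5.7, so a monitoring component could resize it at runtime instead of restarting the server. A rough sketch using mysql-connector-python rather than JDBC, purely for brevity — connection details and the growth schedule are made up, and the account needs the privilege to set global variables:

        import mysql.connector

        STEPS_MB = [56, 512, 1024, 2048, 4096]    # growth schedule from the question

        def grow_buffer_pool(current_mb):
            # Bump innodb_buffer_pool_size to the next step (the server rounds
            # the value to its chunk size). Requires MySQL 5.7+ for online resize.
            larger = [s for s in STEPS_MB if s > current_mb]
            if not larger:
                return current_mb
            new_mb = larger[0]
            cnx = mysql.connector.connect(host="127.0.0.1", user="root", password="secret")
            cur = cnx.cursor()
            cur.execute("SET GLOBAL innodb_buffer_pool_size = %d" % (new_mb * 1024 * 1024))
            cur.close()
            cnx.close()
            return new_mb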

    Read the article

  • Python file-io code listing current folder path instead of the specified

    - by Tom Brito
    I have the code:

        import os
        import sys

        fileList = os.listdir(sys.argv[1])
        for file in fileList:
            if os.path.isfile(file):
                print "File >> " + os.path.abspath(file)
            else:
                print "Dir >> " + os.path.abspath(file)

    located in my music folder ("/home/tom/Music"). When I call it with:

        python test.py "/tmp"

    I expected it to list the files and folders in "/tmp" with their full paths, but it printed lines like:

        Dir >> /home/tom/Music/seahorse-gw2jNn
        Dir >> /home/tom/Music/FlashXX9kV847
        Dir >> /home/tom/Music/pulse-DcIEoxW5h2gz

    That is, the correct file names but the wrong path (and these files are not in my Music folder either). What's wrong with this code?
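
    Presumably the catch is that os.listdir() returns bare names, so os.path.isfile() and os.path.abspath() resolve them against the current working directory (the Music folder) rather than against sys.argv[1]. A minimal sketch of the joined version:

        import os
        import sys

        target = sys.argv[1]
        for name in os.listdir(target):
            full = os.path.join(target, name)   # resolve against the argument, not the CWD
            if os.path.isfile(full):
                print("File >> " + os.path.abspath(full))
            else:
                print("Dir >> " + os.path.abspath(full))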

    Read the article
