Search Results

Search found 7814 results on 313 pages for 'agile learning'.

Page 62/313

  • Should I dive into ASP.NET MVC or start with ASP.NET Webforms?

    - by Sahat
    I plan to pick up Silverlight in the future, and possibly move into Microsoft WPF. Currently I'm learning Objective-C 2.0 with Cocoa. I already know the pros and cons of ASP.NET MVC vs. ASP.NET Webforms. What I want to know is which would be more "efficient" for me to learn given the circumstances above. By efficient I mean learning one design pattern once and then reusing it. Objective-C, I believe, uses the MVC approach? What about Silverlight? WPF? So what do you think? Also, as a side question, is it true that ASP.NET Webforms is often used by freelancers/small companies and ASP.NET MVC in large enterprises?

    Read the article

  • What programming languages do you consider indispensable in your experience?

    - by Federico Ramponi
    Each programming language comes with its own concepts, best practices, libraries, tools, and community; in one word, its culture. Learning more than one programming language will make you a better programmer, for the more concepts you learn, the faster you will feel comfortable when the next language or technology comes along. Mine, so far, are C, some C++, and Python, and I have often read that it would be worth learning LISP, for "the profound enlightenment experience you will have when you finally get it" (quoting Eric Raymond). My questions are: Which is the next one you would consider a good investment to learn? Of the many programming languages you have learnt and worked with, which ones do you consider an essential part of one's CS culture, and why? EDIT. A further question: is there any language you would sincerely advise avoiding as a waste of time? (The famous, and questionable, slatings in this letter from Dijkstra come to mind.)

    Read the article

  • Question about C Pointers (just learning)

    - by Mike
    I am curious as to why this is an error and what the error message means. Here is some code: int *x[] = {"foo", "bar", "baz"}; int *y[] = {"foo", "bar", "baz"}; x = y; When I try to compile, I get this: error: incompatible types when assigning to type ‘char [3]’ from type ‘char *’ Question #1: why is this an error? Question #2: why are the types different? Thanks for your help.
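
    For context, a minimal sketch of the two things usually pointed out for this error (the variable names and fixes below are illustrative, not from the original post): string literals decay to char *, so the element type should be char * rather than int *, and an array name is not an assignable lvalue, so x = y is rejected regardless of the element type. Copying the elements, or using a plain pointer, is the usual workaround:

        #include <stdio.h>
        #include <string.h>

        int main(void) {
            char *x[] = {"foo", "bar", "baz"};   /* char *, not int *: the literals are strings */
            char *y[] = {"one", "two", "three"};

            /* x = y;  -- still illegal: an array is not an assignable lvalue */

            memcpy(x, y, sizeof x);              /* fix 1: copy the elements (here, the pointers) */
            char **p = y;                        /* fix 2: use a pointer when you only need to    */
                                                 /*        refer to a different array             */

            printf("%s %s\n", x[0], p[2]);       /* prints: one three */
            return 0;
        }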

    Read the article

  • Learning to work with audio in C++

    - by Skilldrick
    My degree was in audio engineering, but I'm fairly new to programming. I'd like to learn how to work with audio in a programming environment, partly so I can learn C++ better through interesting projects. First off, is C++ the right language for this? Is there any reason I shouldn't be using it? I've heard of Soundfile and some other libraries - what would you recommend? Finally, does anyone know of any good tutorials on this subject? I've learnt the basics of DSP - I just want to program it! EDIT: I use Windows. I'd like to play about with real-time stuff, a bit like Max/MSP but with more control.
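
    To make the real-time side concrete, below is a minimal sketch of a callback-based sine-wave generator. It assumes the cross-platform PortAudio library, which is only one possible choice; the 440 Hz tone, 44.1 kHz rate, and buffer size are illustrative values, not anything from the original question:

        // Minimal PortAudio sketch: play a 440 Hz sine tone for two seconds.
        // Build roughly as: g++ sine.cpp -lportaudio
        #include <portaudio.h>
        #include <cmath>

        struct ToneState { double phase = 0.0; };

        // PortAudio calls this from its real-time thread whenever it needs more audio.
        static int toneCallback(const void*, void* output, unsigned long frames,
                                const PaStreamCallbackTimeInfo*, PaStreamCallbackFlags,
                                void* userData) {
            ToneState* state = static_cast<ToneState*>(userData);
            float* out = static_cast<float*>(output);
            const double twoPi = 6.283185307179586;
            const double step = twoPi * 440.0 / 44100.0;   // 440 Hz at 44.1 kHz
            for (unsigned long i = 0; i < frames; ++i) {
                float sample = static_cast<float>(0.2 * std::sin(state->phase));
                *out++ = sample;   // left channel
                *out++ = sample;   // right channel
                state->phase += step;
            }
            return paContinue;     // keep the stream running
        }

        int main() {
            ToneState state;
            Pa_Initialize();
            PaStream* stream = 0;
            // 0 inputs, 2 outputs, 32-bit float samples, 44.1 kHz, 256 frames per buffer.
            Pa_OpenDefaultStream(&stream, 0, 2, paFloat32, 44100, 256, toneCallback, &state);
            Pa_StartStream(stream);
            Pa_Sleep(2000);        // let the callback run for two seconds
            Pa_StopStream(stream);
            Pa_CloseStream(stream);
            Pa_Terminate();
            return 0;
        }

    The key idea is that all audio is produced inside the callback, which runs on a high-priority thread, so anything slow (file I/O, allocation) should stay out of it.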

    Read the article

  • Looking for resources for learning SharePoint Foundation on the Internet

    - by Kabeer
    Hello. While I am quite comfortable with the .NET world, I am a baby as far as SharePoint technologies are concerned. With the advent of SharePoint 2010 (Foundation especially), I'd like to learn the new version directly. However, I am not finding enough resources on the Internet to be helpful. Besides knowing how to maneuver around a SharePoint site, I'd like to understand what goes on under the hood from an architecture standpoint, and subsequently the APIs. Can the community please point me to the most suitable resource(s) available on the Internet?

    Read the article

  • Agile Tour 2010: the 2010 edition launches with a call for candidate local organizers…

    Hello, the third edition of the Agile Tour is being prepared for the third quarter (September/October). Step 1: defining the host cities. Last year there was a total of 18 cities, including 11 in France, Luxembourg, Geneva, and 3 in Canada. This year Brazil joins the tour, and a few French cities have already been identified. The organizers have launched the call for candidate cities. Quote:

    Read the article

  • Best Books for Learning to Test Software

    - by Chris
    I have read a lot of programming books, many of which mention testing of various kinds. I have never really gotten into the topic of testing, but I realize how extremely important it is. Does anyone know of some good books that provide a thorough exploration of this topic?

    Read the article

  • Should I be using libraries if I'm trying to learn how to program?

    - by CodeJustin.com
    I have been programming "a lot" in the past few months, and at first I was trying to find the "easiest" language. Fortunately I realized that it's not about the language, it's about learning HOW to code. I ran into the Stanford lectures online (Programming Methodology) and watched them all (around 23 hours total) a while ago. Then I got into Java ME and programmed about 28.47% of a mobile RPG game (only around 2k lines of code). I feel like I learned a lot from those two experiences compared to previous ones, but now that I'm moving into Flash/ActionScript 3.0 development, I'm finding myself learning like I did when I first started with PHP. I'm not really getting what's under the hood. I'm finding myself using libraries to speed up development time, which doesn't seem like a bad thing, BUT I personally do not know how to write those libraries myself offhand. So should I be coding everything myself, or is it OK to use libraries when you don't even know how to code them?

    Read the article

  • Should a student be diversifying or mastering programming languages?

    - by Max Link
    As the question states, is it better for a student to diversify and explore when learning programming languages, or should they focus on only 2-3 languages and really get to know them well? An example of what I mean by diversifying: Functional -> Scheme; Procedural -> C; Object-Oriented -> Java; Dynamic or scripting -> Python; Other -> C++. I sometimes have a few breaks between semesters (up to 3 months), and I'm thinking of either learning a new language or "mastering" those I already know. Which would benefit me more in the future? I already know some Java, C, and C++ (about 3 months of self-study each). If I'm not mistaken, where I live the industry is heavy on Java, C++, and C#.

    Read the article

  • Learning Python Basics

    - by StaticExtasy
    So I'm trying to learn Python better, and I've been using this website: http://www.learnpython.org/ I'm on to functions right now. Here's the code:

        # Add your functions here (before the existing functions)
        def list_benefits():
            myList = ['More organized code', 'More readable code', 'Easier code reuse', 'Allowing programmers to share and connect code together']
            return myList

        def build_sentence(info):
            addMe = " is a benefit of functions!"
            for i in info:
                meInfo = i + addMe
            return meInfo

        def name_the_benefits_of_functions():
            list_of_benefits = list_benefits()
            for benefit in list_of_benefits:
                print build_sentence(benefit)

        name_the_benefits_of_functions()

    The output is:

        e is a benefit of functions!
        e is a benefit of functions!
        e is a benefit of functions!
        r is a benefit of functions!

    What am I missing to return the whole sentence?
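
    A minimal sketch of the usual fix (an illustration, not the tutorial's official answer): build_sentence already receives a single benefit string, so looping over it iterates character by character and meInfo keeps only the last character; returning the whole string gives the expected sentences.

        def build_sentence(info):
            addMe = " is a benefit of functions!"
            # 'info' is already one complete benefit string, so no loop is needed
            return info + addMe

    With that change, each benefit prints as a full sentence.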

    Read the article

  • Kanban vs. Scrum

    - by Andrew Siemer
    Can someone with Kanban experience tell me how Kanban and Scrum differ? What are the pros and cons of each of these project management methodologies? Kanban seems to be getting a lot of press these days, and I don't want to miss the hottest new way of tracking my team's failures (...and successes). Responses: @S. Lott - "What part of this article wasn't clear enough? infoq.com/articles/hiranabe-lean-agile-kanban/…. Do you have a more specific question?" That is a great article, but technically, no, it is not clear enough. The article gives a great amount of detail about Kanban (and thank you for it - good read), but it does not specifically contrast Kanban vs. Scrum. It will help someone like me make a decision, but it most certainly won't help someone like my boss, or in general someone less experienced! I was hoping for a quick overview of Kanban pros and cons contrasted with Scrum pros and cons. Thanks though! @S. Lott - "Why do you say Kanban vs. Scrum? What leads you to conclude they are conflicting approaches? Can you make your question more specific?" I don't think they are necessarily conflicting, but they are different enough for a user to adhere to one over the other. Perhaps one fits a project or company better than the other? How would I sell one over the other when presenting a project management approach? Say I went to a company that is currently stuck in the rut that is "waterfall" - why would I sell one approach over the other?

    Read the article

  • Using Essential Use Cases to design a UI-centric Application

    - by Bruno Brant
    Hello all, I'm beginning a new project (oh, how I love the fresh taste of a new project!) and we are just starting to design it. In short: the application is a UI that will enable users to model an execution flow (a Visio-like drag-and-drop interface). So our greatest concern is usability and features that will help users model the execution flow quickly and clearly. Our established methodology makes extensive use of Use Cases in order to create a harmonious view of the application between the programmers and the users. This is a business concern, really: I'd prefer to use an Agile method with User Stories rather than Use Cases, but we need to define a clear scope to sell the product to our clients. However, Use Cases have a number of flaws, most of which are related to the fact that they include technical details, like the UI, as can be seen here. But since we can't use User Stories and a fully interactive design process, I've decided on a compromise: I will be using Essential Use Cases in order to hide those details. Now I have another problem: it's essential (no pun intended) to have a clear description of the UI interaction, so how should I document it? In other words, how do I specify an application through Essential Use Cases when the UI interaction is vital to it? I can see a few alternatives: (1) abandon Use Cases, since they don't correctly represent the problem; (2) leave interface descriptions out of the use cases and create separate documentation (storyboards) linked to the Essential Use Cases; (3) include the UI interaction description in the Essential Use Cases, since it is part of the business rules from the perspective of the users and the application itself.

    Read the article

  • Managing shared product backlog items across multiple platforms

    - by MotoSV
    I am using TFS 2012 to develop an application that will be available as a website and as a mobile application, e.g. Windows Phone, Android, etc. While building up a list of features for this application, I've noticed that a lot of them will be available across all platforms, and I'm not too sure how to manage them within a product backlog. For example, there will be an option to sign in with a Facebook account, and users will be able to do this on the website and on the mobile applications. So my thought was I would create a product backlog item "Sign in with Facebook account" and assign it to an area called "Website". I would then create another backlog item, with the same title, but this time assign it to an area called "Windows Phone". Therefore my backlog would have two items, both with the same title, but in different areas. The idea is that I could assign the "Sign in..." backlog item for the website to one sprint and then assign the "Sign in..." backlog item for Windows Phone to another sprint. Seeing as I'm new to Agile/Scrum, would this be considered a viable way of managing a product backlog?

    Read the article

  • Problems with real-valued input deep belief networks (of RBMs)

    - by Junier
    I am trying to recreate the results reported in "Reducing the dimensionality of data with neural networks" by autoencoding the Olivetti face dataset with an adapted version of the MNIST digits MATLAB code, but I am having some difficulty. It seems that no matter how much tweaking I do to the number of epochs, learning rates, or momentum, the stacked RBMs enter the fine-tuning stage with a large amount of error and consequently fail to improve much during fine-tuning. I am also experiencing a similar problem on another real-valued dataset. For the first layer I am using an RBM with a smaller learning rate (as described in the paper) and with negdata = poshidstates*vishid' + repmat(visbiases,numcases,1); I'm fairly confident I am following the instructions found in the supporting material, but I cannot achieve the correct errors. Is there something I am missing? See below for the code I'm using for real-valued visible-unit RBMs and for the whole deep training; the rest of the code can be found here.

    rbmvislinear.m:

        epsilonw  = 0.001;  % Learning rate for weights
        epsilonvb = 0.001;  % Learning rate for biases of visible units
        epsilonhb = 0.001;  % Learning rate for biases of hidden units
        weightcost = 0.0002;
        initialmomentum = 0.5;
        finalmomentum = 0.9;

        [numcases numdims numbatches] = size(batchdata);

        if restart == 1,
          restart = 0;
          epoch = 1;

          % Initializing symmetric weights and biases.
          vishid = 0.1*randn(numdims, numhid);
          hidbiases = zeros(1,numhid);
          visbiases = zeros(1,numdims);

          poshidprobs = zeros(numcases,numhid);
          neghidprobs = zeros(numcases,numhid);
          posprods = zeros(numdims,numhid);
          negprods = zeros(numdims,numhid);
          vishidinc = zeros(numdims,numhid);
          hidbiasinc = zeros(1,numhid);
          visbiasinc = zeros(1,numdims);
          sigmainc = zeros(1,numhid);
          batchposhidprobs = zeros(numcases,numhid,numbatches);
        end

        for epoch = epoch:maxepoch,
          fprintf(1,'epoch %d\r',epoch);
          errsum = 0;
          for batch = 1:numbatches,
            if (mod(batch,100)==0)
              fprintf(1,' %d ',batch);
            end

            %%%%% START POSITIVE PHASE %%%%%
            data = batchdata(:,:,batch);
            poshidprobs = 1./(1 + exp(-data*vishid - repmat(hidbiases,numcases,1)));
            batchposhidprobs(:,:,batch) = poshidprobs;
            posprods = data' * poshidprobs;
            poshidact = sum(poshidprobs);
            posvisact = sum(data);
            %%%%% END OF POSITIVE PHASE %%%%%
            poshidstates = poshidprobs > rand(numcases,numhid);

            %%%%% START NEGATIVE PHASE %%%%%
            negdata = poshidstates*vishid' + repmat(visbiases,numcases,1); % + randn(numcases,numdims) if not using mean
            neghidprobs = 1./(1 + exp(-negdata*vishid - repmat(hidbiases,numcases,1)));
            negprods = negdata'*neghidprobs;
            neghidact = sum(neghidprobs);
            negvisact = sum(negdata);
            %%%%% END OF NEGATIVE PHASE %%%%%

            err = sum(sum( (data-negdata).^2 ));
            errsum = err + errsum;
            if epoch > 5,
              momentum = finalmomentum;
            else
              momentum = initialmomentum;
            end;

            %%%%% UPDATE WEIGHTS AND BIASES %%%%%
            vishidinc = momentum*vishidinc + ...
                epsilonw*( (posprods-negprods)/numcases - weightcost*vishid);
            visbiasinc = momentum*visbiasinc + (epsilonvb/numcases)*(posvisact-negvisact);
            hidbiasinc = momentum*hidbiasinc + (epsilonhb/numcases)*(poshidact-neghidact);
            vishid = vishid + vishidinc;
            visbiases = visbiases + visbiasinc;
            hidbiases = hidbiases + hidbiasinc;
            %%%%% END OF UPDATES %%%%%
          end
          fprintf(1, '\nepoch %4i error %f \n', epoch, errsum);
        end

    dofacedeepauto.m:

        clear all
        close all

        maxepoch = 200;  % In the Science paper we use maxepoch=50, but it works just fine.
        numhid = 2000; numpen = 1000; numpen2 = 500; numopen = 30;

        fprintf(1,'Pretraining a deep autoencoder. \n');
        fprintf(1,'The Science paper used 50 epochs. This uses %3i \n', maxepoch);

        load fdata  % makeFaceData;
        [numcases numdims numbatches] = size(batchdata);

        fprintf(1,'Pretraining Layer 1 with RBM: %d-%d \n',numdims,numhid);
        restart = 1;
        rbmvislinear;
        hidrecbiases = hidbiases;
        save mnistvh vishid hidrecbiases visbiases;

        maxepoch = 50;
        fprintf(1,'\nPretraining Layer 2 with RBM: %d-%d \n',numhid,numpen);
        batchdata = batchposhidprobs;
        numhid = numpen;
        restart = 1;
        rbm;
        hidpen = vishid; penrecbiases = hidbiases; hidgenbiases = visbiases;
        save mnisthp hidpen penrecbiases hidgenbiases;

        fprintf(1,'\nPretraining Layer 3 with RBM: %d-%d \n',numpen,numpen2);
        batchdata = batchposhidprobs;
        numhid = numpen2;
        restart = 1;
        rbm;
        hidpen2 = vishid; penrecbiases2 = hidbiases; hidgenbiases2 = visbiases;
        save mnisthp2 hidpen2 penrecbiases2 hidgenbiases2;

        fprintf(1,'\nPretraining Layer 4 with RBM: %d-%d \n',numpen2,numopen);
        batchdata = batchposhidprobs;
        numhid = numopen;
        restart = 1;
        rbmhidlinear;
        hidtop = vishid; toprecbiases = hidbiases; topgenbiases = visbiases;
        save mnistpo hidtop toprecbiases topgenbiases;

        backpropface;

    Thanks for your time.

    Read the article

  • Any good, easy-to-learn-from books or tutorials for learning assembly? [on hold]

    - by pythonian29033
    I've been a developer since 2009, and I've learnt a lot of languages since, but I've always wanted to understand, and be able to code in, the lowest-level language, so I can speak to machines directly (or at least very close to directly) through my code. There was a point in time when someone showed me how to do an if statement in assembly, but out of all the books I got, I could never really understand where or how to start learning to code in assembler. Any help, please? I'm obsessed with learning this! PS: if you have any software suggestions, I use Ubuntu and am looking to convert to BackTrack soon, so something that installs easily on Debian Linux would be preferred; otherwise don't sweat it, give me the name of the Windows software and I'll find an equivalent myself.

    Read the article

  • Help! I've learned jQuery... now I want to learn JavaScript

    - by Derek Adair
    I am a self-taught web developer/programmer. I started out about two years ago by learning how to make simple dynamic websites with HTML/CSS/PHP. Then I started dabbling with animation... Enter jQuery. I've become quite proficient with jQuery over the last year, and I've even started making my own plugins. I've spent most of my effort learning how to beautify websites with fancy effects and whatnot. Upon tackling my first full-blown application, I realized how under-developed my knowledge of JavaScript actually is. jQuery has allowed me to rely on its framework so heavily that I rarely use any interesting functions, techniques, or whatever that are 'native' to the JavaScript language. For example: I have a basic understanding of what a closure is... but I am unsure where this technique can actually benefit me. Although, as I understand it, that's what my jQuery plugins do with (function ($){//plugin code here})(jQuery). I've seen many posts/blogs/whatever about memory leaks and circular references, which is concerning. I'm frustrated because I can wrap my head around the basic concepts of what these are just by reading the articles, but I'm finding that the deeper I go, the more I don't understand. The vocabulary alone is burdensome, let alone how to actually use these techniques/functions/language features. I am trying to figure out what I don't know. I'm looking to gather any advice, techniques, articles, books, videos, snippets, examples, potential pitfalls... really anything you have regarding application development with JavaScript/jQuery.
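
    As one concrete illustration of where a closure pays off (the counter example below is purely illustrative and not from the original post): a function can capture variables from its enclosing scope and keep them alive as private state, which is exactly what the (function ($) { ... })(jQuery) wrapper does for a plugin's internals.

        // A closure keeping private state: 'count' is reachable only through
        // the returned function, so nothing else can tamper with it.
        function makeCounter() {
            var count = 0;                 // private to this call of makeCounter
            return function () {           // the returned function "closes over" count
                count += 1;
                return count;
            };
        }

        var next = makeCounter();
        console.log(next()); // 1
        console.log(next()); // 2

        // The same idea is what the plugin wrapper uses: '$' and any helpers
        // declared inside stay private to the plugin (assumes jQuery is loaded).
        (function ($) {
            var prefix = "[plugin] ";      // hidden from the global scope
            $.fn.logText = function () {
                console.log(prefix + this.text());
                return this;
            };
        }(jQuery));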

    Read the article

  • Fastest Method to Learn Web Design for a Developer

    - by hekevintran
    I am a web developer, and in my projects I have noticed that my weakest point is front-end design. Relying on other designers can be annoying if they are not able to produce as quickly as I want. My perspective on HTML/CSS is that it is basically a big hack that amazingly works. There are too many CSS and browser-specific bugs/quirks to learn and remember them all without spending extreme amounts of time trying to untangle everything. Is there a fast-track route to getting CSS into my brain? I have looked at some CSS books, but to me they really read as long lists of how to render things correctly in IE6 and how to make corners rounded. (Seriously, why does it require so many tricks to make a sharp corner round? On any platform but the Web this would be called a major oversight.) Is there something that does for CSS what jQuery does for JavaScript? Using jQuery, you don't need to know JavaScript well to make things that work. I am not interested in learning why IE6 does things in weird ways, because I don't care about supporting it at all. I am more interested in a method of learning how to use CSS to do what I want without spending hours and hours reading obscure blogs.

    Read the article
