Search Results

Search found 16554 results on 663 pages for 'programmers identity'.

Page 40 of 663

  • How can I learn to effectively write Pythonic code?

    - by Matt Fenwick
    I'm tired of getting downvoted and/or semi-rude comments on my Python answers, saying things like "this isn't Pythonic" or "that's not the Python way of doing things". To clarify, I'm not tired of getting corrected and downvoted, and I'm not tired of being wrong: I'm tired of feeling like there's a whole field of Python that I know nothing about, and which seems to be implicit knowledge among experienced Python programmers.

    A Google search for "Pythonic" reveals a wide range of interpretations. The Wikipedia page says: "A common neologism in the Python community is pythonic, which can have a wide range of meanings related to program style. To say that code is pythonic is to say that it uses Python idioms well, that it is natural or shows fluency in the language. Likewise, to say of an interface or language feature that it is pythonic is to say that it works well with Python idioms, that its use meshes well with the rest of the language." It also discusses the term "unpythonic": "In contrast, a mark of unpythonic code is that it attempts to write C++ (or Lisp, Perl, or Java) code in Python—that is, provides a rough transcription rather than an idiomatic translation of forms from another language. The concept of pythonicity is tightly bound to Python's minimalist philosophy of readability and avoiding the 'there's more than one way to do it' approach. Unreadable code or incomprehensible idioms are unpythonic."

    I suspect one way to learn the Pythonic way is just to program in Python a whole bunch. But I bet I could write a bunch of crap and not improve much without some guidance, whereas a good resource might speed up the learning process significantly. PEP 8 might be exactly what I'm looking for, or maybe not; on the one hand it covers a lot of ground, but on the other hand, I feel it's more suited as a reference for knowledgeable programmers than as a tutorial for fresh 'uns. How do I get my foot in the door of the Pythonic way of doing things?
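
    To make the contrast concrete, here is a minimal sketch of the kind of idiom such comments usually mean (my own illustration, not taken from any answer): the same task written as a C-style index loop and then the Pythonic way.

        # Unpythonic: a rough transcription of a C/Java loop
        def squares_of_evens_unpythonic(numbers):
            result = []
            for i in range(len(numbers)):
                if numbers[i] % 2 == 0:
                    result.append(numbers[i] ** 2)
            return result

        # Pythonic: iterate directly over the values and use a comprehension
        def squares_of_evens(numbers):
            return [n ** 2 for n in numbers if n % 2 == 0]

        print(squares_of_evens([1, 2, 3, 4]))  # [4, 16]

    Both functions do the same thing; the second simply uses the idioms (direct iteration, list comprehensions) that answers tagged "unpythonic" are usually pointing at.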

    Read the article

  • Rolling your own Hackathon

    - by Terrance
    Background info: Hey, I pitched the idea of a company hackathon that would donate our time to a charity, working on a project (for free) to improve morale in my company and increase developer cohesion. As it turns out, most like the idea, but guess who's gonna be the one to put it together. lol. Yeah, me. I should add that we are a fairly small shop with about 10-12 programmers (some pull double duty as programmers, interns, etc.), so that might make things a bit easier.

    Base question: While I am by no means a project manager, nor do I have any level of authority (entry-level guy), I was wondering if anyone knew the best approach for someone in my position to put together such an event with possibly (some) company backing. Or, for that matter, has any helpful advice to pass along to a young padawan.

    So far: As of right now it is just an idea, so to start with I presumably would have to put together some sort of proposal and do some of that office stuff that I became a programmer to steer clear of, to some extent.

    Read the article

  • Best Practices for MVC Architecture

    - by Mystere Man
    There are a number of questions on Stack Overflow regarding MVC best practices, but most of those seem to revolve around things like using Dependency Injection, or creating helper functions, or do's and don'ts of what to do in views and controllers. My question is more about how to architect an MVC application. For example, we are encouraged to use DI with the Repository pattern to decouple data access from the controller; however, very little is said about HOW to do that specifically for MVC. Where would we place the Repository classes, for instance? They don't seem to be model-related specifically, since the model should likewise be relatively decoupled from the actual data access technologies.

    A second question involves how to structure the layers or tiers. Most example applications (Nerd Dinner, Music Store, etc.) seem to use a single-tier, two-layer approach (not counting tests) that typically has controllers directly calling L2S or EF code. If I want to create a multi-tier/layer application, what are some of the best practices there in regards to MVC?

    This question is one part standard Q&A, but another part best practices, so it could go either here or on Programmers.SE; I am marking it CW. If you feel it would be better suited to Programmers.SE then it can be migrated. EDIT: What happened to the Community Wiki option? It seems to be gone.
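
    As a minimal sketch of the shape this usually takes (illustrated in Python to keep it framework-neutral; all the names are hypothetical, not from any particular sample app): the repository abstraction sits with the domain model, the concrete repository sits in a separate data-access layer, and the controller receives it through constructor injection.

        from abc import ABC, abstractmethod

        # Domain layer: the abstraction the controller depends on.
        class DinnerRepository(ABC):
            @abstractmethod
            def find_upcoming(self):
                ...

        # Data-access layer: one concrete implementation among many.
        class SqlDinnerRepository(DinnerRepository):
            def find_upcoming(self):
                return ["dinner A", "dinner B"]  # stand-in for real L2S/EF calls

        # Presentation layer: never names a concrete repository.
        class DinnerController:
            def __init__(self, repository: DinnerRepository):
                self.repository = repository

            def index(self):
                return {"dinners": self.repository.find_upcoming()}

        # Composition root: the one place that wires the layers together.
        controller = DinnerController(SqlDinnerRepository())
        print(controller.index())

    The multi-tier question then largely reduces to which assembly/package each of these pieces lives in; the web layer references only the abstraction, never the data-access technology.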

    Read the article

  • How can I promote clean coding at my workplace?

    - by Michael
    I work with a lot of legacy Java and RPG code on an internal company application. As you might expect, a lot of the code is written in many different styles and is often difficult to read because of poorly named variables, inconsistent formatting, and contradictory comments (if they're there at all). Also, a good amount of the code is not robust. Many times, code is pushed to production quickly by the more experienced programmers, while code by newer programmers is held back by "code reviews" that IMO are unsatisfactory. (They usually take the form of "It works, must be OK" rather than a serious critique of the code.) We have a fair number of production issues, which I feel could be lessened by giving more thought to the original design and testing.

    I have been working for this company for about 4 months and have been complimented on my coding style a couple of times. My manager is also a fan of cleaner coding than is the norm. Is it my place to try to push for better style and better defensive coding, or should I simply code in the best way I can and hope that my example will help others see how cleaner, more robust code (as well as aggressive refactoring) will result in less debugging and change time?

    Read the article

  • Is there any functional difference between immutable value types and immutable reference types?

    - by Kendall Frey
    Value types are types which do not have an identity. When one variable is modified, other instances are not. Using JavaScript syntax as an example, here is how a value type works:

        var foo = { a: 42 };
        var bar = foo;
        bar.a = 0;
        // foo.a is still 42

    Reference types are types which do have an identity. When one variable is modified, other instances are as well. Here is how a reference type works:

        var foo = { a: 42 };
        var bar = foo;
        bar.a = 0;
        // foo.a is now 0

    Note how the example uses mutable objects to show the difference. If the objects were immutable, you couldn't do that, so that kind of test for value/reference types doesn't work. Is there any functional difference between immutable value types and immutable reference types? Is there any algorithm that can tell the difference between a reference type and a value type if they are immutable? Reflection is cheating. I'm wondering this mostly out of curiosity.
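
    A small Python experiment (my own, just to sharpen the question) shows why immutability makes the distinction hard to observe: equality is visible, but identity is only visible through an identity check, which is arguably the reflection the question rules out.

        a = tuple([1, 2, 3])   # two equal, immutable tuples,
        b = tuple([1, 2, 3])   # built so they are distinct objects
        c = a

        print(a == b)  # True: equal values
        print(a is b)  # False: two distinct objects
        print(a is c)  # True: one object

        # Without an identity check like `is`, no sequence of reads on
        # a, b, and c can tell the two cases apart: immutability removes
        # the mutate-one-and-observe-the-other test used above.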

    Read the article

  • What causes bad performance in consumer apps?

    - by Crashworks
    My Comcast DVR takes at least three seconds to respond to every remote control keypress, turning the simple task of watching television into a frustrating button-mashing experience. My iPhone takes at least fifteen seconds to display text messages and crashes a quarter of the time I try to bring up the iPad app; simply receiving and reading an email often takes well over a minute. Even the navcom in my car has mushy and unresponsive controls, often swallowing successive inputs if I make them less than a few seconds apart. These are all fixed-hardware end-consumer appliances for which usability should be paramount, and yet they all fail at basic responsiveness and latency. Their software is just too slow.

    What's behind this? Is it a technical problem, or a social one? Who or what is responsible? Is it because these were all written in managed, garbage-collected languages rather than native code? Is it the individual programmers who wrote the software for these devices? In all of these cases the app developers knew exactly what hardware platform they were targeting and what its capabilities were; did they not take it into account? Is it the guy who goes around repeating "optimization is the root of all evil"; did he lead them astray? Was it a mentality of "oh, it's just an additional 100ms" each time, until all those milliseconds added up to minutes? Is it my fault, for having bought these products in the first place?

    This is a subjective question, with no single answer, but I'm often frustrated to see so many answers here saying "oh, don't worry about code speed, performance doesn't matter" when clearly, at some point, it does matter for the end user who gets stuck with a slow, unresponsive, awful experience. So, at what point did things go wrong for these products? What can we as programmers do to avoid inflicting this pain on our own customers?

    Read the article

  • What is the correct way to deal with similar but independent features?

    - by Koviko
    Let's say we have a feature request come in and we begin work on it, which we'll call feature-1. It introduces some new logic to the application, which we'll call logic-A and logic-B. A programmer branches from the release branch and begins work on the feature. Soon after, we get another feature request, which we'll call feature-2. It will implement logic-A and logic-C in the application. The logic-A being implemented by this feature is the same logic-A as was implemented in feature-1.

    Let's also say that given logic-B, logic-A might be implemented slightly differently than it would have been given logic-C, and differently again given both logic-B and logic-C (e.g. with only one feature present, the code would be less flexible than with both). How should this situation be handled? A concrete example, to help with any confusion in my wording (see the sketch after this list):

        - feature-1 is a feed from programmers.stackexchange.com.
        - feature-2 is a feed from gaming.stackexchange.com.
        - logic-A is the implementation of a feed at all (assuming the application currently has no feeds), which links to the content and gives related information.
        - logic-B is that the feed's source is programmers.stackexchange.com; it adds to logic-A that the related programming language is displayed.
        - logic-C is that the feed's source is gaming.stackexchange.com; it adds to logic-A that the related game's name and box art are displayed.
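
    On the code side (as opposed to the branching side), one way to avoid writing logic-A twice is to land it once with explicit extension points that each feature fills in. A rough sketch of that idea using the question's own example (all names illustrative):

        # logic-A, implemented once: presenting a feed at all.
        class Feed:
            def __init__(self, source_url):
                self.source_url = source_url

            def render_item(self, item):
                line = f"{item['title']} ({self.source_url})"
                extra = self.extra_info(item)
                return f"{line} - {extra}" if extra else line

            def extra_info(self, item):
                return ""  # hook: each feature overrides this

        # logic-B: feature-1's variation.
        class ProgrammersFeed(Feed):
            def __init__(self):
                super().__init__("programmers.stackexchange.com")

            def extra_info(self, item):
                return f"language: {item['language']}"

        # logic-C: feature-2's variation.
        class GamingFeed(Feed):
            def __init__(self):
                super().__init__("gaming.stackexchange.com")

            def extra_info(self, item):
                return f"game: {item['game']}, box art: {item['box_art']}"

        print(ProgrammersFeed().render_item({"title": "Q1", "language": "Python"}))
        print(GamingFeed().render_item({"title": "Q2", "game": "Myst", "box_art": "myst.png"}))

    On the branching side, the same idea suggests extracting logic-A to its own branch (or landing it on the mainline first) so that feature-1 and feature-2 both build on a single implementation instead of merging two divergent copies later.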

    Read the article

  • How much system and business analysis should a programmer be reasonably expected to do?

    - by Rahul
    In most places I have worked, there were no formal system or business analysts, and the programmers were expected to perform both roles. One had to understand all the subsystems and their interdependencies inside out. Further, one was also supposed to have a thorough knowledge of the business logic of the applications and interact directly with the users to gather requirements, answer their queries, etc. In my current job, for example, I spend about 70% of my time doing system analysis and only 30% programming. I consider myself a good programmer but struggle with developing a good understanding of the business rules of a complex application. Often, this creates a handicap: while I can write efficient algorithms and thread-safe code, I lose out to guys who may be average programmers but have a much better understanding of the business processes. So I want to know:

        - How much business and systems knowledge should a programmer have?
        - How does one go about getting this knowledge in an immensely complex software system (e.g. trading applications) with several interdependent business processes but poorly documented business rules?

    Read the article

  • Practical considerations for HTML / CSS naming conventions (syntax)

    - by Jeroen
    Question: what are the practical considerations for the syntax in class and id values? Note that I'm not asking about the semantics, i.e. the actual words that are being used, as described for example in this blog post. There are a lot of resources on that side of naming conventions already, in fact obscuring my search for practical information on the various syntactical bits: casing, use of punctuation (specifically the - dash), specific characters to use or avoid, etc. To sum up the reasons I'm asking this question:

        - The naming restrictions on id and class don't naturally lead to any conventions
        - The abundance of resources on the semantic side of naming conventions obscures searches on the syntactic considerations
        - I couldn't find any authoritative source on this
        - There wasn't any question on SE Programmers yet on this topic :)

    Some of the conventions I've considered using:

        1. UpperCamelCase, mainly as a cross-over habit from server-side coding
        2. lowerCamelCase, for consistency with JavaScript naming conventions
        3. css-style-classes, which is consistent with the naming of CSS properties (but can be annoying when selecting text with Ctrl+Shift+ArrowKey)
        4. with_under_scores, which I personally haven't seen used much
        5. alllowercase, simple to remember but can be hard to read for longer names
        6. UPPERCASEFTW, as a great way to annoy your fellow programmers (perhaps combined with option 4 for readability)

    And probably I've left out some important options or combinations as well. So: what considerations are there for naming conventions, and to which convention do they lead?

    Read the article

  • What is your unique programming problem-solving style? [closed]

    - by gcc
    Everyone has their own style and technique for approaching and solving real-world problems. These distinguish us from other people and other programmers. (Actually, I think it makes us more desirable as programmers and improves computer science.) To improve, we read a lot of books: on programming style, how to solve problems, how to approach problems, software and algorithms, and so on. Can I learn your technique? In other words, if someone gives you a problem, what do you do as a first step to solve it? I want to learn the style in which you approach, analyze, and solve a problem.

    EDIT: every programmer is a unique instance; each of us approaches problems and converges on solutions in our own... idiomatic manner. This manner is sometimes a quirk of training, a bias of tools, but often it is an insightful nugget, a little golden hammer that cracks nuts just slightly faster than others. When answering, give your general approaches, but also take a moment to identify how you look at things in ways that your peers do not. Let's call this your Unique Solving Perspective, or USP.

    Read the article

  • Life and Career guidance

    - by Andrei TheGiant Haxtor
    Hello programmers. I have a dilemma I'm pondering. I will be graduating from high school with ~60 credits' worth of community college work (pre-engineering courses), and I am wondering what experienced programmers would suggest I do with my time, since I have all of the bull courses out of the way. Should I start taking computer science/engineering courses, or should I take some other courses that interest me (psych, math)? The reason I am asking is that I like doing a lot of self-study, especially relating to software and tech. I don't like having the pressure of hard classes on me, so I could make up for the time lost doing the CC courses and dive deep into programming and books. Unfortunately, I've only started getting into programming recently, since I didn't have much time because of my course load. Right now I am doing Java and messing around with Android. I would like to get involved in web & mobile development, operating systems, and finance software. If any of you experienced people could please give me some guidance and words of wisdom, I would greatly appreciate it. Sorry that this isn't necessarily related to programming. All the best.

    Read the article

  • How do managers know if a person is a good or a bad programmer?

    - by Pavel Shved
    In most companies that do programming, teams and divisions consist of programmers who design and write code and managers who... well, do the management stuff. Aside from just not writing code, managers usually do not even look at the code the team develops, and may not even have a proper IDE installed on their work machines. Still, the managers are to judge whether a person works well, whether he or she should be put in charge of something, or whether a particular developer should be assigned to a task of the utmost importance and responsibility. And last, but not least: the managers usually assign the quarterly bonuses!

    To do the above effectively, a manager should know if a person is a good programmer—among other traits, of course. The question is, how do they do it? They don't even look at the code people write, and they can't directly assess the quality of the components programmers develop... but their estimates of who is a good coder and who is "not as good" are nevertheless correct in most cases! What is the secret?

    Read the article

  • How can dev teams prevent slow performance in consumer apps?

    - by Crashworks
    When I previously asked what's responsible for slow software, a few of the answers I received suggested it was a social and management problem:

        This isn't a technical problem, it's a marketing and management problem.... Ultimately, the product managers are responsible for writing the specs for what the user is supposed to get. Lots of things can go wrong: The product manager fails to put button response in the spec ... The QA folks do a mediocre job of testing against the spec ... if the product management and QA staff are all asleep at the wheel, we programmers can't make up for that. —Bob Murphy

        People work on good-size apps. As they work, performance problems creep in, just like bugs. The difference is: bugs are "bad", they cry out "find me, and fix me". Performance problems just sit there and get worse. Programmers often think "Well, my code wouldn't have a performance problem. Rather, management needs to buy me a newer/bigger/faster machine." The fact is, if developers periodically just hunt for performance problems (which is actually very easy) they could simply clean them out. —Mike Dunlavey

    So, if this is a social problem, what social mechanisms can an organization put into place to avoid shipping slow software to its customers?
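
    One concrete mechanism in the spirit of Mike Dunlavey's periodic hunting (my own illustration, not quoted from any answer) is to put latency budgets into the automated test suite, so a performance regression fails the build the same way a bug would. A minimal sketch, with render_front_page as a hypothetical stand-in for whatever code path carries the budget:

        import time

        def render_front_page():
            # Hypothetical stand-in for the code path under budget.
            time.sleep(0.02)
            return "<html>...</html>"

        def test_front_page_meets_latency_budget():
            BUDGET_SECONDS = 0.1  # the button-to-response time the spec promises
            start = time.perf_counter()
            render_front_page()
            elapsed = time.perf_counter() - start
            assert elapsed < BUDGET_SECONDS, (
                f"front page took {elapsed:.3f}s, budget is {BUDGET_SECONDS}s")

        test_front_page_meets_latency_budget()
        print("front page within budget")

    The social part is simply that the budget lives in the spec and the build, not in any individual programmer's memory.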

    Read the article

  • Developing a feature whose sole purpose is to be taken out?

    - by adib
    What is the name of the pattern in which individual contributors (programmers/designers) develop an artifact whose sole purpose is to serve as a diversion, so that management can remove that feature from the final product?

    This is folklore I heard from an ex-colleague who used to work at a large game development company. At that company, it was well known that middle management was pressured to "give input" and "make changes" to the product, otherwise they risked being seen as not contributing to the project. This situation had delayed many projects because of these superfluous "management inputs".

    In one project at the above company, the artists and developers created a supernumerary animated character that appeared in every cutscene and stuck out like a sore thumb. They designed it in such a way that it could easily be removed before the game shipped (this was when games were still sold on physical media, not as downloadable products). Obviously, management then voted to remove the animation. On the positive side, management didn't introduce any unnecessary changes that would have delayed the project, because they had shown that they provided constructive input on the product.

    This process pattern has a name among game programmers who work in corporations, but I forgot the actual name. I believe it's duck-something. Can anybody help with the name, and perhaps a rather credible reference to how the pattern develops?

    Read the article

  • How many of you *really* surf around without JavaScript enabled? [closed]

    - by Stephen
    I've decided to rephrase the question. After some deliberation on Meta, I've realized that my question needs to be a bit more focused.

    The question: should we (web developers) continue to spend effort progressively enhancing our web applications with JavaScript, ensuring that features gracefully degrade, thereby ensuring accessibility? Or should we spend that time focused on new features or other areas of development? The subtext of that question would be: how many of our customers/clients/users utilize our websites or applications with JavaScript disabled? Do you have any projects with requirements that specifically demand JavaScript functionality (almost all of mine do), and do those requirements also demand graceful degradation?

    For the sake of asking this question, I pulled up programmers.stackexchange.com without JavaScript enabled, and I was greeted with this message: "Programmers - Stack Exchange works best with JavaScript enabled". It was difficult to log in, although the site seemed to generally work okay. (I wasn't able to vote up any questions.) I think this is a satisfactory approach to development. Imagine the effort involved in making all of the site's features work with plain old HTML and server-side logic. On the other hand, I wonder how many users have been alienated by this approach. We've all been trained (at least the good developers among us) to use progressive enhancement and to ensure our web applications' dynamic features degrade gracefully. Is this progressive enhancement just pissing into the wind, or do some of our customers actually utilize certain web services without JavaScript enabled? I mean, like really, not figuratively or presumptuously.

    Read the article

  • "Don't do programming after a few years of starting career". Is this a fair advice?

    - by Muhammad Yasir
    I am a somewhat experienced developer with approximately 5 years of experience in PHP, somewhat less in Java and C#, and I am trying to learn some Python nowadays. Since the start of my career as a programmer, I have been told every now and then by fellow programmers that programming is suitable only for the first few years of a career (most of them take it as 5 years) and that one must change direction after that. The reasons they present include the headaches and pressures associated with programming. They also say that programmers are less social and don't usually like to give time to their families, etc., and especially: "Oh come on, you can't do programming your entire life!"

    I am somewhat confused here and need to ask others about it. If I leave programming, then what do I do? I guess teaching may be a good option in this case, but it would perhaps require first earning a PhD. It may also be noteworthy that in my country (Pakistan) the life of a programmer is not very good, in that normally they must put in 2-3 extra hours at the office to accomplish urgent programming tasks. I have a sense that the situation is somewhat similar in other countries and regions as well. Do you think it is fair advice to change career from programming to something else after spending 5 years in this field?

    UPDATE: Oh wow... I never knew people could have 40+ years of experience in this field. I am both excited and amazed to see that people have been doing this since 1971... that means 15 years before my birth! It is nice to be able to talk to such experienced people; we don't get such a chance here in Pakistan.

    Read the article

  • Is paper indispensable in a programmer's everyday work?

    - by rwong
    As a programmer who works in a company whose vision is to make the paperless office possible, is there any way I can work effectively while using less paper? I can list at least several kinds of paper I use quite often:

        - A paper notebook, in which I do most of the pre-coding design work and ideas
        - Books
        - Temporary printouts of source code, though not so often (in color, with a 6 point font at 600 DPI)
        - Sticky notes, to remind myself of things that should be taken care of within a few days

    On the other hand, I also use a wiki and an office text editor. Once in a while I use diagramming software to make a few flowcharts.

    Deeper questions: Is there a relationship between paper use and productivity? How can programmers help save the trees? Is paperless software development fundamentally different from the paperless office?

    Related questions: Do you ever write code with pen and paper, and should we do it more often? What physical tools do you find useful to work as a programmer? What things are essential on a programmer's desk? Stuff every programmer needs while working.

    Additional info, if it helps: everyone has dual monitors, and we have decent project management and issue tracking software (both web-based). Please be constructive. In particular, please address your answer to peer programmers who wish to be flexible and are willing to change their working style in order to become more productive as well as to meet their own personal values.

    Edited: I removed the company's view because it appeared to be too much of a flamebait. If you need to see my original words, go to the edit history. Deleted: Doxygen and whiteboard. Reason: disregarding my personal experience with these great tools, we never had to print out anything as a consequence of using/not using them. To see my original words, go to the edit history.

    Read the article

  • How important is graceful degradation of JavaScript? [closed]

    - by Stephen
    Should web developers continue to spend effort progressively enhancing our web applications with JavaScript, ensuring that features gracefully degrade, thereby ensuring accessibility? Or should we spend that time focused on new features or other areas of development? The subtext of that question would be: how many of our customers/clients/users utilize our websites or applications with JavaScript disabled? Do you have any projects with requirements that specifically demand JavaScript functionality (almost all of mine do), and do those requirements also demand graceful degradation?

    For the sake of asking this question, I pulled up programmers.stackexchange.com without JavaScript enabled, and I was greeted with this message: "Programmers - Stack Exchange works best with JavaScript enabled". It was difficult to log in, although the site seemed to generally work okay. (I wasn't able to vote up any questions.) I think this is a satisfactory approach to development. Imagine the effort involved in making all of the site's features work with plain old HTML and server-side logic. On the other hand, I wonder how many users have been alienated by this approach. We've all been trained (at least the good developers among us) to use progressive enhancement and to ensure our web applications' dynamic features degrade gracefully. Is this progressive enhancement just pissing into the wind, or do some of our customers actually utilize certain web services without JavaScript enabled?

    Read the article

  • "Don't do programming after a few years of starting career" Is this a fair advice?

    - by Muhammad Yasir
    I am a somewhat experienced developer with around 5 years of experience in PHP, somewhat less in Java and C#, and I am trying to learn some Python nowadays. Since the start of my career as a programmer, I have been told every now and then by fellow programmers that programming is suitable only for the first few years of a career (most of them take it as 5 years) and that one must change direction after that. The reasons they present are the headaches and pressures associated with programming. They also say that programmers are less social and don't usually like to give time to their families, etc., and especially: "Oh come on, you can't do programming your entire life!" I am somewhat confused here and need to ask others about it. If I leave programming, then what do I do? I guess teaching may be a good option in this case, but it would perhaps require first earning a PhD. It may also be noteworthy that in my country (Pakistan) the life of a programmer is not very good, in that normally they must put in 2-3 extra hours at the office to accomplish urgent programming tasks. I have a sense that the situation is somewhat similar in other countries and regions as well. So the question is: do you think it is fair advice to change career from programming to something else after spending 5 years in this field? Thanks for sharing your thoughts!

    Read the article

  • How old is "too old"?

    - by Dori
    I've been told that to be taken seriously as a job applicant, I should drop years of relevant experience off my résumé, remove the year I got my degree, or both. Or not even bother applying, because no one wants to hire programmers older than them.¹ Or that I should found a company, not because I want to, or because I have a product I care about, but because that way I can get a job if/when my company is acquired. Or that I should focus more on management jobs (which I've successfully done in the past) because... well, they couldn't really explain this one, except the implication was that over a certain age you're a loser if you're still writing code. But I like writing code. Have you seen this? Is this only a local (Northern California) issue? If you've ever hired programmers:²

        - Of the résumés you've received, how old was the eldest applicant?
        - What was the age of the oldest person you've interviewed?
        - How old (when hired) was the oldest person you hired?
        - How old is "too old" to be employed as a programmer?

    ¹ I'm assuming all applicants have equivalent applicable experience. This isn't about someone with three decades of COBOL applying for a Java guru job.
    ² Yes, I know that (at least in the US) you aren't supposed to ask how old an applicant is. In my experience, though, you can get a general idea from a résumé.

    Read the article

  • What's the greatest, most impressive programming feat you ever witnessed? [closed]

    - by David Reis
    Everyone knows the old adage that the best programmers can be orders of magnitude better than the average. I've personally seen good code and good programmers, but never something so absurd. So the question is: what is the most impressive feat of programming you ever witnessed or heard of? You can define impressive by:

        - Scope of the task at hand, e.g. John single-handedly developed the framework for his company, a work comparable in scope to what the other 200 employees were doing combined.
        - Speed, e.g. Stu programmed an entire real-time multitasking OS in a weekend, including its own C compiler and shell command-line tools.
        - Complexity, e.g. Jane re-architected our entire 10 million LOC app to work in a cluster of servers. And she did it in an afternoon.
        - Quality, e.g. Charles's code had a defect rate per LOC 100 times lower than the company average. Furthermore, his code was clean and understandable by all.

    Obviously, the more of these characteristics are combined, and the more extreme each of them is, the more impressive the feat. So, let me have it. What's the most absurd feat you can recount? Please provide as much detail as possible, and try to avoid urban legends or exaggerations. Post only what you can actually vouch for. Bonus questions:

        - Was the herculean task a one-off, or did the individual regularly amaze people?
        - How do you explain such impressive performance?
        - How was the programmer recognized for such awesome work?

    Read the article

  • J2EE or .NET Framework [closed]

    - by Kevino
    I want to learn Java or C#. Tell me the strengths and weaknesses of each platform, J2EE and the .NET Framework, today in 2012, and which is safer for future jobs. I tend to prefer Java because here (Montreal, Toronto) there are about 6 Java jobs for each C# job, and some experienced programmers advised me to go with Java because, they say, JVM languages are winning in the cloud, and the rise of Android can't do anything except help Java in the long run. Is that true today, with the release of Windows 8 coming soon and iOS devices everywhere?

    On the other side, one of these programmers told me that corporations love ASP.NET MVC 3 for intranet and web development, and that Tomcat/Apache Java JSP adoption is slowing down compared to ASP.NET, Ruby on Rails, HTML5, etc. He also told me that since I have a good background in system administration and networking, C# would be better for me, because I would be able to do more things in the Microsoft world with PowerShell automation and by creating my own apps for all the networking stuff (Windows Server, DNS, DHCP, Active Directory, SharePoint, etc.). But what if Windows 8 flops? Aren't Java and Android safer in the long run? He told me Mono was a joke compared to Java/Android or native Objective-C on iOS devices. (I plan to study full time, 10-15 hours a day, for the next 9 months, on either Java or C#; that's why I'm asking.)

    Read the article

  • How to troubleshoot SQL Server issues

    - by joe
    I have an ASP.NET application with a SQL Server database, and I am wondering if you can give your ideas on how to troubleshoot the following issue: I can insert/update/delete from any table, but I have one page that uses transactions to insert into different tables. The C# code is correct and very simple, but it fails. I used SQL Profiler to see how my app interacts with the DB (especially since the app is using stored procedures). I can catch the exec procedure statement and run it manually from SSMS and it works fine, but the same stored procedure fails from the application! This leads me to think the issue is coming from the user account and settings. I am no expert in SQL Server and wonder if anyone can explain how to verify the required settings for a user account. Thanks.

    EDIT: In web.config, here is how I reference my connection:

        <connectionStrings>
          <add name="Conn"
               connectionString="Data Source=localhost;Initial Catalog=myDB;Persist Security Info=True;User ID=DbUser;Password=password1254_3"
               providerName="System.Data.SqlClient" />
        </connectionStrings>

    EDIT: I will try to describe the process here:

        1. I begin a transaction.
        2. I call a stored procedure to insert (which succeeds) and return the scope identity (which will be used in the next step).
        3. I call another stored procedure to insert some more info, plus the scope identity from step 2, which is a foreign key here.
        4. I get an error: foreign key violation.
        5. The transaction is rolled back; now the tables are empty again.

    Thanks.
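
    A pattern worth checking here (a sketch of a usual suspect, not a diagnosis; the procedure and column names below are made up): a foreign key violation at step 3 often means the two inserts are not actually sharing one connection and transaction, so the second insert cannot see the uncommitted parent row from step 2. In outline, both calls must run on the same connection inside the same transaction, something like this (shown in Python with pyodbc for brevity; the same shape applies to SqlConnection/SqlTransaction in C#, where every SqlCommand must be assigned the transaction):

        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=myDB;UID=DbUser;PWD=password1254_3",
            autocommit=False,  # one explicit transaction spans both calls
        )
        cur = conn.cursor()
        try:
            # Step 2: the (hypothetical) proc inserts the parent row; assume
            # it does SET NOCOUNT ON and ends with
            # SELECT CAST(SCOPE_IDENTITY() AS int).
            cur.execute("EXEC dbo.InsertParent @name = ?", "example")
            parent_id = cur.fetchval()

            # Step 3: child insert on the SAME connection and transaction,
            # so it can see the uncommitted parent row; a second connection
            # (e.g. a new one opened per command) could not.
            cur.execute("EXEC dbo.InsertChild @parentId = ?, @info = ?",
                        parent_id, "more info")
            conn.commit()
        except pyodbc.Error:
            conn.rollback()
            raise

    A related thing to verify: if the identity value comes back NULL (for instance, SCOPE_IDENTITY read from the wrong scope), the child insert would fail its foreign key check for that reason alone, regardless of permissions.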

    Read the article

  • How best to keep bumbling, non-technical managers at bay and still deliver good work?

    - by Curious
    This question may be considered subjective (I got a warning) and be closed, but I will risk it, as I need some good advice/experience on this. I read the following on the 'About' page of Fog Creek Software, the company that Joel Spolsky founded and is CEO of:

        Back in the year 2000, the founders of Fog Creek, Joel Spolsky and Michael Pryor, were having trouble finding a place to work where programmers had decent working conditions and got an opportunity to do great work, without bumbling, non-technical managers getting in the way. Every high tech company claimed they wanted great programmers, but they wouldn't put their money where their mouth was. It started with the physical environment (with dozens of cubicles jammed into a noisy, dark room, where the salespeople shouting on the phone make it impossible for developers to concentrate). But it went much deeper than that. Managers, terrified of change, treated any new idea as a bizarre virus to be quarantined. Napoleon-complex junior managers insisted that things be done exactly their way or you're fired. Corporate Furniture Police writhed in agony when anyone taped up a movie poster in their cubicle. Disorganization was so rampant that even if the ideas were good, it would have been impossible to make a product out of them. Inexperienced managers practiced hit-and-run management, issuing stern orders on exactly how to do things without sticking around to see the farcical results of their fiats. And worst of all, the MBA-types in charge thought that coding was a support function, basically a fancy form of typing.

    A blunt truth about most of today's big software companies! Unfortunately, not every developer is as gutsy (or lucky, may I say?) as Joel Spolsky. So my question is: how best can one work with such managers, keep them at bay, and still deliver great work?

    Read the article

  • School vs Self-Taught [duplicate]

    - by Joan Venge
    This question already has answers here:

        - Do I need a degree in Computer Science to get a junior Programming job? [closed] (8 answers)
        - Do you think university is a good learning environment or is it better to be autodidact? [closed] (3 answers)

    Do you think formal education is necessary to gain strong programming skills? There are a lot of jobs that aren't programming but involve programming, such as tech artists in games or FX TDs in film, for example. I see similar patterns in the people I work with, where the best ones I have seen were self-taught, being artists primarily. But I also see that while their software and programming knowledge is varied and deep, their hardware knowledge is very basic (mine included), again due to the lack of formal education. I also work with a lot of programmers who possess both kinds of knowledge (software and hardware).

    Do you think it's necessary to have a formal education to have great programming skills? Would you think less of someone, in terms of job opportunities, if he didn't have a degree in computer science, software engineering, etc.? Would you trust him to do a software engineering job, i.e. writing a complex tool? Basically, I feel the self-taught programmer doesn't know a lot of things, e.g. a particular pattern or a particular language, but I find the ability to think outside the box much more powerful. As "pure" programmers, what's your take on it?

    Read the article
