Search Results

Search found 14074 results on 563 pages for 'programmers'.

  • DDD and validation of aggregate root

    - by Mik378
    Suppose an aggregate root: MailConfiguration (wrapping an AddressPart object). The AddressPart object is a simple immutable value object with some fields like senderAddress and recipientAddress (to keep the example simple). Being an immutable object with invariants, AddressPart should logically wrap its own validator (by delegating to some kind of external AddressValidator, to respect the Single Responsibility Principle). I have read some articles claiming that an aggregate root must validate its 'children'. However, if we follow that principle, one could create an AddressPart in an incohesive/invalid state. What is your opinion? Should I move the AddressValidator collaborator (used in the constructor, so that the cohesion of an AddressPart is validated immediately) out of AddressPart and assign it to the aggregate root (MailConfiguration)?
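
    To make the trade-off concrete, here is a minimal Java sketch (hypothetical names, not taken from the question's codebase) of the option where the value object guards its own invariants, so an invalid AddressPart can never exist regardless of what the aggregate root does:

        // The value object validates itself in its constructor; the checks could
        // equally be delegated to an injected AddressValidator collaborator.
        public final class AddressPart {
            private final String senderAddress;
            private final String recipientAddress;

            public AddressPart(String senderAddress, String recipientAddress) {
                if (senderAddress == null || !senderAddress.contains("@")) {
                    throw new IllegalArgumentException("invalid sender address");
                }
                if (recipientAddress == null || !recipientAddress.contains("@")) {
                    throw new IllegalArgumentException("invalid recipient address");
                }
                this.senderAddress = senderAddress;
                this.recipientAddress = recipientAddress;
            }

            public String senderAddress()    { return senderAddress; }
            public String recipientAddress() { return recipientAddress; }
        }

        // The aggregate root can still enforce rules that span several parts,
        // but it never has to re-check what the value object already guarantees.
        public class MailConfiguration {
            private final AddressPart addressPart;

            public MailConfiguration(AddressPart addressPart) {
                this.addressPart = java.util.Objects.requireNonNull(addressPart);
            }
        }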

    Read the article

  • Seeking some advice on pursuing MS in CS from Stanford or Carnegie Mellon or Caltech

    - by avi
    What kinds of projects are given preference at top-notch colleges like Stanford, Caltech, etc. for admission to an MS programme in Computer Science? I have an average academic portfolio. I'm pursuing a BTech at a not-so-popular university in India with an aggregate of 67%. I'm good at designing algorithms and have good knowledge of the core subjects, but I'm helpless with my percentage. So I think the only way I can impress them is with my project(s). Can anyone suggest the kinds of projects that are given preference by such top-level institutes? Could you also suggest some good projects? My area of interest is Artificial Intelligence, or any application/software/algorithm design that could be of some help to ordinary people. If you have any other idea for my project, please share it with me. Note: web-based projects and management projects like library management wouldn't be my priority.

    Read the article

  • What is the name of this tree?

    - by Daniel
    It has a single root, and each node has 0..N ordered sub-nodes. The keys represent a distinct set of paths. Two trees can only be merged if they share a common root. It needs to support, at minimum: insert, merge, enumerate paths. For this tree:

                          The
               +-----------+-----------+
               |           |           |
              cat         cow         dog
               |       +---+---+       |
               |       |       |       |
            drinks   jumps   moos    barks
               |
             milk

    the paths would be:

        The cat drinks milk
        The cow jumps
        The cow moos
        The dog barks

    It's a bit like a trie. What is it?
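
    What is described here reads like a trie (prefix tree) whose edges are labelled with whole words rather than characters - sometimes called a word-level or sentence trie. A minimal Java sketch of that reading, for illustration only:

        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;

        // A prefix-tree node keyed on words; LinkedHashMap keeps the sub-nodes in
        // insertion order, matching the "ordered sub-nodes" requirement.
        class WordTrieNode {
            final Map<String, WordTrieNode> children = new LinkedHashMap<>();

            // Insert a path such as ["The", "cat", "drinks", "milk"].
            void insert(List<String> path) {
                WordTrieNode node = this;
                for (String word : path) {
                    node = node.children.computeIfAbsent(word, w -> new WordTrieNode());
                }
            }

            // Merge another tree that shares this root by merging children recursively.
            void merge(WordTrieNode other) {
                other.children.forEach((word, child) ->
                    children.merge(word, child, (mine, theirs) -> { mine.merge(theirs); return mine; }));
            }
        }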

    Read the article

  • Future-proofing myself when learning to program.

    - by Chris Bridgett
    I want to learn to program in a 'future-proof' manner, if you like. Whilst Windows dominates the desktop OS marketplace (for now), there is obviously a lot of value in learning its languages/frameworks/APIs and so on - but this might be subject to change as new devices emerge or Windows shoots itself in the foot (the over-friendly previews of Windows 8 don't look too appealing...). Would I be right in thinking that a solid knowledge of C/C++ for back-end logic/low-level programming and the like, combined with an extremely portable language like Java for GUIs and so on, would be a good basis for software development that will prove useful on the largest number of systems? I'm talking desktop PCs, tablets, phones.

    Read the article

  • How to show or direct a business analyst to do data modelling?

    - by AaronLS
    Our business analysts pushed hard to collect data through a spreadsheet. I am the programmer responsible for importing that data. Usually when they push hard for something like this, I never know how well it will work out until a few weeks later, when I have time assigned to work on the task of programming the import. I have tried to do as much as possible along the way - named ranges, data validations, etc. - but I usually don't have time to take a detailed look at all the data and compare it to the destination in the database to determine how well it matches up. A lot of times there will be a little table of items that I somehow have to relate to something else in the database, but there are no natural or business keys present that would allow me to do so. I make the best of this, trying to write something that can compare strings and make a best guess, and then going through the effort of creating interfaces for a user to match the imported data to the destination. I feel that if the business analysts were actually creating a data model, they would be forced to think about these relationships, and would appreciate the need for natural or business keys to be part of the spreadsheet for the purpose of importing the data smoothly. The closest they come to business analysis is a big flat list of fields, and that would be fine if it were like any other data dictionary and included data types and relationships, but it isn't. It is just a bunch of names, with no indication of what type of data they might hold; it is up to me to guess. When I have pushed for more detail, they say that it is just busy work. How can I explain the importance of data modelling? How can I tell them what it is and how to do it? It feels impossible, because they don't have an appreciation for its importance. They do, however, usually have an interest in helping out in whatever way they can; it's just that this in particular has never gotten a motivated response.

    Read the article

  • Design Patterns for these scenarios

    - by user1899749
    Please help me find design patterns for the following situations. Situation 1: How can a smart robot use Wi-Fi? Situation 2: How can a smart robot automatically go to its recharging unit when there is no remote signal? Situation 3: Voice recognition component (if the homeowner is at home and motion detection is off, how can the smart robot's voice recognition component recognize those very sensitive sentences?). Situation 4: Motion detection component (how can the smart robot send a video stream to a cell phone while the homeowner/resident is driving?). I am looking for design patterns for the above situations, in order to answer the following question: if design patterns are not used, what are the difficulties?

    Read the article

  • Frontend development, where to start?

    - by glitch
    I've been stuck on the backend for years now, working on webservices, DBs, SDKs, APIs and all that jazz, and so that's an area that I have a decent grasp of, but I've absolutely no clue about how the frontend works. What's a good place to start for a backend monkey like me who wants to learn the best practices, latest technologies and the philosophy behind modern web page design and its interaction with the server? I want to be cool like all of the other javascript kids. Thanks!

    Read the article

  • Help me get started in TDD for web development

    - by Snow_Mac
    I've done a tiny, tiny bit of TDD while building an app for a company that I interned with. We used lots of mocking and wrote lots of assert statements, and after reading lots of blogs I'm convinced that TDD is the way to go - but how do you go about TDD for web applications? My main framework is Yii in PHP. My main questions are: What do you test? Models? Controllers? Views? How do you know if the output is correct? All my web apps interact with a DB; are there caveats and gotchas to that? Can I do testing in NetBeans? Can you test form elements, or just strictly objects and methods?
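
    The question is framed around Yii/PHP, but the shape of a model-level test is much the same in any stack; here is a JUnit-flavoured Java sketch with hypothetical names, purely to illustrate the "what do you test" question at the model layer - the model's rules are exercised directly, with no controller, view or real database involved:

        import org.junit.jupiter.api.Test;
        import static org.junit.jupiter.api.Assertions.*;

        class PostModelTest {

            // Hypothetical model under test: a Post with a required, length-limited title.
            static class Post {
                String title;
                boolean validate() {
                    return title != null && !title.isBlank() && title.length() <= 120;
                }
            }

            @Test
            void rejectsMissingTitle() {
                Post post = new Post();
                assertFalse(post.validate(), "a post with no title should fail validation");
            }

            @Test
            void acceptsReasonableTitle() {
                Post post = new Post();
                post.title = "Getting started with TDD";
                assertTrue(post.validate());
            }
        }

    Controller tests then tend to assert on responses and status codes, and DB-touching tests usually run against a dedicated test database with fixtures.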

    Read the article

  • Book about TCP, HTTP, named pipes, shared memory, WCF and other inter-process communication protocols

    - by Samuel
    Recently, I had to create a program to send messages between two WinForms executables. I used a tool with simple built-in functionality, to avoid having to figure out all the ins and outs of the vast number of protocols that exist. But now I'm ready to learn more about the internal differences between these protocols. I googled a couple of them, but a good reference book that gives me a clear idea of how each protocol works, and what its pros and cons are in various contexts, would be greatly appreciated. Here is the list of mechanisms that I found:

        Shared memory
        TCP
        Named pipes
        File mapping
        Mailslots
        MSMQ (Microsoft Message Queuing)
        WCF

    I know these mechanisms are not specific to a language, but it would be nice if the examples could be in .NET. Thank you very much.
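
    As a point of reference for the simplest item on that list, here is a minimal loopback-TCP exchange between a "sender" and a "receiver" - written in Java rather than .NET, purely as an illustration of what TCP-based IPC between two local processes boils down to (the two ends are threads in one process here, only to keep the sketch self-contained):

        import java.io.*;
        import java.net.*;

        public class TcpIpcDemo {
            public static void main(String[] args) throws Exception {
                try (ServerSocket server = new ServerSocket(9099)) {            // the "receiver"
                    Thread sender = new Thread(() -> {
                        try (Socket s = new Socket("127.0.0.1", 9099);
                             PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                            out.println("hello from the other executable");     // the "sender"
                        } catch (IOException e) {
                            e.printStackTrace();
                        }
                    });
                    sender.start();
                    try (Socket client = server.accept();
                         BufferedReader in = new BufferedReader(
                                 new InputStreamReader(client.getInputStream()))) {
                        System.out.println("received: " + in.readLine());
                    }
                    sender.join();
                }
            }
        }

    The other mechanisms on the list differ mainly in addressing, reliability, message versus stream semantics, and whether they can cross machine boundaries.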

    Read the article

  • Personal Software Process (PSP1)

    - by gentoo_drummer
    I'm trying to figure out an exercise, but it doesn't make much sense to me. I'm not asking anyone to provide the solution - just to help analyse what needs to be done in order to solve it. I'm trying to understand which PSP 1.0/1.1 process I should use: PROBE, or something else? I would greatly appreciate some help from someone with experience of the Personal Software Process methodology. Here is the question. For the reference case ("code1.c"), the following s/w metrics are provided:

        man-hours spent in the implementation phase (per module): 2.7 mh/file
        man-hours spent in the testing phase (per module): 4.3 mh/file
        estimated number of bugs remaining (per module): 0.3 errors/function, 4 errors/module (remaining)

    Based on the corresponding values provided for the reference case, each of the following tasks focuses on some s/w metrics to be estimated for the test case ("code2.c") [25 marks]:

        (estimated) man-hours required in the implementation phase (per module) [8 marks]
        (estimated) man-hours required in the testing phase (per module) [8 marks]
        (estimated) number of bugs remaining at the end of the testing phase (per module) [9 marks]

    Tasks 4 through 6 should use the data provided for the reference case within the context of Personal Software Process level 1 (PSP-1), using it as a single-point historic data log. Specifically, the same s/w metrics are to be estimated for the test case ("code2.c"), using PSP as the basic estimation model. In order to perform the tasks listed above, students are advised to consider all phases of the PSP software development process, especially at levels PSP0 and PSP1. Both cases are to be treated as separate case studies in the context of classic s/w development.
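
    Not a solution to the exercise, but for orientation: with only a single historic observation, the usual PROBE regression of PSP1 cannot be fitted, so "using the reference case as a single-point historic data log" typically reduces to a rate-based (proportional) estimate. A hedged sketch, assuming size is counted in modules/files as in the reference data:

        \[
          \hat{E}_{\mathrm{impl}}(\mathrm{code2}) \approx 2.7\ \tfrac{\mathrm{mh}}{\mathrm{file}} \times N_{\mathrm{modules}}(\mathrm{code2}),
          \qquad
          \hat{E}_{\mathrm{test}}(\mathrm{code2}) \approx 4.3\ \tfrac{\mathrm{mh}}{\mathrm{file}} \times N_{\mathrm{modules}}(\mathrm{code2})
        \]

    with the remaining-defect estimate scaled the same way from the 0.3 errors/function and 4 errors/module figures.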

    Read the article

  • Do ORMs enable the creation of rich domain models?

    - by Augusto
    After using Hibernate on most of my projects for about 8 years, I've landed at a company that discourages its use and wants applications to interact with the DB only through stored procedures. After doing this for a couple of weeks, I haven't been able to create a rich domain model of the application I'm starting to build, and the application just looks like a (horrible) transaction script. Some of the issues I've found are: I cannot navigate the object graph, as the stored procedures just load the minimum amount of data, which means that sometimes we have similar objects with different fields. One example: we have a stored procedure to retrieve all the data for a customer, and another to retrieve account information plus a few fields from the customer. Lots of the logic ends up in helper classes, so the code takes on a structured-programming style (with entities used like old C structs). There is more boring scaffolding code, as there's no framework that extracts result sets from a stored procedure and puts them into an entity. My questions are: has anyone been in a similar situation and disagreed with the stored procedure approach? What did you do? Is there an actual benefit to using stored procedures, apart from the silly point of "no one can issue a drop table"? Is there a way to create a rich domain model using stored procedures? I know there's the possibility of using AOP to inject DAOs/repositories into entities to be able to navigate the object graph, but I don't like that option as it's very close to voodoo.
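
    To make the "boring scaffolding" point concrete, here is a hedged Java/JDBC sketch (procedure and column names are hypothetical) of the hand-written mapping code that every stored procedure call tends to need when no ORM is in the picture:

        import java.sql.*;

        public class CustomerDao {
            private final Connection connection;

            public CustomerDao(Connection connection) {
                this.connection = connection;
            }

            public Customer findById(long id) throws SQLException {
                try (CallableStatement call = connection.prepareCall("{call get_customer(?)}")) {
                    call.setLong(1, id);
                    try (ResultSet rs = call.executeQuery()) {
                        if (!rs.next()) {
                            return null;
                        }
                        // Manual column-to-field mapping, repeated for every procedure; a
                        // "customer plus a few account fields" procedure needs yet another,
                        // slightly different mapper and usually yet another partial entity.
                        Customer customer = new Customer();
                        customer.setId(rs.getLong("id"));
                        customer.setName(rs.getString("name"));
                        customer.setEmail(rs.getString("email"));
                        return customer;
                    }
                }
            }
        }

        // Minimal entity used by the sketch above.
        class Customer {
            private long id;
            private String name;
            private String email;

            public void setId(long id)         { this.id = id; }
            public void setName(String name)   { this.name = name; }
            public void setEmail(String email) { this.email = email; }
        }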

    Read the article

  • Weeding out real agile from buzzword agile in an interview

    - by indyK1ng
    I've been interviewing for co-ops (paid internships) lately and a large number of the companies I've been interviewing with have been saying they use Scrum or some other agile methodology (scrum being the most popular). I know that there are real agile shops and there are places which say they use an agile methodology but are really doing something else and using agile as a buzzword. My question is, what are some questions I can ask in an interview which would separate these shops out?

    Read the article

  • Modelling highly specific business requirements

    - by AndyBursh
    How can one go about modelling highly specific business requirements which have no precedent in the system? Take for example the following requirement: when a purchase order contains N lines, is over X value in total and is being recorded against project Y, an email needs to be sent to persons A and B with the details. This requirement supplements other requirements surrounding purchase orders, but comes in at a much later date, in response to some ongoing problem elsewhere in the business. Persons A and B are not part of any role or group in the system, and don't hold any specific responsibility; they are simply the two people the business has appointed to receive these emails in this very specific case. Projects are also data driven, so project Y has no special properties to distinguish it from any other project. The only way to identify it is to compare its identifier to a magic number. How can one go about modelling this kind of case without introducing too much additional complexity? Off the top of my head, there are a couple of options. 1) Perform the checks and actions inline with the existing code: find the correct spot in the code, check the conditions in the requirement and send the emails to hardcoded addresses. Of course this is fraught with issues. At the very least it stops working if one of these people leaves or changes their email address. At worst, you have to ensure that any tests and test data are aware that additional actions are taken for a specific set of criteria. 2) Introduce some form of event system (see the sketch below), so that we can react to an event and fulfil the requirement outside the usual path of execution. This sounds like a cleaner solution than option 1, but the work involved is probably slightly more than this one small requirement justifies. That said, having it in place would allow the system to handle these kinds of specific requirements consistently and easily in the future. Are there any other (good/better) ways of handling highly specific requirements? I mean, other than telling the other parts of the business no!
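
    Here is a minimal Java sketch of option 2 (all names hypothetical): the purchase-order code path only publishes an event, and the oddly specific rule lives in one isolated, configuration-driven handler that can be changed or removed without touching the main flow:

        import java.util.ArrayList;
        import java.util.List;
        import java.util.function.Consumer;

        // Event raised by the normal purchase-order flow.
        class PurchaseOrderRecorded {
            final int lineCount;
            final double totalValue;
            final long projectId;

            PurchaseOrderRecorded(int lineCount, double totalValue, long projectId) {
                this.lineCount = lineCount;
                this.totalValue = totalValue;
                this.projectId = projectId;
            }
        }

        // Very small in-process event bus.
        class EventBus {
            private final List<Consumer<PurchaseOrderRecorded>> handlers = new ArrayList<>();

            void subscribe(Consumer<PurchaseOrderRecorded> handler) { handlers.add(handler); }
            void publish(PurchaseOrderRecorded event)               { handlers.forEach(h -> h.accept(event)); }
        }

        // The one place that knows about N, X, project Y and persons A and B - all of which
        // come from configuration rather than being hardcoded in the order-recording code.
        class LargeOrderNotificationHandler implements Consumer<PurchaseOrderRecorded> {
            private final int lineThreshold;
            private final double valueThreshold;
            private final long watchedProjectId;
            private final List<String> recipients;

            LargeOrderNotificationHandler(int lineThreshold, double valueThreshold,
                                          long watchedProjectId, List<String> recipients) {
                this.lineThreshold = lineThreshold;
                this.valueThreshold = valueThreshold;
                this.watchedProjectId = watchedProjectId;
                this.recipients = recipients;
            }

            @Override
            public void accept(PurchaseOrderRecorded e) {
                if (e.lineCount >= lineThreshold
                        && e.totalValue > valueThreshold
                        && e.projectId == watchedProjectId) {
                    recipients.forEach(address ->
                        System.out.println("emailing order details to " + address)); // stand-in for a real mailer
                }
            }
        }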

    Read the article

  • Can you have too many DTO/BO mapping methods?

    - by Fredou
    I have a Windows service, 2 web services and a web interface that need to follow the same path (data-wise), so I came up with two ways of structuring my solution. My concern is the fact that the UI/WS/etc. will have their own kind of DTO (say, the model in ASP.NET MVC) that should be mapped to a DTO, so the SL can then map it to a BO, which is then mapped to the proper EF6 DTO so that I can save it in the database. I'm thinking of doing it this way to remove one level of mapping. Which one should I take? Or is there a third solution?

    Read the article

  • COM interop support - which is better, C# or VB?

    - by dot
    I keep hearing that C# is "better" than VB... but as far as I can see, aside from syntactical differences, both compile down to the same IL. I've found some good articles by googling that explain what the differences are between the two, so I feel comfortable defusing arguments between developers over VB vs. C#. =) But I did read an article that said VB.NET 2005 had better support for COM interop, and I'm wondering if this is still the case. This is of interest to me because we are in the middle of redesigning an old VB6 app that communicates with some older COM components. Does anyone have recent experience with .NET and COM interop? Thanks.

    Read the article

  • Producer-consumer pattern with consumer restrictions

    - by Dan
    I have a processing problem that I think is a classic producer-consumer problem, with two added wrinkles: there may be a variable number of producers, and there is the restriction that no more than one item per producer may be consumed at any one time. I will generally have 50-100 producers, and as many consumers as CPU cores on the server. I want to maximize the throughput of the consumers while ensuring that there is never more than one work item in process from any single producer. This is more complicated than the classic producer-consumer problem, which I think assumes a single producer and no restriction on which work items may be in progress at any one time. I think the problem of multiple producers is relatively easily solved by enqueuing all work items on a single work queue protected by a critical section. The restriction on simultaneously processing work items from any single producer seems harder, because I cannot think of any solution that does not require each consumer to notify some kind of work dispatcher that a particular work item has been completed, so as to lift the restriction on work items from that producer. In other words, if Consumer2 has just completed WorkItem42 from Producer53, there needs to be some kind of callback or notification from Consumer2 to a work dispatcher to allow the dispatcher to release the next work item from Producer53 to the next available consumer (whether Consumer2 or otherwise). Am I overlooking something simple here? Is there a known pattern for this problem? I would appreciate any pointers.
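
    A hedged Java sketch of the dispatcher idea described above (a sketch, not a tuned implementation): work is queued per producer, a producer is marked "in flight" while one of its items is being processed, and the consumer's completion callback lifts that restriction so the producer's next item becomes eligible:

        import java.util.ArrayDeque;
        import java.util.Deque;
        import java.util.HashMap;
        import java.util.HashSet;
        import java.util.Map;
        import java.util.Set;

        class Dispatcher<T> {
            private final Map<Integer, Deque<T>> pendingByProducer = new HashMap<>();
            private final Set<Integer> inFlightProducers = new HashSet<>();

            record WorkItem<U>(int producerId, U item) {}

            // Producers enqueue work; the object lock plays the role of the critical section.
            public synchronized void submit(int producerId, T item) {
                pendingByProducer.computeIfAbsent(producerId, id -> new ArrayDeque<>()).addLast(item);
                notifyAll();
            }

            // Consumers block until some producer that is not already in flight has work.
            public synchronized WorkItem<T> take() throws InterruptedException {
                while (true) {
                    for (Map.Entry<Integer, Deque<T>> e : pendingByProducer.entrySet()) {
                        if (!inFlightProducers.contains(e.getKey()) && !e.getValue().isEmpty()) {
                            inFlightProducers.add(e.getKey());
                            return new WorkItem<>(e.getKey(), e.getValue().removeFirst());
                        }
                    }
                    wait();
                }
            }

            // The completion notification anticipated in the question: it lifts the
            // per-producer restriction and wakes any waiting consumer.
            public synchronized void complete(int producerId) {
                inFlightProducers.remove(producerId);
                notifyAll();
            }
        }

    A consumer loop is then take(), process the item, complete(producerId). The linear scan in take() is fine for 50-100 producers, though a ready-queue of unblocked producers would scale better.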

    Read the article

  • C: What is a good source for teaching standard/basic code conventions to someone newly learning the language?

    - by shan23
    I'm tutoring someone who can be described as a rank newcomer to C. Understandably, she does not know much about the coding conventions generally practiced, and hence all her programs tend to use single-letter variables, mismatched spacing/indentation and the like, making it very difficult to read/debug her endeavours. My question is: is there a link or set of guidelines and examples which she can use for adopting basic code conventions? It should not be so arcane as to scare her off, yet inclusive enough to have the basics covered (so that no one would wince looking at the code). Any suggestions?

    Read the article

  • Copy-and-Pasted Test Code: How Bad is This?

    - by joshin4colours
    My current job is mostly writing GUI test code for various applications that we work on. However, I find that I tend to copy and paste a lot of code within tests. The reason for this is that the areas I'm testing tend to be similar enough to need repetition, but not quite similar enough to encapsulate the code into methods or objects. I find that when I try to use classes or methods more extensively, tests become more cumbersome to maintain and sometimes outright difficult to write in the first place. Instead, I usually copy a big chunk of test code from one section, paste it into another, and make any minor changes I need. I don't use more structured ways of coding, such as more OO principles or functions. Do other coders feel this way when writing test code? Obviously I want to follow the DRY and YAGNI principles, but I find that test code (automated test code for GUI testing, anyway) can make these principles tough to follow. Or do I just need more coding practice and a better overall system of doing things? EDIT: The tool I'm using is SilkTest, which uses a proprietary language called 4Test. These tests are mostly for Windows desktop applications, but I have also tested web apps with this setup.

    Read the article

  • Can I get a C++ Compiler to instantiate objects at compile time

    - by gam3
    I am writing some code that has a very large number of reasonably simple objects, and I would like them to be created at compile time. I would think that a compiler would be able to do this, but I have not been able to figure out how. In C I could do the following:

        #include <stdio.h>

        typedef struct data_s {
            int a;
            int b;
            char *c;
        } info;

        info list[] = {
            1, 2, "a",
            3, 4, "b",
        };

        int main()
        {
            int i;
            for (i = 0; i < sizeof(list)/sizeof(*list); i++) {
                printf("%d %s\n", i, list[i].c);
            }
        }

    Using C++, each object has its constructor called rather than just being laid out in memory:

        #include <iostream>

        using std::cout;
        using std::endl;

        class Info {
            const int a;
            const int b;
            const char *c;
        public:
            Info(const int, const int, const char *);
            const int get_a() { return a; }
            const int get_b() { return b; }
            const char *get_c() const { return c; }
        };

        Info::Info(const int a, const int b, const char *c) : a(a), b(b), c(c) {}

        Info list[] = {
            Info(1, 2, "a"),
            Info(3, 4, "b"),
        };

        int main()
        {
            for (int i = 0; i < sizeof(list)/sizeof(*list); i++) {
                cout << i << " " << list[i].get_c() << endl;
            }
        }

    I just don't see what information is unavailable to the compiler that would prevent it from completely instantiating these objects at compile time, so I assume I am missing something.

    Read the article

  • Algorithm to infer tag hierarchy

    - by Tom
    I'm looking for an algorithm to infer a hierarchy from a set of tagged items. E.g. if the following items have the tags:

        1  a
        2  a,b
        3  a,c
        4  a,c,e
        5  a,b
        6  a,c
        7  d
        8  d,f

    then I can construct an undirected graph (or graphs) by tallying the node weights and edge weights:

        node weights        edge weights
        a  6                a-b  2
        b  2                a-c  3
        c  3                c-e  1
        d  2                a-e  1   <-- this edge is parallel to a-c and c-e and not wanted
        e  1                d-f  1
        f  1

    The first problem is how to drop any redundant edges to get to the simplified graph. Note that it's only appropriate to remove that redundant a-e edge in this case because something is tagged as a-c-e; if that wasn't the case and the tag was a-e, that edge would have to remain. I suspect that means the removal of edges can only happen during the construction of the graph, not after everything has been tallied up. What I'd then like to do is identify the direction of the edges to create a directed graph (or graphs), and pick out root nodes to hopefully create a tree (or trees):

        trees

          a        d
         / \       |
        b   c      f
             \
              e

    It seems like it could be a string algorithm - longest common subsequences/prefixes - or a tree/graph algorithm, but I am a little stuck since I don't know the correct terminology to search for it.
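
    For what it's worth, one heuristic sketch in Java (not a canonical named algorithm): treat the most frequent co-occurring tag as the most general, and for each item only link neighbours in that frequency order - so when a, c and e appear together, the edges a-c and c-e are created but the redundant a-e never is, which matches the observation that the pruning has to happen while the graph is being built:

        import java.util.ArrayList;
        import java.util.Comparator;
        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;
        import java.util.Set;

        class TagHierarchy {
            // Returns a child -> parent map; tags that never get a parent are roots.
            static Map<String, String> inferParents(List<Set<String>> items) {
                Map<String, Integer> tagCount = new HashMap<>();
                for (Set<String> tags : items) {
                    for (String t : tags) {
                        tagCount.merge(t, 1, Integer::sum);
                    }
                }

                Map<String, String> parent = new HashMap<>();
                for (Set<String> tags : items) {
                    List<String> ordered = new ArrayList<>(tags);
                    // Most frequent (assumed most general) tag first.
                    ordered.sort(Comparator.comparing(tagCount::get).reversed());
                    for (int i = 1; i < ordered.size(); i++) {
                        parent.put(ordered.get(i), ordered.get(i - 1));
                    }
                }
                return parent;
            }

            public static void main(String[] args) {
                List<Set<String>> items = List.of(
                        Set.of("a"), Set.of("a", "b"), Set.of("a", "c"), Set.of("a", "c", "e"),
                        Set.of("a", "b"), Set.of("a", "c"), Set.of("d"), Set.of("d", "f"));
                System.out.println(inferParents(items)); // e.g. {b=a, c=a, e=c, f=d}
            }
        }

    Ties in frequency and genuinely non-tree-shaped tag usage would need extra rules, but on the example data this reproduces the two trees shown above.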

    Read the article

  • "UML is the worst thing to ever happen to MDD." Why?

    - by Florents
    William Cook wrote in a tweet: "UML is the worst thing to ever happen to MDD. Fortunately many people now realize this ..." I would like to know the reasoning behind that claim (I'm not just after his personal opinion). I've noticed that many people out there don't like UML that much. It is also worth mentioning that he is in academia, where UML is pretty much regarded as the holy grail of effective design and modelling.

    Read the article

  • Calculating 3d rotation around random axis

    - by mitim
    This is actually a solved problem, but I want to understand why my original method didn't work (I'm hoping someone with more knowledge can explain). (Keep in mind, I'm not very experienced in 3D programming, having only played with the very basics for a little bit, nor do I have a lot of mathematical experience in this area.) I wanted to animate a point rotating around another point on an arbitrary axis, say at 45 degrees along the Y axis (think of an electron around a nucleus). I know how to rotate using the transform matrix along the X, Y and Z axes, but not around an arbitrary (45 degree) axis. Eventually, after some research, I found a suggestion: rotate the point by -45 degrees around Z so that it is aligned, then rotate by some increment along the Y axis, then rotate it back by +45 degrees, for every frame tick. While this certainly worked, I felt that it seemed to be more work than needed (too many method calls, math, etc.) and would probably be pretty slow at runtime with many points to deal with. I thought maybe it was possible to combine all the rotation matrices involved into one rotation matrix and use that as a single operation. Something like:

        [ cos(-45)  -sin(-45)   0 ]
        [ sin(-45)   cos(-45)   0 ]     rotate by -45 along Z
        [    0          0       1 ]

    multiplied by

        [ cos(2)   0   -sin(2) ]
        [   0      1      0    ]       rotate by 2 degrees (my increment) along Y
        [ sin(2)   0    cos(2) ]

    then multiply that result by (in that order)

        [ cos(45)  -sin(45)   0 ]
        [ sin(45)   cos(45)   0 ]      rotate by 45 along Z
        [    0         0      1 ]

    I get a mess of a matrix of numbers (since I was working with unknowns and two angles), but I felt like it should work. It did not, and I found a solution on a wiki using a different matrix, but that is something else. I'm not sure if I made an error in the multiplication, but my question is: is this actually a viable way to solve the problem - taking all the separate transformations, combining them via multiplication, and then using that - or not?
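
    For reference, a hedged note on the two standard closed forms (using the column-vector convention, where the transform applied first sits rightmost in the product - getting that order backwards is the most common reason a hand-multiplied chain comes out wrong). The composed matrix for the three-step recipe is

        \[
          R(\theta) \;=\; R_z(45^\circ)\, R_y(\theta)\, R_z(-45^\circ)
        \]

    and, for a general unit axis \(\mathbf{u} = (u_x, u_y, u_z)\), the single axis-angle (Rodrigues) matrix is

        \[
          R(\theta) \;=\; \cos\theta\, I \;+\; \sin\theta\, [\mathbf{u}]_\times \;+\; (1 - \cos\theta)\, \mathbf{u}\mathbf{u}^{\mathsf T}
        \]

    where \([\mathbf{u}]_\times\) is the skew-symmetric cross-product matrix of \(\mathbf{u}\); this is likely the "different matrix" found on reference pages, and it avoids the three-step composition entirely.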

    Read the article

  • Dealing with an Idiot [closed]

    - by inspectorG4dget
    I'm a 4th year University Computer Science student, and I have this problem, that I don't seem to be able to find a straight answer to: As a 4th year computer science student, I spend more time in the computer lab on campus, than even my own home. This means that getting along with everyone else here is very important to me. In most cases, this is not an issue because my interactions with almost all the people here fall into one of the following categories: Let me help you, junior Hi fellow student in a course I'm taking, I'm having trouble with this assignment question. Can you give me a hint as to how you solved it? Hi fellow student in a course I'm taking, This is how I solved the problem that you're stuck on. Hope it helps Hi fellow student, I noticed that you're working on a project, using a library that I'm interested in. Can we setup a time so I can learn about this library from you? This model of interaction works very well for me. However, there is one fellow student, who manages to make my life hell beyond all of this (his name is not important, let's just call him "Sam"). He seems to be always (pardon my crass description) high and completely unwilling to contribute to a constructive, academic conversation. He's a pretty smart guy, but just comes across as (I hate to say something like this about a fellow student, but) an imbecile. He also has ignorant opinions on important topics, some of which pertain to my specialization (AI, NLP, etc), and when I try to explain to him why he's wrong, all he does is insult me and put me in a foul mood. I have tried ignoring him (sitting somewhere else in the lab, headphones, etc), but he seems to like doing this because he approaches me and no amount of "leave me alone" seems to do the trick. Can anyone please suggest to me how to deal with this man in a civil way? Thank you

    Read the article

  • What is the best HTML specification to be used as of Q1 2011?

    - by Rob McKinnon
    While developing a web application, what is the best spec to use?

        HTML 4.01
        HTML5
        XHTML 1.0 Transitional
        XHTML 1.1

    I was taught to use XHTML 1.0 Strict at uni, and to avoid applets/iframes/tables (except in forms). I noticed that some deprecated tags are available again in HTML5. Is it safe to code in HTML5? If so, should I use target='' and the aforementioned tags? I have noticed that there are many alternatives to choose from, including canvas and object. I have no preference, although iframe snippets are being handed out by sources like Facebook/Google/etc. What would be the best avenue to take for a spec as of now (Feb 2011)?

    Read the article

  • How much time does it take for a new language like D to become popular? [closed]

    - by Adrián Pérez
    I was reading about new languages to learn, and I found very good comments about D - that it's the new C, or what C++ should have been. Knowing that many people say wonderful things about the language, I'm wondering how much time it usually takes for a language to become popular - that is, having libraries ported or written natively for the language, and being used in serious software development. I have read about the history of Java and Python to try to figure it out, but maybe they are of too different a level of complexity to say that D's adoption will take the same amount of time.

    Read the article
