Search Results

Search found 18450 results on 738 pages for 'mobile programming'.


  • Walmart's Mobile Self-Checkout

    - by David Dorf
    Reuters recently reported that Walmart was testing an iPhone-based self-checkout at a store near its headquarters. Consumers scan items as they're placed in the physical basket, then the virtual basket is transferred to an existing self-checkout station where payment is tendered. A very solid solution, but not exactly original.

    Before we go further, let's look at the possible cost savings for Walmart. According to the article:

        Pushing more shoppers to scan their own items and make payments without the help of a cashier could save Wal-Mart millions of dollars, Chief Financial Officer Charles Holley said on March 7. The company spends about $12 million in cashier wages every second at its Walmart U.S. stores.

    Um, yeah. Using back-of-the-napkin math, I calculated Walmart's cashiers are making $157k per hour. A more accurate statement would be that Walmart saves $12M per year for each second shaved off the average transaction time. So if this self-checkout approach saves 2 seconds per transaction on average, Walmart would save $24M per year on labor. Maybe. Sometimes that savings will be used for other tasks in the store, so it may not translate directly into fewer employees.

    When I saw this approach demonstrated in Sweden, there were a few differences, which may or may not be in Walmart's plans. First, the consumers were identified by their loyalty card. To offset the inevitable shrink, retailers need to save on labor but also increase basket size, typically via in-aisle promotions. As consumers scan items, retailers should target promotions at them, and that's easier to do if you know some shopping history. Last I checked, Walmart had no loyalty program.

    Second, at the self-checkout station consumers were randomly selected for an audit in which they must re-scan all the items, just as you do at a typical self-checkout. If you are found to be stealing, your ability to use the system can be revoked. That's a tough one in the US, especially when the system goes wrong, whether by honest mistake or by deceit. At least in my view, the Swedes are a bit more trustworthy than the people of Walmart.

    So while I think the idea of mobile self-checkout has merit, perhaps it's not right for Walmart.
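
    A quick sketch of the corrected arithmetic above (Python, purely illustrative; the $12M-per-second figure is the article's, and the 2-second saving is the hypothetical used in the post):

        annual_savings_per_second_saved = 12_000_000   # dollars per year, per second of average transaction time
        seconds_saved_per_transaction = 2               # hypothetical improvement from mobile self-checkout
        print(annual_savings_per_second_saved * seconds_saved_per_transaction)   # 24000000, i.e. ~$24M per year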

    Read the article

  • Can aptitude for learning Programming paradigms be influenced by culture or native language's grammar

    - by DVK
    It is well known that different people have different aptitudes for various programming paradigms (e.g. some people have trouble learning non-procedural, especially functional, languages; some people have trouble understanding pointers - see Joel Spolsky's blog for musings on that; some people have trouble grasping recursion). I was recently reading about a study that looked at how the grammar of someone's native language affected their speed of learning math. I can't find that article now, but a quick googling found this reference. That led me to wonder whether someone's native culture or first language might affect their aptitude for various programming paradigms. I'm more curious about positive influences - e.g. some trait that makes it easier/faster for someone to learn a particular paradigm, for example a native-language grammar that is very recursion-oriented. To be clear, I'm looking at how culture/language grammar may affect the difference in the same person's aptitude for various paradigms, as opposed to how it affects overall programming aptitude between different people. Important: the only answers I'm interested in are either references to scientific studies, or personal observations from someone intimately familiar with a particular culture/language, including from their own experience. E.g. I'm not interested in your opinion of how Chinese being your first language affects anything unless you speak Chinese or have worked extensively with a very large set of Chinese-native programmers. I'm OK with guesstimates not based on scientific studies, but please be sure to supply your reasoning about plausible causes of your observation. I'm not interested in culture-bashing (any such comments will be deleted or flagged for deletion). I'm also not particularly interested in culture-building - we all know Linus is from Finland, Tetris was written in Russia, and Larry Wall is an American. Any culture/nation can produce a brilliant mind in any discipline. I'm interested in averages.

    Read the article

  • Text inside <p> shrinks on mobile devices while div does not [migrated]

    - by guisasso
    I asked this question on Stack Overflow but didn't get any answers, so I'm trying here. Does anybody know what's happening here? I tested on Opera, Dolphin and the stock Android browser (although it now seems to work on Opera). The div doesn't change size, but the text is somehow shrunk to fit only part of the div. Is there any way to prevent this? Just to be clear, I'm trying to achieve the same look in the mobile browser as in the PC version. As the problem seems to be with the browsers, how can I force the text to take the full width of the div? I tried setting the p tag to 100% width with no success. The div has to have that width and be aligned to the left of the page. On a PC it renders as it should. I shrunk the code down as much as I could:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml" lang="en-us">
        <head>
            <meta content="text/html; charset=utf-8" http-equiv="Content-Type" />
            <meta content="" name="keywords" />
            <meta content="" name="description" />
            <title></title>
        </head>
        <body>
            <div style="width:1000px; margin-left:auto; margin-right:auto;">
                <div style="float:left; width:758px; background-color:aqua;">
                    <p>
                        Random text Random text Random text Random text Random text
                        Random text Random text Random text Random text Random text
                        <!-- (same filler text repeated, trimmed here for brevity) --> .<br />
                        <br />
                        Random text Random text Random text Random text Random text
                        Random text Random text Random text Random text Random text Random text .<br />
                        <br />
                        Random text Random text Random text Random text <a href="http://www.a.com/a.html"> Random text </a> Random text Random text .
                    </p>
                </div>
            </div>
        </body>
        </html>

    Thanks.

    Read the article

  • Programming languages, positional languages and natural languages

    - by Vitalij Zadneprovskij
    Some programming languages are modeled on machine code, like assembly languages. Other languages are modeled on a natural language, the English language. Others are not modeled on either machine code or natural language. Languages such as PROLOG, for example, don't follow either model. I came across this Perl module, Lingua::Romana::Perligata, which allows you to write programs using a syntax very similar to Latin. Are there programming languages that have less positional syntax? Are there other languages or modules that allow you to write in syntaxes inspired by other natural languages, like French, Hebrew or Farsi? There is a very long list on Wikipedia, but most of those projects are dead. There is a related question on StackOverflow. The answer that was accepted is "Use Google".

    Read the article

  • Visual Programming paradigms

    - by Rego
    As "visual" OSes such as Android, iOS and the promised Windows 8 become more popular, it does not seem to me that we programmers have gained new ways to code on these technologies, possibly due to a lack of new visual programming language paradigms. I've seen several discussions about incompatibilities between current development environments and the new OS approaches of Windows 8, Android and other tablet OSes. I mean, today, coding on a new tablet practically requires an external keyboard, since it seems very difficult to program using the touch screen alone; coding assistance simply isn't conceived around "writing" thousands of lines of code by touch. So, how advanced would these "new" visual programming language paradigms have to be? Which characteristics would these new paradigms require?

    Read the article

  • To maximize chances of functional programming employment

    - by Rob Agar
    Given that the future of programming is functional, at some point in the nearish future I want to be paid to code in a functional language, preferably Haskell. Assuming I have a firm grasp of the language, plus all the basic programmer attributes (good communication skills/sense of humour/hygiene etc), what should I concentrate on learning to maximize my chances? Are there any particularly sought after libraries I should know? Alternatively, would another language be a better bet, say F#? (I'm not too fussed about the kind of programming work, so long as it's reasonably interesting and reasonably well paid, and with nice people)

    Read the article

  • Design patterns and multiple programming languages

    - by Eduard Florinescu
    I am referring here to the design patterns found in the GoF book. The way I see it, design patterns interact with knowing multiple languages in peculiar ways: in Java you really need a Singleton, but in Python you can do without it and just write a module. I saw a wiki somewhere trying to catalogue all the GoF patterns for JavaScript, and the entries were empty, I guess because it is a daunting task. If someone who uses design patterns and programs in multiple OOP languages can give me a hint on how to approach design patterns in a way that helps across all the languages I use (Java, JavaScript, Python, Ruby): Can I write good applications without knowing the GoF design patterns exactly, or are some of them crucial, and if so, which ones? Are there alternatives to GoF for specific languages? And should a programmer or a team build its own set of design patterns?
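
    A minimal sketch of the Java-versus-Python point above (illustrative only; the config module name is made up): in Python the module system already gives you one shared instance, while the classic GoF Singleton has to be written by hand.

        # A module is itself a singleton: every "import config" yields the same
        # module object, so module-level state is shared application-wide.
        #
        #   # config.py (hypothetical module)
        #   settings = {"retries": 3}
        #
        #   # anywhere else
        #   import config
        #   config.settings["retries"] = 5   # visible to every other importer

        # The hand-rolled GoF-style equivalent, for comparison:
        class Config:
            _instance = None

            def __new__(cls):
                if cls._instance is None:
                    cls._instance = super().__new__(cls)
                    cls._instance.settings = {"retries": 3}
                return cls._instance

        assert Config() is Config()   # both calls return the same object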

    Read the article

  • Pair programming business logic with a non-IT person

    - by user1598390
    Have you had any experience in which a non-IT person works with a programmer during the coding process? It's like pair programming, but one person is a non-IT person who knows a lot about the business, maybe a process engineer with a math background who knows how things are calculated and can understand non-idiomatic, procedural code. I've found that some procedural, domain-specific languages like PL/SQL are quite understandable by non-IT engineers. These people end up being co-authors of the code and guarantee the correctness of formulas, factors, etc. I've found this kind of pair programming quite productive: these engineer-users feel they are also "owners" and "authors" of the code, and it helps minimize misunderstandings in the communication process. They even help design the test cases. Is this practice common? Does it have a name? Have you had similar experiences?

    Read the article

  • Studies of Pair Programming on Translation Projects

    - by gmletzkojr
    I am looking for information (i.e., studies, metrics, etc.) on pair programming when translating a project from an "older" language to a "newer" language. In this particular case, translating means line-for-line translation wherever possible, modifying the design only when absolutely necessary, not merely when a modification would improve performance. I have done pair programming in new development, and I am well aware of the pros and cons of pairing in that environment. However, I haven't been able to find any information on this particular case. Any help is appreciated.

    Read the article

  • Is functional programming a superset of object oriented?

    - by Jimmy Hoffa
    The more functional programming I do, the more I feel it adds an extra layer of abstraction, like an onion's layer: all-encompassing of the layers beneath it. I don't know if this is true, so going by the OOP principles I've worked with for years, can anyone explain how functional programming does or doesn't capture each of them: encapsulation, abstraction, inheritance, polymorphism? I think we can all say yes, it has encapsulation via tuples, or do tuples technically count as part of "functional programming", or are they just a utility of the language? I know Haskell can meet the "interfaces" requirement, but again I'm not certain whether its mechanism is inherent to functional programming. I'm guessing that since functors have a mathematical basis, you could say those are a definite built-in expectation of functional programming, perhaps? Please detail how you think functional programming does or does not fulfill the four principles of OOP.
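
    As one concrete illustration of the encapsulation question (a sketch in Python purely for concreteness, not taken from the post): a closure can hide state the way a private field does, without any class at all.

        def make_account(balance):
            # 'balance' lives only in this enclosing scope; callers can reach it
            # solely through the two functions returned below.
            def deposit(amount):
                nonlocal balance
                balance += amount
                return balance
            def current():
                return balance
            return deposit, current

        deposit, current = make_account(100.0)
        deposit(25.0)
        print(current())   # 125.0 -- the balance itself is not directly accessible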

    Read the article

  • Should programming languages be strict or loose?

    - by Ralph
    In Python and JavaScript, semi-colons are optional. In PHP, quotes around array keys are optional ($_GET[key] vs $_GET['key']), although if you omit them it will first look for a constant by that name. It also allows two different styles for blocks (colon-delimited or brace-delimited). I'm creating a programming language now, and I'm trying to decide how strict I should make it. There are a lot of cases where extra characters aren't really necessary and the code can be unambiguously interpreted thanks to operator precedence, but I'm wondering whether I should still enforce them to encourage good programming habits. What do you think?

    Read the article

  • Programming Interview Question [duplicate]

    - by user136494
    This question already has an answer here: How to prepare yourself for programming interview questions? [duplicate] I have an upcoming interview in a couple of days and had a question for you guys. I've heard that programming interviews have whiteboard problems, where you solve a simple problem on a whiteboard. My question to you is: how many whiteboard problems do you have to solve? Is there more than one? What are examples of whiteboard problems? Is FizzBuzz one of them? Where can I find practice problems? Does anyone know of any good websites?
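
    For reference, the FizzBuzz exercise mentioned above is roughly the scale of a typical whiteboard problem; a minimal Python version:

        # Print 1..100, replacing multiples of 3 with "Fizz", multiples of 5
        # with "Buzz", and multiples of both with "FizzBuzz".
        for i in range(1, 101):
            if i % 15 == 0:
                print("FizzBuzz")
            elif i % 3 == 0:
                print("Fizz")
            elif i % 5 == 0:
                print("Buzz")
            else:
                print(i)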

    Read the article

  • How do you learn a new programming language?

    - by Naveen
    I am a C++ developer with good experience in it. When I try to learn a new language (I have tried Java, C#, Python and Perl so far), I usually pick up a book and try to read it. The problem with this is that these books typically start with very basic programming concepts such as loops and operators, and it soon gets very boring. Also, I feel I would get only theoretical knowledge without any practical knowledge of writing code. So my question is: how do you tackle these situations? Do you just skip a chapter if it explains something basic? Also, do you have a standard set of programs that you try to write in every new programming language you learn?

    Read the article

  • Learning Asynchronous programming

    - by xenoterracide
    Asynchronous, non-blocking, event-driven programming seems to be all the rage. I have a basic conceptual understanding of what it all means. However, what I'm not sure about is when and where my code can benefit from being asynchronous, or how to make blocking IO non-blocking. I'm sure I could simply use a library to do this, but I'm more interested in the in-depth concepts and the various ways to implement it myself. Are there any comprehensive/definitive books or other resources on this subject (like GoF for design patterns, K&R for C, or tldp for things like bash)? (Note: I'm not sure if this is effectively an identical question to my question on learning event-driven programming.)
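
    A minimal sketch of the non-blocking idea (using Python's asyncio as the event loop; the fetch() coroutine is a made-up stand-in for real network I/O):

        import asyncio

        async def fetch(name, delay):
            # Simulated I/O: while this coroutine "waits", the event loop is free
            # to run the other coroutines instead of blocking the whole thread.
            await asyncio.sleep(delay)
            return f"{name} finished after {delay}s"

        async def main():
            # The three "requests" overlap, so total wall time is ~2s, not ~4.5s.
            results = await asyncio.gather(
                fetch("a", 2.0), fetch("b", 1.5), fetch("c", 1.0)
            )
            for r in results:
                print(r)

        asyncio.run(main())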

    Read the article

  • Mythbusters- Programming/hacking myths [closed]

    - by stephen776
    Hey guys. I am a big fan of the Discovery show Mythbusters, as I'm sure some of you are as well. I have always wanted them to do an episode on programming/hacking. They get a lot of their show ideas from fans, so I thought we could compile a list of possible myths to bust. Let's hear your ideas! (Sorry if this is not appropriate; close if necessary.) Edit: I am not necessarily looking for subjective "this is what I want to see" answers. I am talking more along the lines of interesting computer/programming/hacking stories that would appeal to a general audience. I do not expect them to do a show on "What's faster, i++ or i + 1".

    Read the article

  • Programming for the iPhone

    - by Bobby Alexander
    What's the best way to get started with iPhone development if you are an experienced C++ or C# programmer? Most books either assume you know nothing, or assume you already know something. What are the steps to achieve this? For example: first learn Objective-C (let's say), next learn Cocoa... I am interested in books/resources. I read Getting Started with iPhone Development from O'Reilly (the Missing Manuals book), but that just provided an overview of the programming and concentrated more on getting your app into the App Store. I need resources that will help me start coding. Other questions: How much Objective-C do you need to know? How do I go about learning the Cocoa framework? Can I start directly with Cocoa Touch, or do I need to know the Mac Cocoa framework first? Input from someone who was in the same situation (knows C++/C# but has no clue about Mac programming/Objective-C/Cocoa) would help greatly.

    Read the article

  • Pair programming and unit testing

    - by TheSilverBullet
    My team follows the Scrum development cycle. We have received feedback that our unit testing coverage is not very good. A team member is suggesting the addition of an external testing team to assist the core team, but I feel this will backfire in a bad way. I am thinking of suggesting a pair programming approach instead. I have a feeling this should help the code become more "test-worthy", and soon the team could move to test-driven development! What are the potential problems that might arise out of pair programming?

    Read the article

  • Studying parallel programming

    - by mort
    I'm currently finishing my Bachelor's degree in Computer Science and thinking a lot about which specialisation to choose in my Master's degree. One subject I'm particularly interested in is parallel programming. However, it does not seem to be a standard topic in Computer Science degrees, although it is used more and more: new processors nowadays usually have dual or quad cores. So I was wondering: does anybody know a good study program in this field? I was mostly looking at universities in Germany, but they tend to combine the application side with some type of engineering or natural science. Thus, programs are more of the "Computational Engineering" or "Computational Science" type, but I'm more interested in the Computer Science part of it, i.e. parallel programming, languages and compilers, algorithms, and hardware.

    Read the article

  • Pronunciation in programming?

    - by Xepoch
    How do you correctly or erroneously pronounce programming terms? Any that you find need strict correction, or that carry some history from early CS culture?

    Programming
        char = "tchar", not "care"?
        ! = bang, not exclamation?
        # = pound, not hash? (exception: #! = shebang)
        * = splat, not star?
        regex = "rej ex", not "regg ex"?
        sql = "s q l", not "sequel" (already answered, just an example)

    Unixen
        | = pipe, not vertical bar?
        bin = bin as in pin, not as in binary?
        lib = lib as in library, not as in liberate?
        etc = "ett see", not "e t c" (as in /etc, and not "&c")

    Annoyance
        / = slash, not backslash
        LaTeX = "laytek", not "lay teks"

    Read the article

  • Pair programming remotely with Visual Studio?

    - by shamp00
    What tools exist to facilitate pair programming with Visual Studio when the programmers are not in the same physical location? At the moment we are thinking of voice (Skype?) plus remote desktop (VNC? TeamViewer?), but it would be good to hear other suggestions and experiences. Also, is there anything more integrated with Visual Studio? A bit more background: we are two experienced developers who have collaborated well for a long time on a large, mature project (ASP.NET, Windows Forms and SQL Server). However, we do not usually work on the same part of the code base at the same time. We intend to spend some weeks doing substantial refactoring, and it would be ideal if we could do this work with a pair-programming approach.

    Read the article

  • Harmful temptations in programming

    - by gaearon
    Just curious: what kinds of temptations in programming turned out to be really harmful in your projects? Like when you really feel the urge to do something you believe will benefit the project, or you just trick yourself into believing it will, and after a week you realize you haven't solved any real problems but have instead created new ones or, in the best case, pleased your inner beast with no visible impact. Personally, I find it very hard not to refactor bad code. I work with a lot of bad legacy code, and it takes some deep breaths not to touch it when I have no tests to prove that my refactoring doesn't break anything. Another demon for me is user interface work: I can literally spend hours changing UI layout just because I enjoy doing it. Sometimes I tell myself I'm working on usability, but the truth is I just love moving buttons around. What are your programming demons, and how do you avoid them?

    Read the article

  • Assembly as a First Programming Language?

    - by Anto
    How good an idea do you think it would be to teach people Assembly (some variant) as a first programming language? It would take a lot more effort than learning, for instance, Java or Python, but one would have a good understanding of the machine more or less from "programming day one" (compared to many higher-level languages, at least). What do you think? Is it a realistic idea, at least for those who are ready to make the extra effort? Advantages and disadvantages? Note: I'm no teacher, just curious.

    Read the article

  • Programming curricula

    - by davidk01
    There are a lot of schools that teach Java and C++, but whenever I see the syllabus for one of these classes it's almost always some cut-and-dried OO material with possibly a boring end-of-class project. With all the little gadgets out there, and emulators for those gadgets, why aren't more schools repurposing those classes so that students work their way up to building Android or MeeGo applications? That way students get to experience first-hand what it takes to engineer and build a piece of software, instead of doing finger exercises with syntax. Practically every self-taught programmer I know started programming because they wanted to make their gadgets do things for them. They didn't learn a programming language with an abstract notion of using it on some far-distant project, so I don't understand why schools don't emulate this style of teaching.

    Read the article

  • Reflective practice in programming using keystroke playback

    - by Graham
    I'm thinking of applying Reflective Practice to improving my programming skills. To that end, I want to be able to watch myself writing code. In general, what is a good method for applying Reflective Practice to the craft of programming? In particular, if it's a good idea, is there an editor that records keystrokes then plays them back at a later time - possibly running the keys together without delays, or replaying at a 2x/4x/8x accelerated rate? Screencasting with RecordMyDesktop is an option, but has downsides of waiting for encoding and ending up with a big video file instead of a list of keystrokes.
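
    A minimal sketch of the keystroke-capture side of this (assuming the third-party pynput library, which the question does not mention; an editor-integrated recorder would be preferable if one exists):

        import json
        import time
        from pynput import keyboard   # third-party: pip install pynput

        events = []

        def on_press(key):
            # Record each key with a timestamp so a later replay can preserve,
            # compress, or accelerate the original timing.
            events.append({"t": time.time(), "key": str(key)})
            if key == keyboard.Key.esc:   # stop recording on Escape
                return False

        with keyboard.Listener(on_press=on_press) as listener:
            listener.join()

        # Persist the session as a list of keystrokes rather than a video file.
        with open("session.json", "w") as f:
            json.dump(events, f)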

    Read the article

  • Does the D programming language have a future?

    - by user32756
    I have stumbled over D several times and really asked myself why it isn't more popular. D is a systems programming language. Its focus is on combining the power and high performance of C and C++ with the programmer productivity of modern languages like Ruby and Python. Special attention is given to the needs of quality assurance, documentation, management, portability and reliability. Do you think it has a future? I would really like to try it, but somehow the thought that I'd be the only person on earth programming in D discourages me from doing so.

    Read the article
