Search Results

Search found 8923 results on 357 pages for 'core dumped'.

Page 128/357 | < Previous Page | 124 125 126 127 128 129 130 131 132 133 134 135  | Next Page >

  • Windows Embedded in Padua

    - by Valter Minute
     On Tuesday, June 8, at the University of Padua I will give a talk on Windows CE as part of the event "Workshop sulle nuove architetture per sistemi embedded" (workshop on new architectures for embedded systems). Within the same event, other sessions will cover the ARM architecture, in particular the new Cortex A8 cores, advanced applications, and Embedded Linux. The (packed and interesting) agenda and the registration link can be found here: http://www.arrowitaly.it/training-events/training-events/dettaglio-training/article/workshop-sulle-nuove-architetture-per-sistemi-embedded/?tx_ttnews[backPid]=1336&cHash=ae34e269e1

    Read the article

  • Thou shalt not put code on a pedestal - Code is a tool, no more, no less

    - by Ralf Westphal
     “Write great code and everything else becomes easier” is what Paul Pagel believes in. That's his version of an adage by Brian Marick that he cites: “treat code as an end, not just a means.” And he concludes: “My post-Agile world is software craftsmanship.” I wonder if that's really the way to go. Will “simply” writing great code lead the software industry into the light? He's alluding to the philosopher Kant, who proposed that human beings should never be treated as a means, but always as an end. But should we transfer this ethical statement into the world of software? I doubt it.

     Reason #1: Human beings are categorically different from code. They are autonomous entities who need to find a way of living happily together. To Kant it seemed this goal could only be reached if nobody (ab)used a human being for his/her purposes, because using a human being, i.e. treating it as a means, would contradict the fundamental autonomy and freedom of human beings. People should uphold a symmetric view of their relationships: since nobody wants to be (ab)used, nobody should (ab)use anybody else. If you want to be treated decently, with respect, in accordance with your own free will - which means as an end - then do the same to other people. Code is dead; it's a product, a tool for people to reach their goals. No company spends any money on code other than to save money or earn money in the long run. Code is not a puppy. Enterprises do not commission software development just to feel good in its company. Code is not a buddy. Code is a slave, if you will. A mechanical slave, a non-tangible robot. Code is a tool, is a tool. And if we start to treat it differently, if we elevate its status unduly... I guess that will contort our relationship in a counterproductive way. Please get me right: just because something is “just a tool”, “just a product”, does not mean we should not be careful while designing, building and using it. Right to the contrary. We should be very careful when writing code - but not for the code's sake! We should be careful because we respect our customers, who are fellow human beings who should be treated as an end. If we are careless, neglectful or ignorant when producing code on their behalf, then we're using them. Being sloppy means you're caring more for yourself than for your customer. You're then treating the customer as a means to fulfill some of your own needs. That's plain unethical behavior.

     Reason #2: The focus should always be on your purpose, not on any tool. But if code is treated as an end, then the focus is on the code. That might sound right, because where else should your focus be as a software developer? But, well, I'd say your focus should be on delivering value to your customer. Because in the end your customer does not care whether you write a single line of code. She just wants her problem to be solved. Solving problems is the purpose of any contractor. Code must be treated just as a means, a tool we know how to handle very well. But if we're really trying to be craftsmen, then we should be conscious of exactly that and act ethically. That means we must never be so focused on our tool as to be unable to suggest better solutions to the problems of our customers than code.

     I'm all with Paul when he urges us to “write great code”. Sure, if you need to write code, then by all means do so. Write the best code you can think of - and then try to improve it.

     Paul has all the best intentions when he signs on to Brian's “treat code as an end” - but as we all know, “the road to hell is paved with best intentions” ;-) Yes, I can imagine a “hell of code focus”. In fact, I don't need to imagine it; I see it quite often. Because code hell is wherever two developers stand together and are so immersed in talking about all sorts of coding tricks, design patterns, code smells, technologies, platforms and tools that they lose sight of the big picture. Talking about TDD or SOLID or refactoring is a sign of consciousness - relative to the “cowboy coder's” view of the world. But from yet another point of view, TDD, SOLID and refactoring are just cures for ailments within a system. And I fear that if “writing great code” is the only focus, or the main focus, of software development, then we as an industry lose the ability to see that. Focus draws a line around something; it defines a horizon for perception and thinking. So if we focus on code, our horizon ends where “the land of code” ends. I don't think that should be our professional attitude.

     So what about Software Craftsmanship as the next big thing after Agility? I think Software Craftsmanship has an important message for all software developers and beyond. But to make it the successor of the Agility movement seems to miss a point. Agility never claimed to solve all software development problems, I'd say. So to blame it for having missed out on certain aspects is wrong. If I had to summarize Agility in one word I'd say “value”. Agility put value for the customer back into software development. Focus on delivering value early and often - that's Agility's mantra. All else follows from that. And I ask you: is that obsolete? Is delivering value not hip anymore? No, surely not. That's our very purpose as software developers. So how can Agility become obsolete and need to be replaced? We need to do away with this “either/or” thinking: it's either Agility or Lean or Software Craftsmanship or whatnot. Instead we should start integrating concepts and movements. Think “both/and”. Think Agility plus Software Craftsmanship plus Lean plus whatnot. We don't need to tear anything down from a pedestal and replace it with a new idol. Instead we should do away with pedestals and arrange whatever is helpful in a circle. Then we can turn to concepts and movements for whatever they are best at. After 10 years of Agility we should be able to identify what it was good at - and keep that. Keep Agility around and add whatever Agility was lacking or never concerned with. Add whatever is at the core of Software Craftsmanship. Add whatever is at the core of Lean, etc. But don't call out the age of Post-Agility, because it had better never end. Because once we start to lose Agility's core, we lose focus on the customer.

    Read the article

  • Link to an executable file in Ubuntu 11.10 with Cairo Dock

    - by ragv
     I am using Ubuntu 11.10 with Cairo Dock on an Acer Aspire with a 2nd-generation Core i3 processor and Intel graphics. I downloaded a program, gingkocad, as a tar file, opened the package with Archive Manager and created a folder for the package contents. To run the program I need to click the executable and run it in a terminal. Is there a way to create a shortcut so the executable can be launched from an icon in Cairo Dock? Please help. ragv
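
     One common way to get a dock launcher for a manually extracted program is a desktop entry file; a minimal sketch, assuming the executable was unpacked to /home/youruser/gingkocad/gingkocad (the path and icon location are hypothetical):

       [Desktop Entry]
       Type=Application
       Name=gingkocad
       # Hypothetical install path; point Exec at the actual extracted binary
       Exec=/home/youruser/gingkocad/gingkocad
       # Set Terminal=true if the program really needs a console window
       Terminal=false
       Icon=/home/youruser/gingkocad/icon.png

     Saved as ~/.local/share/applications/gingkocad.desktop and marked executable, it should appear in the applications menu, from which a launcher can usually be dragged onto Cairo Dock.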

    Read the article

  • How to allocate more RAM to Minecraft?

    - by harikrishnan
     Minecraft lags badly on my Linux install, but on Windows I don't have trouble playing it. My system specs:
     - Processor: AMD Athlon X2 260, 3.2 GHz dual core
     - Graphics card: ATI Radeon 3000 series
     - RAM: 4 GB (3.25 GB usable)
     I have Ubuntu 12.04 and I want to allocate more RAM to Minecraft. I have 4 GB of physical RAM (only 3.25 GB usable) and OpenJDK 7. Please also tell me about other ways to run Minecraft smoothly.
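
     Minecraft's memory ceiling comes from the JVM heap flags, so one way to give it more RAM (a sketch, assuming the launcher jar sits at ~/minecraft.jar; the path and the 2 GB figure are only examples) is to start it with an explicit heap size:

       # -Xms sets the initial heap, -Xmx the maximum heap the JVM may grow to
       java -Xms512M -Xmx2G -jar ~/minecraft.jar

     Note that a 32-bit JVM typically cannot use an -Xmx value anywhere near 4 GB, which matters on a machine that only sees 3.25 GB of RAM.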

    Read the article

  • Will Ubuntu work on a Dell Inspiron 15R?

    - by Daniel Davis
     I want to know how to install Ubuntu on a Dell Inspiron 15R with an Intel Core i3, 3GB RAM, Intel HD graphics and a 320GB HDD. I've heard some people have had issues with the wireless card. Also, I tried a few days ago to install the 64-bit version of Ubuntu 10.10 and the installation hung on the "Who are you?" screen... Can anyone help me? I really want to get into Linux, but without having any problems with the laptop's drivers.

    Read the article

  • Chrome Apps Office Hours: Building Apps with Web Intents

    Chrome Apps Office Hours: Building Apps with Web Intents. Ask and vote for questions at: goo.gl Web Intents are the core mechanism for building interconnected apps on the Chrome platform. Join Paul Kinlan and Paul Lewis next week as they show you how to build client apps that send data to other web apps, and a service app that will receive input from any intent invocation. From: GoogleDevelopers

    Read the article

  • Git workflow for small teams

    - by janos
     I'm working on a git workflow to implement in a small team. The core ideas in the workflow:
     - there is a shared project master that all team members can write to
     - all development is done exclusively on feature branches
     - feature branches are code reviewed by a team member other than the branch author
     - the feature branch is eventually merged into the shared master and the cycle starts again
     The article explains the steps in this cycle in detail: https://github.com/janosgyerik/git-workflows-book/blob/small-team-workflow/chapter05.md Does this make sense or am I missing something?
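
     One pass through the described cycle might look roughly like this on the command line (a sketch; the remote name origin and the branch name my-feature are placeholders, and the review itself happens outside git):

       # start a feature branch from the up-to-date shared master
       git checkout master
       git pull origin master
       git checkout -b my-feature

       # ...commit work on the branch, then publish it for review
       git push -u origin my-feature

       # after the review passes, merge into the shared master and clean up
       git checkout master
       git merge my-feature
       git push origin master
       git branch -d my-feature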

    Read the article

  • GDL Presents: Creative Sandbox | Geo API

    GDL Presents: Creative Sandbox | Geo API Tune in to hear about two cool, innovative campaigns that use the Geo API, Nature Valley Trail View and Band of Bridges, from the core creative teams at McCann Erickson NY, Goodby Silverstein & Partners and Famous Interactive in conversation with a Google Maps product expert. They'll talk about how they pushed the possibilities of the Geo API - and will inspire you to do the same. From: GoogleDevelopers

    Read the article

  • Entity Framework support for table valued functions and thus full text

    - by simonsabin
     One of my most popular posts, with over 10,000 hits, is how to enable full text searching when using LINQ to SQL http://sqlblogcasts.com/blogs/simons/archive/2008/12/18/LINQ-to-SQL---Enabling-Fulltext-searching.aspx ; core to this is the use of a table valued function. I'm therefore interested to see that Entity Framework will support table valued functions in the next release; for more details have a read of the efdesign blog http://blogs.msdn.com/b/efdesign/archive/2011/01/21/table-valued-function-support...(read more)
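
     For context, the kind of table valued function involved is a thin wrapper over full text search that an ORM can then map and compose over; a minimal T-SQL sketch, assuming a hypothetical Articles table with a full text index on its Body column:

       -- Inline table valued function wrapping a full text query;
       -- the mapped function's results can be joined and filtered like a table.
       CREATE FUNCTION dbo.SearchArticles (@searchTerm nvarchar(200))
       RETURNS TABLE
       AS
       RETURN
       (
           SELECT a.ArticleId, a.Title, ft.[RANK] AS Relevance
           FROM dbo.Articles AS a
           JOIN CONTAINSTABLE(dbo.Articles, Body, @searchTerm) AS ft
               ON a.ArticleId = ft.[KEY]
       );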

    Read the article

  • The Khronos Group publishes the OpenGL 3.3 and 4.0 specifications

    The Khronos Group publishes the OpenGL 3.3 and 4.0 specifications. Two years after the release of OpenGL 3.x, the Khronos Group gives us, on the same day, the specifications of two new OpenGL versions: version 3.3 and version 4.0. In these new versions the Core and Compatibility split remains and, as a novelty for GLSL, the shading language versions now carry the same number as the API version they were released with. We are also promised an optimized version 4.0 that is less dependent on the CPU, notably regarding tessellation... Not being familiar with OpenGL, I won't venture to say more; for the more curious, here is the link:

    Read the article

  • Music Notation Editor - Refactoring view creation logic elsewhere

    - by Cyril Silverman
     Let me preface this by saying that knowing some elementary music theory and music notation may be helpful in grasping the problem at hand. I'm currently building a Music Notation and Tablature Editor (in JavaScript), and I've come to a point where the core parts of the program are more or less there. All the functionality I plan to add at this point will really build on the foundation that I've created. As a result, I want to refactor to really solidify my code. I'm using an API called VexFlow to render notation. Basically I pass the parts of the editor's state to VexFlow to build the graphical representation of the score. Here is a rough and stripped-down UML diagram showing the outline of my program: In essence, a Part has many Measures, which have many Notes, which have many NoteItems (yes, this is semantically weird, as a chord is represented as a Note with multiple NoteItems, individual pitches or fret positions). All of the relationships are bi-directional. There are a few problems with my design, because my Measure class contains the majority of the entire application's view logic. The class holds the data about all the VexFlow objects (the graphical representation of the score). It contains the graphical Staff object and the graphical notes. (Shouldn't these be placed somewhere else in the program?) While VexFlowFactory deals with the actual creation (and some processing) of most of the VexFlow objects, Measure still "directs" the creation of all the objects and the order they are supposed to be created in, for both the VexFlowStaff and the VexFlowNotes. I'm not looking for a specific answer, as you'd need a much deeper understanding of my code - just a general direction to go in. Here's a thought I had: create MeasureView/NoteView/PartView classes that contain the basic VexFlow objects for each class, in addition to any extraneous logic for their creation? But where would these views be contained? Do I create a ScoreView that is a parallel graphical representation of everything, so that ScoreView.render() would cascade down into each PartView, calling render for each one, and then cascade down into each MeasureView, etc.? Again, I just have no idea what direction to go in. The more I think about it, the more ways to go seem to pop into my head. I tried to be as concise and simplistic as possible while still getting my problem across. Please feel free to ask me any questions if anything is unclear. It's quite a struggle trying to dumb down a complicated problem to its core parts.
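
     To make the floated idea a bit more concrete, here is a hypothetical JavaScript sketch of the cascading render; the class names (ScoreView, PartView, MeasureView) and the model shape are illustrative only and none of it is actual VexFlow API:

       // Each view wraps one model object and owns only its graphical (VexFlow) objects,
       // so the model classes no longer need to hold any view state.
       class MeasureView {
         constructor(measure) {
           this.measure = measure;   // model: notes, time signature, ...
           this.vexStaff = null;     // graphical objects live here, not on the model
           this.vexNotes = [];
         }
         render(context, x, y) {
           // build and position VexFlow objects from this.measure, then draw them
         }
       }

       class PartView {
         constructor(part) {
           this.measureViews = part.measures.map(m => new MeasureView(m));
         }
         render(context) {
           this.measureViews.forEach(mv => mv.render(context /*, layout coordinates */));
         }
       }

       class ScoreView {
         constructor(score) {
           this.partViews = score.parts.map(p => new PartView(p));
         }
         render(context) {
           // the cascade: ScoreView -> PartView -> MeasureView
           this.partViews.forEach(pv => pv.render(context));
         }
       }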

    Read the article

  • Intel Sandy Bridge: manufacturers recall defective machines and Gigabyte offers an optimization utility

    Intel Sandy Bridge: manufacturers recall the defective machines, and Gigabyte offers an optimization utility for P67 and H67 motherboards. Update of 04/02/2011 by Idelways. The fallout from the flaw discovered in Intel's Sandy Bridge chips (see above) is starting to be felt, and manufacturers' announcements are multiplying to limit the damage and reassure their customers. South Korea's Samsung, for example, is offering to take back all computers sold with chipsets compatible with the affected Sandy Bridge parts, i.e. six models sold in Korea...

    Read the article

  • BizTalk 2010 Certification Exam

    - by Paul Petrov
     I took a shot at the new (to me) certification exam for BizTalk 2010. I was able to pass it without any preparation, just based on experience. That does not mean this exam is a very simple one. Compared to the previous one (2006 R2) it covers some new areas (like WCF) and has some demanding questions and situations to think about. But the most challenging factor is the broad feature coverage. Overall, my impression is that if BizTalk continues to grow in scope, it would be better to create separate exams for core functionality and extended features (like EDI, RFID, LOB adapters), because it's really hard to cover the vast array of BizTalk capabilities. As far as required knowledge and question allocation go, I think Microsoft's description is on target. There were definitely more questions on deployment, configuration and administration aspects compared to the previous exam. WCF and WCF-based adapters now play a big role, and this topic was covered well too. Extended functionality is claimed at 13% of the exam; I felt there were plenty of RFID questions but not many on EDI, which is why I thought it would be useful to split the exam in two to cover them all equally. BRE is still there, and that's good, because it's usually not a very well-known or well-loved feature of the package. At the end, for those who plan to get certified, my advice would be to know all of these areas of BizTalk for guaranteed passing: messaging and orchestrations, core adapters, routing, patterns; development of all artifacts and orchestrations; debugging and exception handling; packaging, deployment, tracking and administration; WCF bindings and adapters; BAM, BRE, RFID, EDI, etc. You may get by not knowing one smaller, non-essential part (like I did with RFID, for example). In that case you had better know all the other areas very well to cover for the weak spot. If there is more than one white spot in your knowledge, it's a good idea to study and prepare: MSDN, blogs, virtual labs and a good VM to play with can help when experience is not enough. So best wishes and good skill to you in passing this certification!

    Read the article

  • Low resolution desktop with Intel i7 3770 and Intel board DH67BL

    - by rtorres
     I installed Ubuntu 12.04.1 on a desktop with the following specs:
     - CPU: Intel(R) Core(TM) i7-3770 CPU @ 3.40GHz
     - Motherboard: Intel DH67B
     However, the monitor is not identified (it shows up as "Unknown"), so the maximum resolution available is 1024x768. This occurs with a Samsung SyncMaster 2033 (resolution 1900x600), and is the same with a ViewSonic VX2453mh-LED (resolution 1920x1080). I'd be very grateful if anyone could give me a suggestion as to how to fix the resolution.
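
     When a monitor is not detected, one common workaround (a sketch, assuming a 1920x1080 panel on an output called HDMI1; the real output name comes from running xrandr with no arguments, and the modeline numbers should be taken from cvt's own output) is to add the mode by hand:

       # 1. generate a modeline for the desired resolution and refresh rate
       cvt 1920 1080 60
       # 2. register the mode, using the Modeline values that cvt printed
       xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
       # 3. attach the mode to the output and switch to it
       xrandr --addmode HDMI1 "1920x1080_60.00"
       xrandr --output HDMI1 --mode "1920x1080_60.00"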

    Read the article

  • ADF Essentials - free version of ADF available for any app server!

    - by Lukasz Romaszewski
     Hello, that's great news: finally anyone can create and deploy an ADF application on any application server, including Oracle's open source GlassFish server, without any license! You can use core ADF functionality, namely:
     - Oracle ADF Faces Rich Client Components
     - Oracle ADF Controller
     - Oracle ADF Model
     - Oracle ADF Business Components
     Some more enterprise-grade functionality still requires purchasing a license, among others:
     - ADF Security (you can use standard JEE security or third-party frameworks)
     - MDS (customizations)
     - Web Service Data Control (workaround - use a WS proxy and wrap it as a Pojo DC!)
     - Remote Task Flows
     - HA and Clustering
     You can find more information about this here

    Read the article

  • SQLbits London 2012 - Demos

    - by Adam Machanic
     Thanks to everyone who attended my sessions last Friday and Saturday at SQLbits! It was great to meet many new people, not to mention spending some time exploring one of my favorite cities, London. Attached are the demos for each of the two talks I delivered: Query Tuning Mastery: The Art and Science of Manhandling Parallelism As a database developer, your job boils down to one word: performance. And in today's multi-core-driven world, query performance is very much determined by how well you're...(read more)

    Read the article

  • Stairway to PowerPivot and DAX - Level 4: The DAX BLANK() Function

    Business Intelligence architect and author Bill Pearson exposes the DAX BLANK() function, and then provides some hands-on exposure to its use in managing empty values underlying our PowerPivot model designs.

    Read the article

  • How do you manage extensibility in your multi-tenant systems?

    - by Brian MacKay
     I've got a few big web-based multi-tenant products now, and very soon I can see that there will be a lot of customizations that are tenant-specific. An extra field here or there, maybe an extra page or some extra logic in the middle of a workflow - that sort of thing. Some of these customizations can be rolled into the core product, and that's great. Some of them are highly specific and would get in everyone else's way. I have a few ideas in mind for managing this, but none of them seem to scale well. The obvious solution is to introduce a ton of client-level settings, allowing various 'features' to be enabled on a per-client basis. The downside with that, of course, is massive complexity and clutter. You could introduce a truly huge number of settings, and over time various types of logic (presentation, business) could get way out of hand. Then there's the problem of client-specific fields, which begs for something cleaner than just adding a bunch of nullable fields to the existing tables. So what are people doing to manage this? Force.com seems to be the master of extensibility; obviously they've created a platform from the ground up that is super extensible. You can add on to almost anything with their web-based UI. FogBugz did something similar where they created a robust plugin model that, come to think of it, might have actually been inspired by Force. I know they spent a lot of time and money on it, and if I'm not mistaken the intention was to actually use it internally for future product development. Sounds like the kind of thing I could be tempted to build but probably shouldn't. :) Is a massive investment in pluggable architecture the only way to go? How are you managing these problems, and what kind of results are you seeing? EDIT: It does look as though FogBugz handled the problem by building a fairly robust platform and then using that to put together their screens. To extend it you create a DLL containing classes that implement interfaces like ISearchScreenGridColumn, and that becomes a module. I'm sure it was tremendously expensive to build, considering that they have a large team of devs and they worked on it for months, plus their surface area is perhaps 5% of the size of my application. Right now I am seriously wondering if Force.com is the right way to handle this. And I am a hard-core ASP.NET guy, so this is a strange position to find myself in.

    Read the article

  • Map and fill texture using PBO (OpenGL 3.3)

    - by NtscCobalt
     I'm learning OpenGL 3.3 and am trying to do the following (as it is done in D3D)...
     1. Create a texture of a given width, height and pixel format
     2. Map the texture memory
     3. Loop, writing pixels
     4. Unmap the texture memory
     5. Set the texture
     6. Render
     Right now, though, it renders as if the entire texture is black. I can't find a reliable source of information on how to do this. Almost every tutorial I've found just uses glTexSubImage2D and passes a pointer to memory. Here is basically what my code does (in this case it is generating a 1-byte alpha-only texture, but rendering it as the red channel for debugging):

       GLuint pixelBufferID;
       glGenBuffers(1, &pixelBufferID);
       glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pixelBufferID);
       glBufferData(GL_PIXEL_UNPACK_BUFFER, 512 * 512 * 1, nullptr, GL_STREAM_DRAW);
       glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);

       GLuint textureID;
       glGenTextures(1, &textureID);
       glBindTexture(GL_TEXTURE_2D, textureID);
       glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, 512, 512, 0, GL_RED, GL_UNSIGNED_BYTE, nullptr);
       glBindTexture(GL_TEXTURE_2D, 0);

       glBindTexture(GL_TEXTURE_2D, textureID);
       glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pixelBufferID);
       void *Memory = glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
       // Memory copied here, I know this is valid because it is the same loop as in my working D3D version
       glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
       glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);

     And then here is the render loop:

       // This chunk left in for completeness
       glUseProgram(glProgramId);
       glBindVertexArray(glVertexArrayId);
       glBindBuffer(GL_ARRAY_BUFFER, glVertexBufferId);
       glEnableVertexAttribArray(0);
       glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 20, 0);
       glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 20, 12);

       GLuint transformLocationID = glGetUniformLocation(3, 'transform');
       glUniformMatrix4fv(transformLocationID, 1, true, somematrix)

       // Not sure if this is all I need to do
       glBindTexture(GL_TEXTURE_2D, pTex->glTextureId);
       GLuint textureLocationID = glGetUniformLocation(glProgramId, "texture");
       glUniform1i(textureLocationID, 0);

       glDrawArrays(GL_TRIANGLES, Offset*3, Triangles*3);

     Vertex shader:

       #version 330 core
       in vec3 Position;
       in vec2 TexCoords;
       out vec2 TexOut;
       uniform mat4 transform;
       void main()
       {
           TexOut = TexCoords;
           gl_Position = vec4(Position, 1.0) * transform;
       }

     Pixel shader:

       #version 330 core
       uniform sampler2D texture;
       in vec2 TexCoords;
       out vec4 fragColor;
       void main()
       {
           // Output color
           fragColor.r = texture2D(texture, TexCoords).r;
           fragColor.g = 0.0f;
           fragColor.b = 0.0f;
           fragColor.a = 1.0;
       }
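
     For reference, in the usual PBO streaming path the unmap is followed by a glTexSubImage2D call whose data argument is a byte offset into the still-bound unpack buffer rather than a client-memory pointer; a minimal sketch of that step for the same 512x512 GL_RED setup (this is one typical missing piece, not necessarily the only issue in the code above):

       // With a buffer bound to GL_PIXEL_UNPACK_BUFFER, the last argument of
       // glTexSubImage2D is interpreted as an offset into that buffer, not a pointer.
       glBindTexture(GL_TEXTURE_2D, textureID);
       glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pixelBufferID);
       glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // safe default for tightly packed 1-byte rows
       glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 512,
                       GL_RED, GL_UNSIGNED_BYTE, reinterpret_cast<void*>(0));
       glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);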

    Read the article

< Previous Page | 124 125 126 127 128 129 130 131 132 133 134 135  | Next Page >