Search Results

Search found 7216 results on 289 pages for 'low cost'.

Page 14 of 289

  • A good interpreted language for a small embedded project

    - by Earlz
    I have an mbed, which has a small ARM Cortex-M3 on it. Basically, my effective resources for the project are ~25 KB of RAM and ~400 KB of flash. For I/O I'll have a PS/2 keyboard, a VGA framebuffer (with character output), and an SD card for saving/loading programs (up to a couple of MB, maybe). The reason I ask here is that I'm trying to figure out what programming language to implement on the thing. I'm looking for an interpreted language that's easy for me to implement and won't break the bank on my resources. I also intend for it to be at least possible to write programs on the device itself, though the editor can be interpreted (yay, bootstrapping). Anyway, I've looked at a few simple languages. Some nice candidates: Forth, BASIC, maybe Scheme. Has anyone done something like this, or do you know of any languages that fit this bill, or have comments about my three candidates so far?
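    Not from the thread above, but as a rough sense of scale for the Forth/BASIC option: the core of a stack-based interpreter is little more than a token loop plus a dispatch table, which is why Forth-style systems fit comfortably in a few KB. A minimal desktop C++ sketch follows; the word set is invented for illustration, and a real port would read from the PS/2 keyboard instead of stdin.

        #include <cstdio>
        #include <cstdlib>
        #include <cstring>

        // Very small Forth-style inner interpreter: a data stack plus word lookup.
        // The word set here (+, -, *, dup, .) is illustrative, not a real Forth.
        static long stack[32];
        static int sp = 0;

        static void push(long v) { if (sp < 32) stack[sp++] = v; }
        static long pop() { return sp > 0 ? stack[--sp] : 0; }

        static void eval(const char* tok) {
            if (strcmp(tok, "+") == 0)        { long b = pop(); push(pop() + b); }
            else if (strcmp(tok, "-") == 0)   { long b = pop(); push(pop() - b); }
            else if (strcmp(tok, "*") == 0)   { long b = pop(); push(pop() * b); }
            else if (strcmp(tok, "dup") == 0) { long a = pop(); push(a); push(a); }
            else if (strcmp(tok, ".") == 0)   { printf("%ld\n", pop()); }
            else                              { push(strtol(tok, nullptr, 10)); } // treat as a number
        }

        int main() {
            char line[128];
            while (fgets(line, sizeof line, stdin)) {          // read one line of input
                for (char* tok = strtok(line, " \t\n"); tok;   // split on whitespace
                     tok = strtok(nullptr, " \t\n"))
                    eval(tok);                                 // execute each token
            }
            return 0;
        }

    Typing 3 4 + . would print 7; a real Forth layers a dictionary of user-definable words and a compile mode on top of essentially this loop.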

    Read the article

  • What Are Some Advantages/Disadvantages of Using C over Assembly?

    - by Daniel
    I'm currently studying engineering in Telecommunications and Electronics, and we have migrated from assembler to C in microprocessor programming. I have doubts that this is a good idea. What are some advantages and disadvantages of C compared to assembly? The advantages/disadvantages I see are: Advantages: C syntax is a lot easier to learn than assembler syntax. C is easier to use for writing more complex programs. Learning C is somehow more productive than learning assembler, because there is more development material and tooling around C than around assembler. Disadvantages: Assembler is a lower-level programming language than C, which makes it well suited to programming directly against the hardware. It is a lot more flexible, allowing you to work with memory, interrupts, microcontroller registers, etc.
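    One point worth adding to the "directly to hardware" disadvantage: most register-level work is routinely done from C as well, via volatile pointers to memory-mapped peripherals. A hedged sketch follows; the GPIO base address and register layout are invented for illustration and do not correspond to any real microcontroller.

        #include <cstdint>

        // Hypothetical memory-mapped GPIO block; the address and register layout
        // are invented for the example and do not match any real part.
        constexpr std::uintptr_t GPIO_BASE = 0x40020000u;

        struct GpioRegs {
            volatile std::uint32_t MODE;  // pin mode bits
            volatile std::uint32_t OUT;   // output latch
            volatile std::uint32_t IN;    // input state
        };

        static GpioRegs* const GPIO = reinterpret_cast<GpioRegs*>(GPIO_BASE);

        void set_pin(unsigned pin) {
            GPIO->OUT = GPIO->OUT | (1u << pin);    // read-modify-write the output latch
        }

        void clear_pin(unsigned pin) {
            GPIO->OUT = GPIO->OUT & ~(1u << pin);
        }

    The compiler typically turns set_pin into the same couple of load/store instructions you would write by hand, which is why the usual split is C for almost everything and assembler only for startup code, tight interrupt handlers, or instructions C cannot express.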

    Read the article

  • On what basis would you split donation money among your open source team members without any strife?

    - by Vigneshwaran
    I am a developer of an open source project which is hosted on SourceForge. It started out as a little app; after some releases it got more and more popular and started demanding more time and responsibility from me. So I have enabled the donation option on SourceForge. I'm happy to keep developing it for free, but if any money ever comes in, how should I split it with my team? Should I split the amount equally among the team members (50-50, as it is a two-member team now)? By the number of classes, commits or other valuable contributions from each team member? Any other idea? What would you do in such a situation? Please give your opinions. I hope this question will be useful for others.

    Read the article

  • Were the first assemblers written in machine code?

    - by The111
    I am reading the book The Elements of Computing Systems: Building a Modern Computer from First Principles, which contains projects that build a computer from Boolean gates all the way up to high-level applications (in that order). The current project I'm working on is writing an assembler using a high-level language of my choice, to translate from Hack assembly code to Hack machine code (Hack is the name of the hardware platform built in the previous chapters). Although the hardware has all been built in a simulator, I have tried to pretend that I am really constructing each level using only the tools available to me at that point in the real process. That said, it got me thinking. Using a high-level language to write my assembler is certainly convenient, but for the very first assembler ever written (i.e. in history), wouldn't it need to be written in machine code, since that's all that existed at the time? And a related question: how about today? If a brand new CPU architecture comes out, with a brand new instruction set and a brand new assembly syntax, how would the assembler be constructed? I'm assuming you could still use an existing high-level language to generate binaries for the assembler program, since if you know the syntax of both the assembly and machine languages for your new platform, then the task of writing the assembler is really just a text analysis task and is not inherently related to that platform (i.e. needing to be written in that platform's machine language)... which is the very reason I am able to "cheat" while writing my Hack assembler in 2012, and use some pre-existing high-level language to help me out.
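    To make the "text analysis" point concrete: stripped of details, an assembler is a lookup from mnemonics to opcode bits plus a symbol table for labels, and nothing about that requires the host language to match the target machine. A toy single-pass sketch in C++ follows; the four-instruction "ISA" and its 16-bit encoding are invented for illustration and are not Hack's.

        #include <cstdint>
        #include <iostream>
        #include <map>
        #include <sstream>
        #include <string>

        // Toy single-pass assembler core: look each mnemonic up in a table and
        // pack its operand into the low byte of a 16-bit word. The ISA is invented.
        int main() {
            const std::map<std::string, std::uint8_t> opcodes = {
                {"LOAD", 0x10}, {"STORE", 0x20}, {"ADD", 0x30}, {"JMP", 0x40}};

            std::string line;
            while (std::getline(std::cin, line)) {        // one instruction per line
                std::istringstream in(line);
                std::string mnemonic;
                unsigned operand = 0;
                if (!(in >> mnemonic)) continue;          // skip blank lines
                in >> operand;                            // optional numeric operand
                auto it = opcodes.find(mnemonic);
                if (it == opcodes.end()) {
                    std::cerr << "unknown mnemonic: " << mnemonic << '\n';
                    continue;
                }
                auto word = static_cast<std::uint16_t>((it->second << 8) | (operand & 0xFFu));
                std::cout << std::hex << word << '\n';    // emit the machine word in hex
            }
        }

    A second pass (or back-patching) to resolve label addresses is the only real extra machinery. Historically the earliest assemblers were indeed hand-assembled into machine code, but nothing forces that today.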

    Read the article

  • How do great enterprises estimate software development efforts?

    - by Ed Pichler
    I was learning about how to estimate software development effort and would like to know how successful enterprises estimate their projects. How do they work out how long a system will take to develop? What are the modern techniques for doing this, and which of them do these enterprises actually use? Articles and interviews with employees of those enterprises would be interesting. I asked on the Project Management Stack Exchange site too.

    Read the article

  • Why is learning assembly language seen as a disadvantage?

    - by cprogcr
    I was recently reading an article about writing a compiler, and one of the disadvantages mentioned of writing a compiler rather than an interpreter was "learning assembly language". I understand that it perhaps takes a little more time to learn ASM than it would take for a high-level language, but why should that be seen as a disadvantage? And this is not the first time; there are a lot of articles which treat ASM as a disadvantage or as unimportant. Personally, I find ASM interesting and not at all a "disadvantage".

    Read the article

  • What's the best way to sell ReSharper to management? [closed]

    - by Jackson Pope
    Possible Duplicate: How do you convince your boss to buy useful tools like Resharper, LinqPad? I've recently started a new job developing code in C# and ASP.Net. At a previous employer I've used ReSharper from JetBrains and I loved it. I've downloaded the free trial in my new job, as have several of my new colleagues on my recommendation. Everyone thinks it's great. But now our trials are coming to an end and it's time to buy or say goodbye. I've been reliably informed that getting money for tools from senior management is like trying to get blood from a stone, so how can I convince them to loosen their grip on the purse strings and buy it for our team (of seven developers)? Does anyone have any experience of convincing management of the benefits of refactoring tools? I feel the benefit every second I use it, but I'm having difficulty thinking of how to explain the concrete benefits to a manager who only think

    Read the article

  • How to start embedded development for developing a handheld game console?

    - by Quakeboy
    I work as an iPhone app developer now, so I know a bit of C, C++ and Objective-C. I have also fiddled with Java and many other languages. All of it has been high-level application/game development. My final goal is to make a handheld game console, more like a home-made NES/SNES handheld console or even an Atari. I have found out about the Raspberry Pi and Arduino, but I need more information about how to approach this. 1) How do I learn to pick the best board/CPU/controller/GPU/LCD screen/LCD controller etc.? 2) Will learning to make a NES emulator first help me understand this field? If so, are there any tutorials?

    Read the article

  • How to have an Arduino wait until it receives data over serial?

    - by SonicDH
    So I've wired up a little robot with a sound shield and some sensors. I'm trying to write a sketch that will let me check the sensors. What I'd like it to do is print a little menu over serial, wait until the user sends a selection, jump to the function that matches their selection, then (once the function is done) jump back and print the menu again. Here's what I've written, but I'm not that good of a coder, so it doesn't work. Where am I going wrong?

        #include <Servo.h>
        Servo steering;
        Servo throttle;
        int pos = 0;
        int val = 0;

        void setup(){
          Serial.begin(9600);
          throttle.write(90);
          steering.write(90);
          pinMode(A0, INPUT);
          pinMode(7, INPUT);
          char ch = 0;
        }

        void loop(){
          Serial.println("Menu");
          Serial.println("--------------------");
          Serial.println("1. Motion Readout");
          Serial.println("2. Distance Readout");
          Serial.println("3. SD Directory Listing");
          Serial.println("4. Sound Test");
          Serial.println("5. Car Test");
          Serial.println("--------------------");
          Serial.println("Type the number and press enter");
          while(char ch = 0){
            ch = Serial.read();}
          char ch;
          switch(ch)
          {
            case '1':
            motion();
          }
          ch = 0;
        }
        //menu over, lets get to work.
        void motion(){
          Serial.println("Haha, it works!");
        }

    I'm pretty sure a while loop is the right thing to do, but I'm probably implementing it wrong. Can anyone shed some light on this?
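    For reference (not from the original post): while(char ch = 0) declares ch as 0, so the condition is always false and the read loop never runs, and the later char ch is then uninitialized when the switch reads it. The usual Arduino idiom is to poll Serial.available(). A hedged sketch of just the wait-and-dispatch part follows; printMenu() is a made-up helper standing in for the block of Serial.println calls above, and motion() is as in the question.

        // Sketch of the waiting loop only; menu printing and motion() as in the question.
        void loop() {
          printMenu();                        // stands in for the Serial.println block above

          while (Serial.available() == 0) {   // block until at least one byte has arrived
            ;                                 // do nothing; could also delay(10) here
          }
          char ch = Serial.read();            // now a byte is guaranteed to be there

          switch (ch) {
            case '1':
              motion();
              break;                          // without break, execution falls through
            // cases '2'..'5' would go here
          }
        }

    Note that the serial monitor may also send '\r' or '\n' after the digit; reading and discarding those keeps the menu from printing an extra time on the next pass through loop().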

    Read the article

  • How to fix "The system is running in low-graphics mode" error?

    - by jokerdino
    Note: This is an attempt to create a canonical question that covers all instances of the "low-graphics mode" error that occur to a user, including but not limited to installation of the wrong drivers, incorrect or invalid lightdm greeters, low disk space, incorrect installation of graphics drivers such as ATI and Nvidia, and incorrect configuration of the xorg.conf file while setting up multiple monitors, among others. If you are experiencing the "low-graphics mode" error when trying to log in but none of the following answers work for you, please do ask a new question, and then update the answers of this canonical question as and when your new question gets answered. When I try to boot into my computer, I get this error:

        The system is running in low-graphics mode
        Your screen, graphics cards, and input device settings could not be detected correctly. You will need to configure these yourself.

    How do I fix the failsafe X mode and log in to my computer? Answer index: The greeter is invalid

    Read the article

  • What interface does python use to implement sockets?

    - by user2738698
    When I program in Python, I believe I interface with the transport layer using sockets. Since Python itself was programmed by humans, its authors must have used an interface "lower" than sockets to provide us with the socket interface. I assume firewalls, also programmed by humans, use interfaces at lower layers in the same way. So is there a way to access such lower layers, in terms of programming?
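    For context (not part of the original question): CPython's socket module is a thin wrapper around the operating system's BSD/POSIX socket API, and that C-level API, which ends in system calls into the kernel's network stack, is the "lower" interface being described. A minimal POSIX C++ sketch of the same connect-and-send flow Python's socket.socket gives you follows; the 127.0.0.1:8080 endpoint is a placeholder.

        #include <arpa/inet.h>    // inet_pton, htons
        #include <netinet/in.h>   // sockaddr_in
        #include <sys/socket.h>   // socket, connect, send
        #include <unistd.h>       // close
        #include <cstdio>

        int main() {
            // Same layering Python's socket module wraps: ask the kernel for a TCP socket...
            int fd = socket(AF_INET, SOCK_STREAM, 0);
            if (fd < 0) { perror("socket"); return 1; }

            // ...fill in a placeholder address (127.0.0.1:8080 here)...
            sockaddr_in addr{};
            addr.sin_family = AF_INET;
            addr.sin_port = htons(8080);
            inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

            // ...then connect and send, which become system calls into the network stack.
            if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof addr) < 0) {
                perror("connect");
                close(fd);
                return 1;
            }
            const char msg[] = "hello\n";
            send(fd, msg, sizeof msg - 1, 0);
            close(fd);
            return 0;
        }

    Packet filters and firewalls generally sit lower still, at raw sockets or kernel hooks such as netfilter on Linux, which is also where the elevated-privilege requirements come from.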

    Read the article

  • postgres - ERROR: syntax error at or near "COST"

    - by cino21122
    EDIT: Taking COST 100 out made the command go through; however, I'm still unable to run my query because it yields this error:

        ERROR: function group_concat(character) does not exist
        HINT: No function matches the given name and argument types. You may need to add explicit type casts.

    The query I'm running is this:

        select tpid,
               group_concat(z) as z,
               group_concat(cast(r as char(2))) as r,
               group_concat(to_char(datecreated,'DD-Mon-YYYY HH12:MI am')) as datecreated,
               group_concat(to_char(datemodified,'DD-Mon-YYYY HH12:MI am')) as datemodified
        from tpids
        group by tpid
        order by tpid, zip

    This function seems to work fine locally, but moving it online yields this error... Is there something I'm missing?

        CREATE OR REPLACE FUNCTION group_concat(text, text) RETURNS text AS
        $BODY$
          SELECT CASE
            WHEN $2 IS NULL THEN $1
            WHEN $1 IS NULL THEN $2
            ELSE $1 operator(pg_catalog.||) ',' operator(pg_catalog.||) $2
          END
        $BODY$
        LANGUAGE 'sql' IMMUTABLE COST 100;
        ALTER FUNCTION group_concat(text, text) OWNER TO j76dd3;

    Read the article

  • Best Java 'framework' for LOW-END 3D Graphics?

    - by CodeJustin.com
    I've made my share of 2D games on various platforms, but I have never developed a 3D game. I want to make a small "mmorpg". I already made my server in Python and it works just fine with my Flash 2D game, but I decided I want to step it up and try 3D. I want to make a 3D game for the web browser, and I think Java might be a good choice for this. So basically I'm just looking for a straightforward and well-documented 'framework' for making LOW-END 3D games. Keep in mind that I will be targeting people with very low-end PCs (plus my 3D modelling skills aren't great, so I wouldn't mind hiding that somewhat, haha).

    Read the article

  • recursion tree and binary tree cost calculation

    - by Tony
    Hi all, I've got the following recurrence: T(n) = T(n/3) + T(2n/3) + O(n). The height of the tree would be log base 3/2 of n. Now, the recursion tree for this recurrence is not a complete binary tree; it has missing nodes lower down. This makes sense to me; however, I don't understand how the following small-omega claim relates to the cost of all the leaves in the tree: "... the total cost of all leaves would then be Theta(n^(log_{3/2} 2)) which, since log_{3/2} 2 is a constant strictly greater than 1, is small omega(n lg n)." Can someone please help me understand how Theta(n^(log_{3/2} 2)) becomes small omega(n lg n)?
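    The step the quoted passage skips is only that a polynomial with exponent strictly greater than 1 eventually dominates n lg n. A short worked version of that step (not from the original thread), in LaTeX:

        Let $c = \log_{3/2} 2 \approx 1.71$, a constant greater than $1$. For any such $c$,
        \[
          \frac{n^{c}}{n \lg n} \;=\; \frac{n^{c-1}}{\lg n} \;\longrightarrow\; \infty
          \quad \text{as } n \to \infty,
        \]
        because $n^{c-1}$ grows polynomially while $\lg n$ grows only logarithmically.
        By the definition of little-omega, an infinite limit means $n^{c} = \omega(n \lg n)$,
        so any quantity that is $\Theta\!\bigl(n^{\log_{3/2} 2}\bigr)$ is also $\omega(n \lg n)$.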

    Read the article

  • Question about mysql indexes on low to medium cardinality columns

    - by Kevin J
    I have a general question about the way database indexing works, particularly in MySQL. Let's say I have a table with a million rows and a column "ClientID" that is distributed relatively equally among 30 values. Thus this column has very low cardinality (30) relative to the primary key (1 million). Now, I understand that you shouldn't create indexes on low-cardinality fields. However, in this case queries are only ever done with one of the 30 ClientIDs. So wouldn't creating an index on ClientID be helpful, as the search space is automatically reduced to 1/30th of what it would normally be? Or is my understanding of how the index works flawed? Thanks

    Read the article

  • Need help with Drupal bulk mail low open rate for legitimate mailing list

    - by Ron Williams
    I've moved from Constant Contact to Drupal Simplenews/Mimemail/SMTP. Previously the open rate was around 50% with Constant Contact, but now it's 4-5% for the same list via the setup mentioned. Mail is getting out from the server, but something is wrong anyway. Here's the setup:
    - The e-mail list consists of approximately 80,000 addresses, queued at 10,000 e-mails per cron run (which runs hourly).
    - The server is a dual Core2Quad machine with 2GB of RAM.
    - When mail is being sent, the mail queue will usually go up to ~1000 at the beginning of the hour before reducing to ~250 by the time the next cron run occurs.
    - The newsletter is themed to display a custom style on send.
    - The newsletter is received by some, but appears to be bounced by many (based on the low open rate).
    - I've added SPF, DomainKeys, and a PTR record to the DNS.
    - The server hostname (listed in the PTR) is different from the hosted domain.
    - Very low spam score via SpamAssassin.
    - IP and domain are not blacklisted.
    - Mail goes out via the SMTP module on delivery.
    Any ideas?

    Read the article

  • Rails - Preventing users from contributing to website when their score is too low - callback / observer

    - by adam
    A User can add a Sentence directly on my website, via Twitter or via email. To add a sentence they must have a minimum score. If they don't have the minimum score, they can't post the sentence and a warning message is either flashed on the website or sent back to them via Twitter or email. So I'm wondering how best to code this check. I'm thinking of a Sentence observer, roughly: in before_create, call score_sufficient(); if the score is OK, save; if the score is too low, do not save. In the too-low case I need to return some flag so that the calling code can then fire off the relevant warning. What type of flag should I return? False is too ambiguous, as that could refer to validation. I could raise an exception, but that doesn't sound right, or I could return a symbol? Is this even the right approach? What's the best way to code this?

    Read the article

  • Low memory with 640Kb of live bytes?

    - by Chiodo
    Hello, I have a problem with my application, which needs to display a lot of images and video. After running the ObjectAlloc tool, I see that the live bytes figure is 640 KB and the overall memory is 31.54 MB when the application crashes. In the Organizer I get a "low memory" report, so I guess the app crashed because of low memory, but the ObjectAlloc data doesn't make any sense to me... Any ideas? This is the Organizer crash log:

        Incident Identifier: CDCAF38C-CFFD-4316-9C4A-5C8E37794B49
        CrashReporter Key: 65390aeb97b2b81076576c3e33b025feb5db9202
        OS Version: iPhone OS 3.1.3 (7E18)
        Date: 2010-05-19 10:07:19 +0200
        Free pages: 372
        Wired pages: 12260
        Purgeable pages: 0
        Largest process: DTMobileIS
        Processes
        Name UUID Count resident pages
        ATreeTest <1d51c3a5fef8b747c3a1be9405bdd52a 1150 (jettisoned) (active)
        DTMobileIS <69c3fa96db2f29474d62964aa1a69bfa 3316
        notification_pro <8a7725017106a28b545fd13ed58bf98c 68
        mediaserverd <3d3800d6acfff050e4d0ed91cbe2467e 464 (jettisoned)
        syslogd <8eddddc00294d5615afded36ee3f1b62 56 (jettisoned)
        apsd <32070d91b216d806973c8f1b1d8077a4 173
        SpringBoard <324939a437d1cca1fa4af72d9f5d0eba 2475 (jettisoned) (active)
        accessoryd <8f21c8b376d16e2ccb95ed6d21d8317a 99 (jettisoned)
        notification_pro <8a7725017106a28b545fd13ed58bf98c 64
        ptpd 129
        notifyd <591dd4dd804b4b8741f52335ea1fa4ab 64
        CommCenter 167
        configd <85efd41aceac34ccc0019df76623c7a9 294
        fairplayd 91
        mDNSResponder 101
        lockdownd <80d2bd44c0bcca273d48ce52010f7e65 285
        launchd 71
        End

    Read the article

  • How to implement low pass filter using java

    - by chai
    Hello everyone, I am trying to implement a low-pass filter in Java. My requirement is very simple: I have to eliminate signals beyond a particular frequency (single dimension). It looks like a Butterworth filter would suit my need. Now, the important thing is that CPU time should be as low as possible: there will be close to a million samples for the filter to process, and our users don't like waiting too long. Are there any ready-made implementations of Butterworth filters with optimal filtering algorithms? Regards, Chaitannya
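    For scale (not from the original question): the cheapest recursive low-pass is a first-order IIR, one multiply and two adds per sample, so a million samples is only a few milliseconds of work. Its roll-off is gentler than a Butterworth's, but the per-sample cost of a proper Butterworth biquad is in the same ballpark. A hedged sketch follows, shown in C++ rather than Java (the two-line recurrence translates directly); sampleRate and cutoffHz are placeholders.

        #include <cstddef>
        #include <vector>

        // First-order (single-pole) low-pass: y[i] = y[i-1] + alpha * (x[i] - y[i-1]).
        // This is not a Butterworth design, just the simplest filter with the same
        // O(1)-per-sample cost profile.
        std::vector<double> lowpass(const std::vector<double>& x,
                                    double sampleRate, double cutoffHz) {
            std::vector<double> y(x.size());
            if (x.empty()) return y;

            const double pi = 3.14159265358979323846;
            const double dt = 1.0 / sampleRate;
            const double rc = 1.0 / (2.0 * pi * cutoffHz);   // equivalent RC time constant
            const double alpha = dt / (rc + dt);             // smoothing factor in (0, 1)

            y[0] = x[0];
            for (std::size_t i = 1; i < x.size(); ++i)
                y[i] = y[i - 1] + alpha * (x[i] - y[i - 1]); // one multiply, two adds
            return y;
        }

    For a true Butterworth response, the usual route is a second-order biquad section (or a cascade of them) with coefficients from the bilinear transform; the inner loop stays just as tight.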

    Read the article

  • Apache jamming at low load average

    - by Apikot
    Apache seems to stop responding sometimes, even though Apache processes are still running. After restarting Apache, the load average usually shoots up from 1-2 to 13-15 in a matter of seconds. What could be causing this, and how could I find out why Apache stops serving? My httpd.conf contains:

        <IfModule mpm_prefork_module>
            StartServers          8
            MinSpareServers       5
            MaxSpareServers      20
            MaxClients           50
            ServerLimit          50
            MaxRequestsPerChild 4000
        </IfModule>

    It's running on an EC2 c1.medium (1.7 GB of memory). Thanks

    Read the article

  • Win 7 running slowly with low CPU usage and memory

    - by guywhoneedsahand
    I have a relatively new (under 2 years old) Windows 7 machine. It has 9GB of RAM and an i7 CPU (930 @ 2.8GHz w/ 8 CPUs). About 8 months after a clean install, I noticed my computer was running slowly. I figured it was fragmentation etc., so I did a complete wipe and clean reinstall. However, my problems are somehow persisting. The computer is running painfully slowly, but in leaps and bounds: sometimes it will work fine for 3 hours, then suddenly freeze up just from clicking the Start button. The freezes happen randomly, not during any especially intensive computing. I initially thought something might be eating through my CPU and/or memory, but Task Manager indicates that neither the CPU nor memory spikes. In fact, even during serious lag, CPU usage remains below 5% and memory at ~1.5GB. It's beyond me why a fresh install on a powerful machine is performing so poorly... and it certainly is frustrating! What could be causing the poor performance, and what can I do to fix it?

    Read the article

  • High load average, low CPU and IO (Centos 5.7)

    - by Ben
    A Drupal 7 site with CiviCRM, after running smoothly for a year on a 1&1 VPS, suddenly became unresponsive. Now pages eventually load, but can take more than a minute. Looking at resource use in Virtuozzo, the load average carries a warning and has remained above 1. While I understand this isn't particularly high, it is a change from when the site was working. Here is a typical snapshot of top:

        top - 03:10:32 up 3:21, 1 user, load average: 1.16, 1.22, 1.30
        Tasks: 43 total, 1 running, 42 sleeping, 0 stopped, 0 zombie
        Cpu(s): 0.1%us, 0.1%sy, 0.1%ni, 99.7%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 2097152k total, 1015112k used, 1082040k free, 0k buffers
        Swap: 0k total, 0k used, 0k free, 0k cached

    The CPU idle level never seems to go much below 70%, wa is virtually always at 0, and there appears to be lots of free memory. And here is some vmstat output, again showing wa at 0, plenty of free memory, and an idle CPU:

        procs -----------memory---------- ---swap-- -----io---- --system-- -----cpu------
         r  b  swpd    free  buff  cache  si  so    bi    bo    in     cs  us sy  id wa st
         2  0     0 1100872     0      0   0   0  2783 23672     0    538   1  0  99  0  0
         1  0     0 1100872     0      0   0   0     0    16     0 101754   0  0 100  0  0
         0  0     0 1100872     0      0   0   0     0    17     0 103133   0  0 100  0  0
         0  0     0 1100872     0      0   0   0     0     1     0 102080   0  0 100  0  0
         1  0     0 1100872     0      0   0   0     0     6     0  99881   0  0 100  0  0
         0  0     0 1100872     0      0   0   0     0     1     0 105187   0  0 100  0  0

    I've spoken to 1&1, but they don't have any ideas as to what could be causing the high load average; instead they suggested an upgrade :) I've looked for processes that might be causing this, examined MySQL's SHOW PROCESSLIST, and restarted the container, with no result. Does anyone have more troubleshooting suggestions or insights?

    Read the article

  • PELSOFT LAB DELHI ELAN LAPTOP CHINESE MAKE LOW QUALITY ONE [closed]

    - by PREETI HARI
    I purchased an Elan laptop from Pelsoft Lab, Delhi, after a lot of follow-up calls from their call centres. Within one month I found the laptop is of no use; the company failed to provide service. On enquiring in the market, I was told that this laptop is a Chinese make and all the components are non-standard and cannot be replaced or repaired. Does anybody know how to format and repair this kind of system? Otherwise I will lose Rs 22,000 at a stretch. Please help.

    Read the article

  • Low framerate on background apps

    - by user1698923
    My problem is that when a game is running in the foreground in full-screen mode, any applications on my second monitor (such as YouTube videos or other video, nothing app-specific) drop their frame rate to about 2-3 FPS. It seems like some sort of power-management option that I can't track down. As far as I can tell, it's not due to the GPU not being able to keep up: for instance, my PC can play League of Legends at about 280 FPS when the frame rate is uncapped, and if I cap it at 60 FPS using the in-game option, it has no effect on the performance of the background app.

        Summary:
        Operating System Windows 8 Pro 64-bit
        CPU Intel Core i7 3820 @ 3.60GHz, 42 °C, Sandy Bridge-E 32nm Technology
        RAM 12.0GB Triple-Channel DDR3 @ 533MHz (7-7-7-20)
        Motherboard Gigabyte Technology Co., Ltd. X79-UD3 (SOCKET 0), 37 °C
        Graphics DELL U2713HM (2560x1440@59Hz), DELL U2713HM (2560x1440@59Hz), 1280MB NVIDIA GeForce GTX 570 (Gigabyte), 58 °C
        Hard Drives 212GB Volume0 (RAID), 1863GB Western Digital WDC WD20EARS-00MVWB0 (SATA) 36 °C, 1863GB Western Digital WDC WD20EARS-00MVWB0 (SATA) 34 °C
        Optical Drives No optical disk drives detected
        Audio ASUS Xonar Essence STX Audio Device
        Computer type: Desktop

        Monitor 1: DELL U2713HM on NVIDIA GeForce GTX 570, Current Resolution 2560x1440 pixels, Work Resolution 2560x1400 pixels, State Enabled, Output devices support Multiple displays Extended, Secondary, Enabled, Monitor BPP 32 bits per pixel, Monitor Frequency 59 Hz, Device \\.\DISPLAY4\Monitor0
        Monitor 2: DELL U2713HM on NVIDIA GeForce GTX 570, Current Resolution 2560x1440 pixels, Work Resolution 2560x1400 pixels, State Enabled, Output devices support Multiple displays Extended, Primary, Enabled, Monitor BPP 32 bits per pixel, Monitor Frequency 59 Hz, Device \\.\DISPLAY5\Monitor0

        NVIDIA GeForce GTX 570: GPU GF110, Device ID 10DE-1086, Revision A2, Subvendor Gigabyte (1458), Series GeForce GTX 500, Current Performance Level Level 3, Current GPU Clock 845 MHz, Current Memory Clock 1900 MHz, Current Shader Clock 1690 MHz, Voltage 0.988 V, Technology 40 nm, Die Size 520 mm², Release Date Dec 07, 2010, DirectX Support 11.0, OpenGL Support 5.0, Bus Interface PCI Express x16, Temperature 57 °C, Driver version 9.18.13.2018, BIOS Version 70.10.55.00.01, ROPs 40, Shaders 512 unified, Memory Type GDDR5, Memory 1280 MB, Bus Width 64x5 (320 bit), Filtering Modes 16x Anisotropic, Noise Level Moderate, Max Power Draw 219 Watts

        Performance levels (3): Level 1 "Default" GPU Clock 50 MHz, Memory Clock 135 MHz, Shader Clock 101 MHz; Level 2 "2D Desktop" GPU Clock 405 MHz, Memory Clock 324 MHz, Shader Clock 810 MHz; Level 3 "3D Applications" GPU Clock 845 MHz, Memory Clock 1900 MHz, Shader Clock 1690 MHz

    Things I've tried:
    1) Updating the graphics driver
    2) Setting the Windows power mode to High Performance
    3) Resetting the Nvidia global performance settings to default

    Read the article
