Search Results

Search found 20659 results on 827 pages for 'agile software'.

Page 469/827

  • Free invoicing application for Colombia

    - by Yoimir Yamit Castrillon Duque
    I am looking for an invoicing application (also covering inventory, purchasing, customers, suppliers, accounts, etc.) for small businesses, adapted to the needs of Colombia. On Google I found several ERPs such as Openbravo and Adempiere, but they are very large applications and difficult to manage; in fact I could not get them to work. I found a program called Ubifactura, built for invoicing in Colombia, and downloaded its Java source files, but I have no idea how to get it running: it talks about Eclipse and a CVS server, which I also have no idea how to set up. I need someone to help me work with these Java files, or to suggest an application that fits my needs. The idea is to benefit several small businesses in my town with an application like this; it does not matter whether it runs on Windows or Ubuntu, the point is to give them something from free software. Regards, and awaiting replies. All help is welcome. Sincerely, Yoimir Yamit Castrillon Duque, Cimitarra, Santander, Colombia

    Read the article

  • Taking the Plunge - or Dipping Your Toe - into the Fluffy IAM Cloud by Paul Dhanjal (Simeio Solutions)

    - by Greg Jensen
    In our last three posts, we’ve examined the revolution that’s occurring today in identity and access management (IAM). We looked at the business drivers behind the growth of cloud-based IAM, the shortcomings of the old, last-century IAM models, and the new opportunities that federation, identity hubs and other new cloud capabilities can provide by changing the way you interact with everyone who does business with you. In this, our final post in the series, we’ll cover the key things you, the enterprise architect, should keep in mind when considering moving IAM to the cloud.

    Invariably, what starts the consideration process is a burning business need: a compliance requirement, security vulnerability or belt-tightening edict. Many on the business side view IAM as the “silver bullet” – and for good reason. You can almost always devise a solution using some aspect of IAM. The most critical question to ask first when using IAM to address the business need is, simply: is my solution complete?

    Typically, “business” is not focused on the big picture. Understandably, they’re focused instead on the need at hand: Can we be HIPAA compliant in 6 months? Can we tighten our new hire, employee transfer and termination processes? What can we do to prevent another password breach? Can we reduce our service center costs by the end of next quarter? The business may not be focused on the complete set of services offered by IAM but rather a single aspect or two. But it is the job – indeed the duty – of the enterprise architect to ensure that all aspects are being met. It’s like remodeling a house but failing to consider the impact on the foundation, the furnace or the zoning or setback requirements. While the homeowners may not be thinking of such things, the architect, of course, must.

    At Simeio Solutions, the way we ensure that all aspects are being taken into account – to expose any gaps or weaknesses – is to assess our client’s IAM capabilities against a five-step maturity model ranging from “ad hoc” to “optimized.” The model we use is similar to Capability Maturity Model Integration (CMMI) developed by the Software Engineering Institute (SEI) at Carnegie Mellon University. It’s based upon some simple criteria, which can provide a visual representation of how well our clients fare when evaluated against four core categories:
    • Program Governance
    • Access Management (e.g., Single Sign-On)
    • Identity and Access Governance (e.g., Identity Intelligence)
    • Enterprise Security (e.g., DLP and SIEM)

    Often our clients believe they have a solution with all the bases covered, but the model exposes the gaps or weaknesses. The gaps are ideal opportunities for the cloud to enter into the conversation. The complete process is straightforward:
    1. Look at the big picture, not just the immediate need – what is our roadmap and how does this solution fit?
    2. Determine where you stand with respect to the four core areas – what are the gaps?
    3. Decide how to cover the gaps – what role can the cloud play?

    Returning to our home remodeling analogy, at some point, if gaps or weaknesses are discovered when evaluating the complete impact of the proposed remodel – if the existing foundation wouldn’t support the new addition, for example – the owners need to decide if it’s time to move to a new house instead of trying to remodel the old one. However, with IAM it’s not an either-or proposition – i.e., either move to the cloud or fix the existing infrastructure.
    It’s possible to use new cloud technologies just to cover the gaps. Many of our clients start their migration to the cloud this way, dipping in their toe instead of taking the plunge all at once. Because our cloud services offering is based on the Oracle Identity and Access Management Suite, we can offer a tremendous amount of flexibility in this regard. The Oracle platform is not a collection of point solutions, but rather a complete, integrated, best-of-breed suite. Yet it’s not an all-or-nothing proposition. You can choose just the features and capabilities you need using a pay-as-you-go model, incrementally turning on and off services as needed. Better still, all the other capabilities are there, at the ready, whenever you need them.

    Spooling up these cloud-only services takes just a fraction of the time it would take a typical organization to deploy internally. SLAs in the cloud may be higher than on premise, too. And by using a suite of software that’s complete and integrated, you can dramatically lower cost and complexity. If your in-house solution cannot be migrated to the cloud, you might consider using hardware appliances such as Simeio’s Cloud Interceptor to extend your enterprise out into the network. You might also consider using Expert Managed Services. Cost is usually the key factor – not just development costs but also operational sustainment costs. Talent or resourcing issues often come into play when thinking about sustaining a program. Expert Managed Services such as those we offer at Simeio can address those concerns head on.

    In a cloud offering, identity and access services lend themselves to the new paradigms described in my previous posts. Most importantly, this allows us all to focus on what we're meant to do – provide value, lower costs and increase security to our respective organizations. It’s that magic “silver bullet” that business knew you had all along. If you’d like to talk more, you can find us at simeiosolutions.com.

    Read the article

  • In centralized version control, is it always good to update often?

    - by janos
    Assuming that: You are in a team developing some software. Your team is using centralized version control in the development process. You are working on a new feature which will surely take several days to complete, and you won't be able to commit before that because it would break the build. Your team members commit something every day that affects some of the files you're working with for your fancy new feature. Since this is centralized version control, you will have to update your local checkout at some point: at least once right before committing the new feature. If you update only once right before your commit, then there might be a lot of conflicts due to the many other changes by your teammates, which could be a world of pain to resolve all at once. Or, you could update often, and even if there are a few conflicts to resolve day by day, it should be easier to do, little by little. Can we say that it is always a good idea to update often?

    Read the article

  • Upgrade from 10.10 to 11.04

    - by hemanta pathak
    Upgrading from 10.10 to 11.04 using Upgrade Manager normally works fine. But after installing a piece of software that bundles its own runtime environment (loader and system files) and installs that custom runtime (basically the loader) in the location where the native one resides, the upgrade fails: it starts and, after some time, encounters an error and aborts. Basically, /usr/bin/dpkg throws an error about being unable to locate a system shared library in the aforementioned third-party runtime folder (/usr/bin/dpkg should not search the third-party runtime folder; instead it should look at the system default folder). But if we remove the installed third-party loader from the default system location /lib and place it somewhere else, the upgrade problem goes away. This makes me believe /usr/bin/dpkg invokes (loads) the wrong loader and as such goes looking for the dependent libraries in the third-party folder. Can someone take a look at this? Is there some bug with dpkg?

    Read the article

  • DLNA Media Server and external subtitles

    - by Lobo
    I'm looking for a DLNA media server with the following features: supports the most widespread video formats; plays external subtitles on the client side; open source or freeware. USE CASE: the DLNA media server is installed and running on my PC. On my PC, I have a /home/myprofile/videos directory where I store all my video files, for example game.of.thrones.s09.e06.mpg and game.of.thrones.s09.e06.srt. I turn on my smart TV, connect to my DLNA media server (installed on my PC), play the game.of.thrones.s09.e06.mpg file and see the subtitles overlaid. Finally, my question: is there a DLNA media server that supports this use case? Thank you.

    Read the article

  • Algorithmic Forecasting and Pattern Recognition

    - by Ryan King
    Say a user could enter project data into my software. Each project has two variables, "size" and "work"; they're related, but the relationship is not known. Is there a way to programmatically determine the relationship between the variables based on previous data, and forecast the amount of work if given only the size of a future project? For example, say the user had manually entered the following projects:
    Project 1 - Size: 1, Work: 4
    Project 2 - Size: 2, Work: 7
    Project 3 - Size: 3, Work: 10
    Project 4 - Size: 4, Work: x
    What should I look into to be able to programmatically determine that Work = Size*3+1, and therefore be able to say that x = 13?
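    One way to recover a relationship like this programmatically is a least-squares fit over the historical data. Below is a minimal sketch in Python, assuming the relationship is roughly linear and that NumPy is available; the numbers are the hypothetical projects from the question.

        # Fit work = a*size + b to past projects, then forecast a new project.
        # Minimal sketch; assumes an approximately linear relationship and NumPy.
        import numpy as np

        sizes = np.array([1.0, 2.0, 3.0])    # known project sizes
        work = np.array([4.0, 7.0, 10.0])    # corresponding work values

        a, b = np.polyfit(sizes, work, deg=1)        # least-squares line fit
        print(f"work ~ {a:.2f} * size + {b:.2f}")    # -> work ~ 3.00 * size + 1.00

        print(a * 4 + b)                             # forecast for size 4 -> 13.0

    For noisy or non-linear data, the same idea generalizes to higher-degree polynomial fits or to a regression model from a library such as scikit-learn.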

    Read the article

  • Should I concentrate on writing code for money or my studies while in college?

    - by A-Cube
    I am a Software Engineering student in college. My worry is that while I am concentrating on my studies, my peers are getting down with code (e.g. HTML, ASP, PHP, etc.) to earn money. Should I be worried that I am not coding like them? I was asked to be a Microsoft Student Partner, but I refused because the person who was doing it before me told me it was just arranging events, nothing like actually working with Microsoft and coding. Should I be writing code and earning money while I am still in my 4th semester? C++ is the only language I am learning in college. Will these projects count towards getting a job, or should I concentrate on my studies for now to get the maximum benefit?

    Read the article

  • Netflix on Ubuntu 12.04

    - by tsi25
    So I have Ubuntu 12.04 on a System76 Lemur Ultra laptop. I installed Netflix via the terminal with the following chain of commands:
    sudo apt-get update
    sudo apt-add-repository ppa:ehoover/compholio
    sudo apt-get update
    sudo apt-get install netflix-desktop
    When it finished installing, I clicked the icon and something came up asking if I wanted to install some dependent software, but it wouldn't let me interact with the window, so I hit Tab and worked my way through that. But my computer shut down before I could fully install Gecko. Now I have the Netflix icon, but when I click or right-click it, nothing happens. I tried uninstalling it with the following command, sudo apt-add --purge remove netflix-desktop and then reinstalling it, but there's no change. Does anyone know what I can do to get Netflix to run from here, or how to start troubleshooting? I searched around on Ask Ubuntu but couldn't find any answers to this specific problem.

    Read the article

  • Separation of project responsibilities in a new project

    - by dreza
    We have very recently started a new project (MVC 3.0) and some of our early discussion has been around how the work and development will be split amongst the team members, to ensure we get the least amount of overlap of work and so help make it a bit easier for each developer to get on and do their work. The project is expected to take about 6 months - 1 year (although not all developers are likely to be on it for the duration and might filter off towards the end). Our team is going to be small, so this will help out a bit, I believe. The team will essentially consist of:
    • 3 x developers (one slightly more experienced, who will be the lead)
    • 1 x project manager / product owner / tester
    • An external company responsible for doing our design work

    General project/development decisions so far have included:
    • Develop in an Agile way using SCRUM techniques (we are still very much learning this approach as a company)
    • Use MVVM architecture
    • Use Ninject and DI where possible
    • Attempt to use TDD as much as possible to drive development
    • Keep our controllers as skinny as possible
    • Keep our views as simple as possible

    During our discussions two approaches have been broached as to how to separate the workload given our objectives outlined above.

    OPTION 1: A framework separation where each person is responsible for conceptual areas, with overlap and discussion primarily in the integration areas. The integration areas would be the responsibility of both developers as required.

    View prototypes (**Graphic designer**)
      |
      - Mockups
      |
    Views (Razor and view helpers etc) & Javascript (**Developer 1**)
      |
      - View models (Integration point)
      |
    Controllers and Application logic (**Developer 2**)
      |
      - Models (Integration point)
      |
    Domain model and persistence (**Developer 3**)

    PROS:
    • Integration points are quite clear, so developers can work without dependencies on others fairly easily
    • Code practices such as naming conventions and style are more easily kept consistent, as primarily only one developer will be handling an area

    CONS:
    • Completion of an entire feature becomes a bit grey, as no single person is responsible for an entire feature (story?)
    • A person might not have a full appreciation for all areas of the project, so coverage of their area might be lacking if that person suddenly left

    OPTION 2: A more task-orientated approach where each person is responsible for the completion of the entire task, from view to controller to model.

    PROS:
    • A person is responsible for one entire feature, so its "complete" state can be clearly defined
    • Code overlap into different areas will occur, so each individual has good coverage over the entire application

    CONS:
    • Overlap of development will occur in all the modules, and developers can develop/extend without a true understanding of what the original code owner was intending. This could potentially lead more easily to code bloat?
    • Following a convention might be harder, as developers are adding to all areas of the project
    • If a developer sets up a way of doing things, would it be harder to get the other developers to follow that convention or even build on it (or even discuss it)? Dunno..
    • Bugs could more easily be introduced into areas not thought about by the developer
    • It's possibly easier to carry a team member, in so far as one member just hacks code together to complete a task whilst another takes time to build a foundation that could be used by others and so help make future tasks easier, i.e. starts building a framework?

    QUESTION: As is probably apparent, I'm more in favor of option 1; however, I'm interested to see how others might have approached this, or what the standard, best, or preferred way of undertaking a project is. Or indeed any different approach to handling this?

    Read the article

  • Are Python's cryptographic modules good enough?

    - by Aerovistae
    I mean, say you were writing professional-grade software that would involve sensitive client information. (Take this in the context of me being an amateur programmer.) Would you use hashlib and hmac? Are they good enough to secure data? Or would you write something fancier by hand? Edit: Given that those libraries contain more or less the best hashing algorithms in the world, I guess it's silly to ask whether you'd "write something fancier." What I'm really asking here is whether it's enough on its own.
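    For reference, here is roughly how the two standard-library modules the question refers to (hashlib and hmac) are typically combined to authenticate data with a shared secret. This is only a sketch of one narrow use case, not a complete security design; key management, transport encryption and password storage are separate concerns.

        # Sketch: tag a message with HMAC-SHA256 and verify it later.
        import hashlib
        import hmac

        secret_key = b"example-shared-secret"   # placeholder; real keys must be stored securely
        message = b"sensitive client record"

        tag = hmac.new(secret_key, message, hashlib.sha256).hexdigest()

        # Receiving side: recompute the tag and compare in constant time.
        expected = hmac.new(secret_key, message, hashlib.sha256).hexdigest()
        print(hmac.compare_digest(tag, expected))   # True if the data was not tampered with

    Note that hashes and HMACs provide integrity and authenticity, not confidentiality; keeping data secret requires encryption on top of this.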

    Read the article

  • Comprehensive system for documentation and handoff of developer project

    - by Uzumaki Naruto
    I work on a technology team that typically develops projects for a period of time and then hands them off to other groups for long-term maintenance and improvements. My team currently uses ad hoc methods of handing off documentation, such as diagrams, API references, etc. Is there an open source solution (or even a proprietary one) that enables us to manage, in one place:
    1. Infrastructure/architecture/software diagrams
    2. API documentation
    3. Directory structures/file structures
    4. Overall documentation summaries
    E.g., instead of using multiple systems like Swagger, wikis, etc., is there a solution that can seamlessly combine all of these and enable us to generate a package including all four key items with one click to hand off to other teams?

    Read the article

  • How do I make my volume indicator operate in decibels instead of percentage?

    - by ethana2
    When I want to adjust the volume of anything I'm doing, I find that using the volume controls built into Ubuntu is nothing but confusing. When the volume is around 100%, dropping it several increments has almost no effect on apparent volume, but when it's around 0%, the effect of one click of my mouse wheel is probably a good 3 decibels. I have observed this behavior on tens of different UC's, since I convert about one Ubuntu user a month (NE team contact). This has proven so frustrating to me that I tend to use the volume knob on my guitar amp (mono audio :|) instead of the volume indicator. What can I do to make my volume indicator behave properly until this is fixed? I want each volume increment to be one half or one third of a decibel. Is there a different piece of software I should use for system volume configuration, perhaps?
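    The mismatch described here comes from loudness perception being roughly logarithmic: equal decibel steps correspond to equal ratios of amplitude, not equal linear percentages. A small illustrative calculation, treating the percentage as a raw amplitude fraction (which is an assumption about how the mixer maps its scale):

        # Illustrative only: compare equal linear volume steps with their dB equivalents.
        import math

        def to_db(fraction):
            # Convert a linear amplitude fraction (0 < fraction <= 1) to dB relative to full scale.
            return 20 * math.log10(fraction)

        # A 5% step near the top barely changes loudness...
        print(to_db(1.00) - to_db(0.95))   # about 0.45 dB
        # ...while the same 5% step near the bottom is a big jump.
        print(to_db(0.10) - to_db(0.05))   # about 6.0 dB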

    Read the article

  • Automatically kill a process if it exceeds a given amount of RAM

    - by chrisamiller
    I work on large-scale datasets. When testing new software, a script will sometimes sneak up on me, quickly grab all available RAM, and render my desktop unusable. I'd like a way to set a RAM limit for a process so that if it exceeds that amount, it will be killed automatically. A language-specific solution probably won't work, as I use all sorts of different tools (R, Perl, Python, Bash, etc). So is there some sort of process-monitor that will let me set a threshold amount of RAM and automatically kill a process if it uses more?
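    One language-agnostic way to get this behavior from user space is a small watchdog that polls resident memory and kills anything of yours that crosses a threshold. Below is a minimal sketch, assuming the third-party psutil package is installed; the limit, poll interval and user filter are arbitrary example choices, and kernel mechanisms such as ulimit or cgroups are the more robust alternatives.

        # Sketch of a RAM watchdog: kill any of my processes whose RSS exceeds a limit.
        # Assumes the third-party 'psutil' package; the numbers below are examples only.
        import getpass
        import time

        import psutil

        LIMIT_BYTES = 4 * 1024**3    # example threshold: 4 GiB of resident memory
        POLL_SECONDS = 5

        me = getpass.getuser()

        while True:
            for proc in psutil.process_iter(['pid', 'name', 'username', 'memory_info']):
                try:
                    info = proc.info
                    if info['username'] == me and info['memory_info'].rss > LIMIT_BYTES:
                        print(f"killing {info['name']} (pid {info['pid']})")
                        proc.kill()
                except (psutil.NoSuchProcess, psutil.AccessDenied):
                    continue
            time.sleep(POLL_SECONDS)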

    Read the article

  • AutoVue 20.0.x End of Oracle Premier Support

    - by GrahamOracle
    As per Oracle’s Lifetime Support policy, AutoVue version 20.0.x reached the end of Premier Support on March 1st 2012 and entered Sustaining Support. Customers are recommended to upgrade to the latest and greatest (AutoVue 20.2.0) at the earliest opportunity, to take advantage not only of a new 5-year Premier Support term, but also of all the fixes, new features, and new format support as compared to version 20.0.x. For more information on Oracle’s Lifetime Support policy, visit http://www.oracle.com/us/support/lifetime-support/lifetime-support-software-342730.html and click on the link titled “Lifetime Support Policy: Oracle Applications (PDF)”.

    Read the article

  • Force Your Mac to Sort Folders on Top of Files (Windows Style)

    - by Eric Z Goodnight
    Even die-hard Mac converts have their issues with Mac OS, and one of those problems is that OS X lists folders mixed in with all other files. Here’s how to fix that in under five minutes with a clever hack. You know you’ve had that issue. You’ve dug through your files looking for that one elusive folder, and because it’s jumbled in with all the other stuff, it’s more or less impossible to find. Have no fear, with no downloads or silly plug-in software, you can finally make Mac OS behave like Windows and Linux and list those folders in the proper order.

    Read the article

  • Performance triage

    - by Dave
    Folks often ask me how to approach a suspected performance issue. My personal strategy is informed by the fact that I work on concurrency issues. (When you have a hammer everything looks like a nail, but I'll try to keep this general.)

    A good starting point is to ask yourself if the observed performance matches your expectations. Expectations might be derived from known system performance limits, prototypes, and other software or environments that are comparable to your particular system-under-test. Some simple comparisons and microbenchmarks can be useful at this stage. It's also useful to write some very simple programs to validate some of the reported or expected system limits. Can that disk controller really tolerate and sustain 500 reads per second? To reduce the number of confounding factors it's better to try to answer that question with a very simple targeted program. And finally, nothing beats having familiarity with the technologies underlying your particular layer.

    On the topic of confounding factors, as our technology stacks become deeper and less transparent, we often find our own technology working against us in some unexpected way to choke performance rather than simply running into some fundamental system limit. A good example is the warm-up time needed by just-in-time compilers in Java Virtual Machines. I won't delve too far into that particular hole except to say that it's rare to find good benchmarks and methodology for Java code. Another example is power management on x86. Power management is great, but it can take a while for the CPUs to throttle up from low(er) frequencies to full throttle. And while I love "turbo" mode, it makes benchmarking applications with multiple threads a chore, as you have to remember to turn it off and then back on, otherwise short single-threaded runs may look abnormally fast compared to runs with higher thread counts. In general for performance characterization I disable turbo mode and fix the power governor at the "performance" state. Another source of complexity is the scheduler, which I've discussed in prior blog entries.

    Let's say I have a running application and I want to better understand its behavior and performance. We'll presume it's warmed up, is under load, and is in an execution mode representative of what we think the norm would be. It should be in steady-state, if a steady-state mode even exists. On Solaris the very first thing I'll do is take a set of "pstack" samples. Pstack briefly stops the process and walks each of the stacks, reporting symbolic information (if available) for each frame. For Java, pstack has been augmented to understand Java frames, and even report inlining. A few pstack samples can provide powerful insight into what's actually going on inside the program. You'll be able to see calling patterns, which threads are blocked on what system calls or synchronization constructs, memory allocation, etc. If your code is CPU-bound then you'll get a good sense of where the cycles are being spent. (I should caution that normal C/C++ inlining can diffuse an otherwise "hot" method into other methods. This is a rare instance where pstack sampling might not immediately point to the key problem.) At this point you'll need to reconcile what you're seeing with pstack and your mental model of what you think the program should be doing. They're often rather different. And generally if there's a key performance issue, you'll spot it with a moderate number of samples.

    I'll also use OS-level observability tools to look for the existence of bottlenecks where threads contend for locks, other situations where threads are blocked, and the distribution of threads over the system. On Solaris some good tools are mpstat and, to a lesser degree, vmstat. Try running "mpstat -a 5" in one window while the application program runs concurrently. One key measure is the voluntary context switch rate "vctx" or "csw", which reflects threads descheduling themselves. It's also good to look at the user, system, and idle CPU percentages. This can give a broad but useful understanding of whether your threads are mostly parked or mostly running. For instance, if your program makes heavy use of malloc/free, then it might be the case that you're contending on the central malloc lock in the default allocator. In that case you'd see malloc calling lock in the stack traces, observe a high csw/vctx rate as threads block for the malloc lock, and your "usr" time would be less than expected.

    Solaris dtrace is a wonderful and invaluable performance tool as well, but in a sense you have to frame and articulate a meaningful and specific question to get a useful answer, so I tend not to use it for first-order screening of problems. It's also most effective for OS- and software-level performance issues as opposed to HW-level issues. For that reason I recommend mpstat & pstack as the first step in performance triage. If some other OS-level issue is evident then it's good to switch to dtrace to drill more deeply into the problem. Only after I've ruled out OS-level issues do I switch to using hardware performance counters to look for architectural impediments.
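    To make the pstack-sampling step repeatable, a tiny wrapper that grabs a handful of samples a couple of seconds apart is usually enough. Here is a minimal sketch, assuming a system with pstack on the PATH; the PID, sample count and interval below are placeholders.

        # Sketch: collect a few pstack samples of a running process for later inspection.
        # Assumes 'pstack' is available on the PATH; pid/count/interval are placeholders.
        import subprocess
        import sys
        import time

        def sample_stacks(pid, count=5, interval=2.0):
            samples = []
            for _ in range(count):
                result = subprocess.run(["pstack", str(pid)],
                                        capture_output=True, text=True)
                samples.append(result.stdout)
                time.sleep(interval)
            return samples

        if __name__ == "__main__":
            pid = int(sys.argv[1])
            for n, snapshot in enumerate(sample_stacks(pid), start=1):
                print(f"=== sample {n} ===")
                print(snapshot)

    Eyeballing a handful of such snapshots side by side is often enough to spot the dominant calling patterns and blocked threads described above.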

    Read the article

  • How can I keep track of all the websites I've made like a proper business would?

    - by Mile
    A few other students and I are forming a group that wants to become good at what we do: websites. We are making websites for free for friends at the moment in order to get ourselves some experience and to learn from each other. We are about to finish our first website this week. In 6 months time we plan to have a portfolio and hope to start charging for websites. The issue is that we are all beginners and we are unsure about how to keep records of the websites we do. It is important as we may want to maintain a few websites or add to them later on. How does a proper web design business keep records of all info needed? Is there a program or software package we can use?

    Read the article

  • Now Available: Oracle Utilities Customer Self Service Version 2.1

    - by Roxana Babiciu
    The Oracle Utilities Global Business Unit is pleased to announce the general availability of Oracle Utilities Customer Self Service 2.1. It is ready for customers and partners to download and install via the Oracle Software Delivery Cloud. Key Features & Benefits: Oracle Utilities Customer Self Service 2.1 includes several new capabilities and enhancements, including significantly improved Commercial Account Management and Advanced Notification Management using a new Oracle Utilities Notification Center module (licensed separately). These include the following:
    • Advanced Notification Management
    • Online Issues and Forms Management
    • Budget Management and Billing for Billed Budgets
    • Prepaid User Dashboard
    • Enhanced Usage Details Web Presentment
    • Start/Stop/Transfer Service Automation
    • Payment Arrangement Automation
    • Account Sets Management for Large Commercial Customers
    • Multiple Account Usage Data Aggregation, Comparison, and Data Download
    • Multiple Account Financial History
    • Mobile Outage Maps
    More information can be found on OPN.

    Read the article

  • Installing Ubuntu on btrfs over multiple drives

    - by Tom Ato
    When I installed Ubuntu 13.04, I managed to combine a couple of outdated askubuntu answers, as well as some of the btrfs documentation in order to figure out how to install Ubuntu over two SSDs using a single btrfs partition (I think /boot was on a small ext4 partition). I want to install Ubuntu 13.10 in a similar way, using a single btrfs partition striping data over the two SSDs, but I don't feel comfortable synthesizing a method that I am sure will work with current software. What is the best way to partition and install Ubuntu over two SSDs using btrfs, in an effectively RAID 0 way?

    Read the article

  • pwmconfig: "There are no pwm-capable sensor modules installed"

    - by Sman789
    I'm trying to reduce my fan speed with fancontrol and pwmconfig because, despite the temperatures being the same, the fans are much louder on Linux (Ubuntu GNOME 14.04) than on Windows. I've followed the instructions in the first answer here, but when I run pwmconfig I get: pwmconfig: "There are no pwm-capable sensor modules installed". I know that my system has working thermal sensors, because PSensor has no trouble telling me my CPU temp and GPU temp. I would appreciate any help in reducing my fan speed to the level it is on Windows (which uses the ASUS AI Suite 3 software that came with the Z87-A motherboard, if that's relevant).

    Read the article

  • Install a Mirror without downloading all the packages in the official repository

    - by Sam
    I'll first explain the situation (the two PCs are running Ubuntu 12.04): I have a laptop which is connected to a WiFi connection, and a desktop which cannot be connected to the Internet (the modem is too far from it), and I want to install some software on the latter. (The two PCs are connected with an Ethernet cable.) I've already searched for a solution, but all I found was the use of software that should already have been installed on the "Internet-less PC" (Keryx, APTonCD ...). What I want to do is create a mirror on my laptop which contains the packages I have on it (located in /var/cache/apt/archive), and I don't want to download all the packages from the official repository, as I don't need them. Can someone tell me if this is possible? Thank you.

    Read the article

  • Why does sound stop working after a while?

    - by badp
    I don't know how to reproduce this problem, because I don't regularly play music or sound. All I know is that, sometimes, I'll load a video (from YouTube or from a local file) and there will be no sound. Everything looks fine software-wise:
    • Rebooting always fixes it.
    • aplay, paplay and pals give no error message.
    • I'm not in the audio group, as advised.
    • The device exists and appears in use:

    $ lsof /dev/snd/by-path/pci-0000\:00\:1b.0
    COMMAND    PID    USER  FD   TYPE  DEVICE  SIZE/OFF  NODE  NAME
    pulseaudi  17313  badp  23u  CHR   116,10  0t0       7628  /dev/snd/by-path/../controlC0
    pulseaudi  17313  badp  30u  CHR   116,10  0t0       7628  /dev/snd/by-path/../controlC0

    Restarting pulseaudio or alsa seems to do no good. What is wrong here?

    Read the article

  • Where's my memory going?

    - by Stu2000
    My machine keeps 'freezing' before eventually logging out with all the programs exiting. This is rather annoying, and I think it's because I keep running out of memory. I am not running any custom software, just NetBeans, Chrome, etc. (stuff I usually run on other Ubuntu computers without issue). For some reason my memory usage is through the roof, as seen here, but I can't quite figure out why. Here is a screenshot which may be useful, with htop and GNOME System Monitor open as user and as root. I notice that my console-kit-daemon is taking up about a gig of 'virtual memory'. Is that normal? Any tips/advice will be helpful. In the meantime I have ordered 2 x 4 GB RAM sticks to try and just throw hardware at the issue.

    Read the article

  • Is the "One Description Table to rule them all" approach good?

    - by DavRob60
    Long ago, I worked (as a client) with a piece of software which used a centralized table for its codified elements. Here, as far as I remember, is what the table looked like:
    Table_Name (PK)
    Field_Name (PK)
    Code (PK)
    Sort_Order
    Description
    So, instead of creating a table every time they needed a codified field, they were just adding rows to this table with the new Table_Name and Field_Name. I'm sometimes tempted to use this pattern in databases I design, but I have resisted it so far; I think there's something wrong with it, but I cannot put my finger on it. Is it just because you end up with some of the structural logic inside the data, or something else?

    Read the article

  • Continuous integration testing server: hosted, own desktop, or own server

    - by Victor
    For testing, I am planning to set up continuous integration. There are mainly two options: hosted, or my own desktop/server. I will break it into the three options I have:
    • Hosted: economical ($10-20/month for a small app); less setup, since the CI company manages all hardware and software.
    • Desktop: I could just buy a simple, cheap desktop as a test server (about $500).
    • Used server: my current office is offloading some old Dell rack servers (probably dual-core Xeon), which I can purchase for $50 or less.
    Please advise me which option best serves a small team of 2-3 developers. Thanks.

    Read the article
