Search Results

Search found 4363 results on 175 pages for 'requirements gathering'.

Page 13 of 175

  • What are the technical and programming requirements for writing a stealth keylogger?

    - by user970533
    I'm planning to write a stealth keylogger that would bypass detection by a certain antivirus. (I don't want to name the vendor, as I know how well Google indexes StackExchange sites.) I don't want to just download a keylogger from the internet and try to encode it to evade detection. Writing the code myself, I'd have the ability to make changes as I go and to apply obfuscation at both the high and low level; I like having control, too. It may sound naive, but is it true that keyloggers are a thing of the past, probably because of how effective AVs have become at detecting such programs? I'd like some good pointers on how one can write a robust, effective keylogger, preferably for a Windows environment.

    Read the article

  • What payment processors fulfill these requirements, and what are their pluses and minuses? [on hold]

    - by Sharen Eayrs
    Requirements:
      - Accept credit cards
      - Allow me to automatically credit a customer's account. We're not selling e-books; we're selling a credit to an account, so it's important that customers do not get credited twice.
      - Easy to program and integrate with our sites
      - Has an affiliate program
      - Transfers money to bank accounts quickly
      - Accepts merchants from many countries
      - No monthly fee is a plus
    I am thinking of using 2co.com, avangate.com, or clickbank.com. Some people recommend https://stripe.com/us/features (is it easy to implement?) or http://www.paymentwall.com/. There are so many payment processors that I am very confused. Can anyone tell me the pluses and minuses of those, or perhaps of others?
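    As one data point on the "is Stripe easy to implement?" question: below is a minimal charge sketch using Stripe's .NET client (Stripe.net). It assumes the current shape of that library's API; the API key, token, and idempotency key are placeholder values. The idempotency key is the piece relevant to the "never credit a customer twice" requirement: retrying a request with the same key cannot create a second charge.

        using Stripe;

        class StripeChargeSketch
        {
            static void Main()
            {
                StripeConfiguration.ApiKey = "sk_test_..."; // placeholder test-mode key

                var service = new ChargeService();
                var charge = service.Create(
                    new ChargeCreateOptions
                    {
                        Amount = 2000,       // smallest currency unit, i.e. $20.00
                        Currency = "usd",
                        Source = "tok_visa", // a Stripe test token
                        Description = "Account credit",
                    },
                    // Same key on a retry => no duplicate charge/credit.
                    new RequestOptions { IdempotencyKey = "credit-order-12345" });

                System.Console.WriteLine(charge.Status);
            }
        }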

    Read the article

  • What is a good algorithm to distribute items with specific requirements?

    - by user66160
    I have to programmatically distribute a set of items to some entities, but there are rules on both the items and the entities, like so:
      - Item one: 100 units, only entities from Foo
      - Item two: 200 units, no restrictions
      - Item three: 100 units, only entities that have Bar
      - Entity one: only items that have Baz
      - Entity one hundred: no items that have Fubar
    I only need to be pointed in the right direction; I'll research and learn the suggested methods.
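    This is essentially a constrained assignment problem: items with capacities on one side, entities on the other, and an edge only where both sides' rules allow the pairing. A max-flow formulation gives exact answers when constraints are tight; for looser cases a greedy pass is often enough. Below is a minimal greedy sketch in C# - the Item/Entity shapes and the rule predicates are hypothetical stand-ins, not a known API:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class Item   { public string Name; public int Units; public Func<Entity, bool> Accepts; }
        class Entity { public string Name; public Func<Item, bool> Accepts; }

        static class Allocator
        {
            // Greedy: hand each item's units to the entities permitted by both rule sets.
            // Not optimal in tight cases; an exact solver would model this as max-flow
            // (source -> items -> entities -> sink, with capacities on the edges).
            public static Dictionary<(string Item, string Entity), int> Allocate(
                IEnumerable<Item> items, IList<Entity> entities, int unitsPerEntity)
            {
                var plan = new Dictionary<(string Item, string Entity), int>();
                foreach (var item in items)
                {
                    int remaining = item.Units;
                    foreach (var e in entities.Where(e => item.Accepts(e) && e.Accepts(item)))
                    {
                        if (remaining == 0) break;
                        int give = Math.Min(unitsPerEntity, remaining);
                        plan[(item.Name, e.Name)] = give;
                        remaining -= give;
                    }
                }
                return plan;
            }
        }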

    Read the article

  • Which browsers handle `Content-Encoding: gzip`, and which of them have any special requirements on encoding quality?

    - by user1049847
    I am creating a "hand-made" HTTP/1.0 and HTTP/1.1 server. I recently integrated a zip library, so I can now stream gzip-encoded data in and out. I wonder:
      - Which major browsers (the ones still alive - IE6-IE10, Chrome, FF, etc.) send Accept-Encoding: deflate, gzip, ... and so can handle Content-Encoding: gzip today?
      - Which of them send any quality expectations?
      - Which of them can send a gzip-encoded POST request and multipart/form-data to my server?
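    On the quality-expectations point: a client may attach a q parameter to each coding in Accept-Encoding (per RFC 7231), e.g. Accept-Encoding: gzip;q=0.8, deflate;q=0.5, identity;q=0, where a missing q defaults to 1.0 and q=0 means "not acceptable". A minimal parser sketch in C# (illustrative only, not tied to any particular server framework):

        using System;
        using System.Collections.Generic;
        using System.Globalization;
        using System.Linq;

        static class AcceptEncodingParser
        {
            // Turns "gzip;q=0.8, deflate;q=0.5" into (coding, q) pairs,
            // dropping codings with q=0 and sorting by client preference.
            public static List<(string Coding, double Q)> Parse(string header)
            {
                var result = new List<(string Coding, double Q)>();
                foreach (var part in header.Split(','))
                {
                    var pieces = part.Split(';').Select(p => p.Trim()).ToArray();
                    double q = 1.0; // default per RFC 7231
                    foreach (var p in pieces.Skip(1))
                        if (p.StartsWith("q=", StringComparison.OrdinalIgnoreCase) &&
                            double.TryParse(p.Substring(2), NumberStyles.Float,
                                CultureInfo.InvariantCulture, out var parsed))
                            q = parsed;
                    if (q > 0) result.Add((pieces[0], q));
                }
                return result.OrderByDescending(t => t.Q).ToList();
            }
        }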

    Read the article

  • PES 2012 results in an Nvidia error on a Dell N5110 regarding System Requirements?

    - by ???? ???????
    I bought a Dell N5110 laptop and installed PES 2012. The Nvidia graphics card has 1 GB of RAM, but when I try to run the game a message appears saying there is only 128 MB of graphics memory: "Warning: Your computer does not meet the Minimum System Requirements to run this software. As a result, you may experience errors during operation. Your Video Card does not meet the required specifications. (GPU: VRAM 128MB)" What could be the problem?

    Read the article

  • Alignment requirements: converting basic disk to dynamic disk in order to set up software RAID?

    - by 0xC0000022L
    On Windows 7 x64 Professional I am struggling to convert a basic disk to a dynamic one. Under Disk Management in the MMC the conversion is supposed to be offered automatically, but it isn't. My guess: because I used third-party partitioning tools, there isn't enough space before and after the partitions (system-reserved/boot + system volume) to store the required metadata. When manually demoting a dynamic disk to a basic one, I noticed that some space seems to be required both before and after the partitions. What are the exact alignment requirements that allow the on-board Windows tools to do the conversion?

    Read the article

  • Why did Windows change the elevation requirements of my AutoHotkey script, and how can I prevent this in the future?

    - by monsto
    I was working on an AutoHotkey (AHK) script to create prefab mouse movements for a very simple model viewer. I worked on it for a good hour, zipped the script, posted it to a forum, and thought, "Oh hey, I should add bla bla bla to the script." When I returned to the program, the AHK script no longer worked. I could see the mouse movements working in other programs (Notepad, Chrome, etc.), but not in the one I had been working with for the previous hour. After several hours of throwing darts at the troubleshooting wall, I discovered that the fix was to set AHK.exe to Run as Administrator. The questions here are multiple:
      - Why did Windows 7, in all its wisdom, suddenly decide that elevation was necessary in the middle of a session?
      - Can these permission requirements somehow be reverted by, say, removing a key from the registry or something?
      - How can this kind of Windows behaviour be avoided in the future?

    Read the article

  • What requirements does an IT department workspace need?

    - by Rob
    Hello all, I need to provide a list of workspace requirements to the IT director for my network operations team. So far I've got:
      - A secure workspace, so nothing gets stolen and people can't come up to us asking for support (they need a ticket from the helpdesk)
      - A quiet area, so that we can work without being disturbed by the loud project managers who sometimes play soccer in the office
      - A large table or desk where we can set up and/or configure systems and servers if needed
    What else do we need? Thanks in advance.

    Read the article

  • How can I judge the suitability of modern processors for systems with specific CPU requirements?

    - by Iszi
    Inspired by this question: How do I calculate clock speed in multi-core processors? The answers there do a fair job of explaining why a lower-speed multi-core processor won't necessarily perform at the same level as a higher-speed single-core processor. Example: 4 x 2 = 8, but a quad-core 2 GHz processor isn't necessarily as fast as a single-core 8 GHz processor. However, I'm having a hard time putting the information in those answers to practical use. In particular, I want to know how to judge whether a given CPU is appropriate for an application with specific requirements. Example scenarios:
      - An application has a minimum CPU requirement of a 2.4 GHz dual-core.
      - Another application has a minimum CPU requirement of a 1.8 GHz single-core.
    For either of the above scenarios: would a higher-speed processor with fewer cores, or a lower-speed processor with more cores, be equally sufficient? If so, how can we determine the appropriate processor speeds required for a given number of cores?

    Read the article

  • Analysing a Visual FoxPro application to derive requirements. Tools/approaches/practices?

    - by Kabeer
    Hello. In an upcoming project I am supposed to re-engineer a huge application built on Visual FoxPro into a web application on the .NET platform. From the end users' perspective the application is very complex (complex forms, reports, navigation, etc.). The sorry state is that there are no documents available from which I can derive the business processes, business rules, workflows, validations, application state, etc. I can gather some requirements from end users, but that cannot be complete from any perspective. Manoeuvring through the code would be tedious and time-consuming, given the millions of lines of it. I am therefore looking for a tool that can help me with code analysis. My googling didn't turn anything up, at least not for a Visual FoxPro code base. Besides that, I would appreciate it if someone could share processes/approaches/techniques to establish the requirements as far as possible. BTW, this link didn't quite help.

    Read the article

  • The Incremental Architect's Napkin - #1 - It's about the money, stupid

    - by Ralf Westphal
    Originally posted on: http://geekswithblogs.net/theArchitectsNapkin/archive/2014/05/24/the-incremental-architectacutes-napkin---1---itacutes-about-the.aspx

    Software development is an economic endeavor. A customer is only willing to pay for value. What makes a software valuable is required to become a trait of the software. We as software developers thus need to understand, and then find a way to implement, requirements.

    Whether, or to what extent, a customer can really know beforehand what's going to be valuable for him/her in the end is a topic of constant debate. Some aspects of the requirements might be less foggy than others. Sometimes the customer does not know what he/she wants. Sometimes he/she is certain to want something - but then is not happy when it's delivered. Nevertheless, requirements exist. And developers will only be paid if they deliver value. So we had better focus on doing that.

    Although it might sound trivial, I think it's important to state the corollary: we need to be able to trace anything we do as developers back to some requirement. You decide to use Go as the implementation language? Well, what's the customer requirement this decision is linked to? You decide to use WPF as the GUI technology? What's the customer requirement? You decide in favor of a layered architecture? What's the customer requirement? You decide to put code in three classes instead of just one? What's the customer requirement behind that? You decide to use MongoDB over MySQL? What's the customer requirement behind that? And so on.

    I'm not saying any of these decisions are wrong. I'm just saying: whatever you decide, be clear about the requirement that's driving your decision. You have to be able to answer the question: why do you think X will deliver more value to the customer than the alternatives?

    Customers are not interested in romantic ideals of hard-working, well-meaning, quality-focused craftsmen. They don't care how and why you work - as long as what you deliver fulfills their needs. They want to trust you to recognize this as your top priority - and then deliver. That's all.

    Fundamental aspects of requirements

    If you're like me, you're probably not used to such scrutiny. You want to be trusted as a professional developer - and decide quite a few things by gut feeling, or by relying on "established practices". That's OK in general and most of the time - but still... I think we should be more conscious about our decisions, which would make us more responsible, even more professional. But without further guidance it's hard to reason about the myriad decisions we have to make over the course of a software project.

    What I have found helpful in this situation is structuring requirements into fundamental aspects. Instead of one large heap of requirements there are then smaller blobs, and with them it's easier to check whether a decision falls in their scope. Sure, every project has its very own requirements, but all of them belong to just three major categories, I think. Any requirement pertains either to functionality, to non-functional aspects, or to sustainability. For short, I call those aspects:
      - Functionality, because such requirements describe which transformations a software should offer. For example: a calculator should be able to add and multiply real numbers; an auction website should enable you to set up an auction anytime, or to find auctions to bid on.
      - Quality, because such requirements describe how the functionality is supposed to work, e.g. fast or secure. For example: a calculator should be able to calculate the sine of a value much faster than you could in your head; an auction website should accept bids from millions of users.
      - Security of Investment, because functionality and quality need not just be delivered in any way. It's important to the customer to get them quickly - and not only today, but over the course of several years. This aspect introduces time into the "requirements equation".

    Security of Investment (SoI) certainly is a non-functional requirement. But I think it's important not to subsume it under the Quality (Q) aspect, because SoI has quite special properties. For one, SoI for software means something completely different from what it means for hardware. If you buy hardware (a car, a hair dryer), you consider it a worthwhile investment if it does not change its functionality or quality over time. A car still running smoothly, with hardly any rust spots, after 10 years of daily use would be a very secure investment. So for hardware (or material products, if you like), "unchangeability" in the face of usage is desirable. With software you want the contrary: software that cannot be changed is a waste. SoI for software means changeability. You want to be sure that the software you buy or order today can be changed, adapted, and improved over an unforeseeable number of years, so as to fit changes in its usage environment.

    But that's not the only reason the SoI aspect is special. On top of changeability[1] (or evolvability) comes immeasurability. Evolvability cannot readily be measured by counting something. Whether the changeability is as high as the customer wants it cannot be determined by looking at metrics like Lines of Code, Cyclomatic Complexity, or Afferent Coupling. They may give a hint... but they are far, far from precise. That's because of the nature of changeability: it's different from performance or scalability. Also, a customer cannot tell upfront "how much" evolvability he/she wants. Whether requirements regarding Functionality (F) and Q have been met, a customer can tell you very quickly and very precisely: a calculation is missing, the calculation takes too long, the calculation time degrades under increased load, the calculation is accessible to the wrong users, etc. That's all very - or at least comparatively - easy to determine. But changeability... that's a whole different thing. Nevertheless, over time the customer will develop a feeling for whether changeability is good enough or degrading. He/she just has to watch the development of the frequency of "WTF"s from developers ;-)

    F and Q are "timeless" requirement categories; customers want us to deliver on them now. Just focusing on the now, though, is rarely beneficial in the long run. So SoI adds a counterweight to the requirements picture. Customers want SoI - whether they know it or not, whether they state it explicitly or not.

    In closing

    A customer's requirements are not monolithic. They are not all made the same; rather, they fall into different categories. We as developers need to recognize these categories when confronted with a requirement - and take them into account. Only then can we make truly professional decisions, i.e. conscious and responsible ones.

    [1] I call this fundamental trait of software "changeability" rather than "flexibility" to distinguish whose concern it is. "Flexibility" to me means that the software as-is can easily be adapted to a change in its environment, e.g. by tweaking some config data or adding a library which gets picked up by a plug-in engine. Flexibility thus is a matter for some user. "Changeability", on the other hand, means that the software can easily be changed in its structure to adapt it to new requirements. That's a matter for the software developer.

    Read the article

  • Concise C# code for gathering several properties with a non-null value into a collection?

    - by stakx
    A fairly basic problem for a change. Given a class such as this:

        public class X
        {
            public T A;
            public T B;
            public T C;
            ...
            // (other fields, properties, and methods are not of interest here)
        }

    I am looking for a concise way to code a method that will return all A, B, C, ... that are not null in an enumerable collection. (Assume that declaring these fields as an array is not an option.)

        public IEnumerable<T> GetAllNonNullAs(this X x)
        {
            // ?
        }

    The obvious implementation of this method would be:

        public IEnumerable<T> GetAllNonNullAs(this X x)
        {
            var resultSet = new List<T>();
            if (x.A != null) resultSet.Add(x.A);
            if (x.B != null) resultSet.Add(x.B);
            if (x.C != null) resultSet.Add(x.C);
            ...
            return resultSet;
        }

    What's bothering me here in particular is that the code looks verbose and repetitive, and that I don't know the initial List capacity in advance. It's my hope that there is a more clever way, probably something involving the ?? operator? Any ideas?
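    One concise possibility is to build a temporary array and filter it with LINQ. A sketch (making X generic over T and constraining T to a reference type is an assumption for self-containedness; the ?? operator doesn't actually help here, since there is no sensible fallback for a null field):

        using System.Collections.Generic;
        using System.Linq;

        public class X<T> where T : class
        {
            public T A, B, C;
        }

        public static class XExtensions
        {
            // Lists each field exactly once; Where() drops the nulls lazily,
            // so no List has to be pre-sized.
            public static IEnumerable<T> GetAllNonNullAs<T>(this X<T> x) where T : class =>
                new[] { x.A, x.B, x.C }.Where(v => v != null);
        }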

    Read the article

  • MySQL/PHP: How to insert the logged-in user id into another table that is gathering data from a form

    - by Lisa
    For the first time I need to join information from two tables, and I am quite nervous about doing it without any advice first. Basically, I am building a secure site that is accessed by authorised users. I have my login table with user_id, username, password. Once the user is on the site, they have the option of inputting data into another table called input. At the moment this table only captures the information that is entered, not the user_id or username of the inputter. I would like the form to be able to insert the user_id and/or username from the login table into the input table. Please could somebody talk me through this process? I am sure that once this is amended, I will then be able to use the table to only allow the logged-in user to access the information that he or she has inputted - is that correct? Many thanks

    Read the article

  • Is this time-related process accounting stats gathering appropriate?

    - by Ceko Cakata
    Based on sys/acct.h (V1, not V3), I need to gather some per-user usage statistics with a parser that parses the acct file line by line. The parser will run every N seconds, parse the entire file, and gather the user statistics accumulated since the last run (N seconds back). I'm not sure what the most appropriate way to do this is, given the info provided by sys/acct.h. Maybe something like this:

        if ((ac_btime + ac_etime) < (current_time - N)) { gather; }

    Also, comp_t is said to be a "floating-point value consisting of a 3-bit, base-8 exponent, and a 13-bit mantissa", but I think u_int16_t is just an unsigned short int. Should I be converting it to long with the provided formula or not?
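    On the comp_t question: yes, the raw 16-bit value needs decoding before it means anything. It packs a 13-bit mantissa in the low bits and a 3-bit base-8 exponent in the top bits, so the decoded value is mantissa << (3 * exponent), i.e. mantissa * 8^exponent. A sketch of the conversion (shown in C# only to match the other examples on this page; the bit-twiddling is identical in C):

        static class CompT
        {
            // Decodes the classic acct(5) comp_t format:
            // top 3 bits = base-8 exponent, low 13 bits = mantissa.
            public static long Decode(ushort c)
            {
                int exponent = (c >> 13) & 0x7;
                long mantissa = c & 0x1FFF;
                return mantissa << (3 * exponent); // multiply by 8^exponent
            }
        }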

    Read the article

  • The Incremental Architect's Napkin - #2 - Balancing the forces

    - by Ralf Westphal
    Originally posted on: http://geekswithblogs.net/theArchitectsNapkin/archive/2014/06/02/the-incremental-architectacutes-napkin---2---balancing-the-forces.aspx

    Categorizing requirements is the prerequisite for economic architectural decisions. Not all requirements are created equal. However, to truly understand and describe the requirement forces pulling on software development, I think further examination of the requirements aspects is warranted.

    Aspects of Functionality

    There are two sides to Functionality requirements. It's about what a software should do. I call that the Operations it implements. Operations are defined by expressions and control structures, or calls to frameworks of some sort, i.e. (business) logic statements. Operations calculate, transform, aggregate, validate, send, receive, load, store, etc. Operations are about behavior; they take input and produce output by considering state. I'm not using the term "function" here, because functions - or methods or sub-programs - are not necessary to implement Operations. Functions belong to a different sub-aspect of requirements (see below).

    Operations alone are not enough, though, to make a customer happy with regard to his/her Functionality requirements. Only correctly implemented Operations provide full value. This should make clear why testing is so important - and not just manual tests during development of some operational feature, but automated tests. Only automated tests scale when the number of operations increases over time. Without automated tests there is no guarantee that formerly correct operations are still correct after more get added; retesting all previous operations manually is infeasible. So whoever relies on manual tests alone is not really balancing the two forces, Operations and Correctness. With manual tests, more weight is put on the Operations side of the scale. That might be OK for a short period of time - but in the long run it will bite you. You need to plan for Correctness in the long run, from the first day of your project.

    Aspects of Quality

    As important as Functionality is, it's not the driver for software development. No software has ever been written just to implement some operation in code. We don't need computers just to do something. Everything computers can do with software, we could do without them - at least given enough time and resources. We could calculate the most complex formulas without computers. We could run auctions with millions of people without computers. The only reason we want computers to help us with these and a million other Operations is: we don't want to wait very long for the results. Or we want fewer errors. Or we want easier access to complicated solutions.

    So the main reason for customers to buy or order software is some Quality. They want some Functionality with a higher Quality (e.g. performance, scalability, usability, security...) than they would get without the software. But Qualities come in at least two flavors.

    Most important are Primary Qualities. Those are the Qualities the software truly is written for. Take an online auction website, for example. Its Primary Qualities are performance, scalability, and usability, I'd say. Auctions should come within reach of millions of people; setting up an auction should be very easy; finding a suitable auction and bidding on it should be as fast as possible. Only once those Qualities have been implemented does security become relevant. A secure auction website is important - but not as important as a fast auction website. Nobody would want to use the most secure auction website if it were unbearably slow, but there would be people willing to use the fastest auction website even if it were lacking security. That's why security - with regard to online auction software - is not a Primary Quality but just a Secondary Quality. It's a supporting quality, so to speak; it does not deliver value by itself. With a password manager this might be different: there, security might be a Primary Quality.

    Please get me right: I don't want to denigrate any Quality. There's a long list of non-functional requirements at Wikipedia. They are all created equal - but that does not mean they are equally important for all software projects. When confronted with Quality requirements, check with the customer which are primary and which are secondary. That will help you make good economic decisions when in a crunch. Resources are always limited - but requirements are a bottomless ocean.

    Aspects of Security of Investment

    Functionality and Quality are traditionally the requirement aspects cared for most - by customers and developers alike. Even today, when pressure rises in a project, tunnel vision focuses on them, and any measures to create and maintain Security of Investment (SoI) are out the window pretty quickly. Resistance to customers and/or management is futile. As long as SoI is not placed on an equal footing with Functionality and Quality, it is bound to suffer under pressure. Looking more closely at what SoI means will help us become more conscious about it, and make customers and management aware of the risks of neglecting it. SoI, to me, has two facets:

    Production Efficiency (PE) is about speed of delivering value. Customers like short response times, and short response times mean less money spent. So whatever makes software development faster supports this requirement. This must not lead to duct-tape programming and banging out features by the dozen, though, because customers don't just want Operations and Quality, but also Correctness. If Correctness gets compromised by focusing too much on Production Efficiency, it will fire back. Customers want PE not just today but over the whole course of a software's lifecycle. That means it's not just about coding speed, but equally about code quality. If code quality leads to rework, PE is at an unsatisfactory level; likewise if code production leads to waste, because the effort which went into waste could have been used to produce value. Rework and waste cost money, and rework and waste abound as long as PE is not addressed explicitly with management and customers. Thanks to the Agile and Lean movements, that's increasingly the case. Nevertheless more could and should be done in many teams. Each and every developer should keep in mind that Production Efficiency is as important to the customer as Functionality and Quality - whether he/she states it or not.

    Making software development more efficient is important - but sooner or later even agile projects are going to hit a glass ceiling, at least as long as they neglect the second SoI facet: Evolvability. Delivering correct, high-quality functionality in short cycles today is good, but not just any software structure will allow this to happen for an indefinite amount of time.[1] The less explicitly a software was designed, the sooner it's going to get stuck. Big ball of mud, monolith, brownfield, legacy code, technical debt... there are many names for software structures that have lost the ability to evolve, to be easily changed to accommodate new requirements. An evolvable code base is the opposite of a brownfield. It's code which can be easily understood (by developers with sufficient domain expertise) and then easily changed to accommodate new requirements. Ideally the cost of adding feature X to an evolvable code base is independent of when it is requested - or at least the cost should only increase linearly, not exponentially.[2] Clean Code, Agile Architecture, and even traditional Software Engineering are concerned with Evolvability. However, it seems no systematic way of achieving it has been laid out yet. TDD + SOLID help - but still... when I look at the design ability in teams, I see much room for improvement.

    As stated previously, SoI - or, to be more precise, Evolvability - can hardly be measured. Plus, the customer rarely states an explicit expectation with regard to it. That's why I think special care must be taken not to neglect it. Postponing it to some large refactorings should not be an option; rather, Evolvability needs to be a core concern of every single developer day. This does not mean Evolvability is more important than any of the other requirement aspects - but neither is it less important. That's why more effort needs to be invested in it, to bring it on par with the other aspects, which usually are much more in focus.

    In closing

    As you see, requirements are of quite different kinds. Not taking that into account makes it harder to understand the customer and to make economic decisions. The sub-aspects of requirements are forces pulling in different directions. Improving performance might have an impact on Evolvability; increasing Production Efficiency might have an impact on security; etc. No requirement aspect should go unchecked when deciding how to allocate resources. Balancing should be explicit, and it should be possible to trace each decision back to a requirement. Why is there a null check on parameters at the start of the method? Why are there 5,000 LOC in this method? Why are there interfaces on those classes? Why is this functionality running on the thread pool? Why is this function defined on that class? Why does this class depend on three other classes? These and a thousand more questions are not meant to imply anything should be different in a code base - but it's important to know the reasons behind all of these decisions, because not knowing the reason possibly means waste and a suboptimal decision.

    And how do we ensure we balance all requirement aspects? That needs practices and transparency. Practices means doing things a certain way and not another, even though that might be possible. We're dealing with dangerous tools here, like a knife is a dangerous tool. Harm can be done if we use our tools in just any way, at the whim of the moment. Over the centuries, rules and practices have been established for how to use knives. You don't put them in people's legs just because you feel like it. You hand over a knife with the handle towards the receiver. You might not even be allowed to cut round food like potatoes or eggs with it. The same should be the case for dangerous tools like object orientation, remote communication, threads, etc. We need practices so that, in using them, requirements are balanced almost automatically.

    In addition, to be able to work on software as a team, we need transparency. We need means to share our thoughts, to work jointly on mental models. So far our tools are focused on working with code: testing frameworks, build servers, DI containers, IntelliSense, refactoring support... That's all nice and well, and I don't want to miss any of it. But I think it's not enough. We're missing mental tools, tools for making thinking and talking about software (independently of code) easier. You might think enough such tools already exist, like all those UML diagram types or flow charts. But then, isn't it strange that hardly any team is using them to design software? Or is that just due to a lack of education? I don't think so. It's a matter of value/weight ratio: the current mental tools are too heavyweight compared to the value they deliver.

    So my conclusion is: we need lightweight tools to really be able to balance requirements. Software development is complex, and we need guidance not to forget important aspects. That's like flying an airplane. Pilots don't just jump in and take off for their destination. Yes, there are times when they are "flying by the seats of their pants", when they are just experts doing things intuitively. But most of the time they are going through honed practices called checklists. See "The Checklist Manifesto" for very enlightening details on this. Maybe I should say it like this: we need more checklists for the complex business of software development.[3]

    [1] But that's what software development mostly is about: changing software over an unknown period of time. It needs to be corrected in order to finally provide the promised operations. It needs to be enhanced to provide ever more operations and qualities. All this without knowing when it's going to stop - probably never, until "maintainability" hits a wall when the technical debt is too large, the brownfield too deep. Software development is not a sprint, not a marathon, not even an ultra-marathon, because all of those have a foreseeable end. Software development is like running continuously and forever.

    [2] And sometimes I dare to think that costs could even decrease over time. Think of it: with each feature, a software becomes richer in functionality. So with each additional feature, the chance increases that there is already functionality helping its implementation. That should lead to lower costs for feature X if it's requested later rather than sooner; X requested later could stand on the shoulders of previous features. Alas, reality seems to be far from this, despite 20+ years of admonishing developers to think in terms of reusability.

    [3] Please don't get me wrong: I don't want to bog down the "art" of software development with heavyweight practices and heaps of rules to follow. The framework we need should be lightweight. It should not stand in the way of delivering value to the customer; its purpose is rather to make that easier, by helping us focus and decreasing waste and rework.

    Read the article

  • What are the requirements for running a .NET application on Windows CE 5?

    - by citronas
    What are the requirements for running a .NET Framework-targeted application on Windows CE 5? When I try to start an application that I developed for Windows Mobile 6, I get the error: "Cannot find 'Projectname' (or one of its components). Make sure the path and filename are correct and that all the required libraries are available." My app is a simple hello-world app. What could cause such an error message? Is there a way to determine which .NET Compact Framework version is running on my system?
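    On the version question, one quick check is a sketch like the one below. Note the caveat: Environment.Version reports the CLR version actually hosting the app (e.g. 2.0.x or 3.5.x); it does not enumerate every Compact Framework version installed on the device.

        using System;
        using System.Windows.Forms;

        static class CfVersionCheck
        {
            static void Main()
            {
                // Reports the Compact Framework CLR version running this app.
                MessageBox.Show("CF runtime: " + Environment.Version);
            }
        }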

    Read the article

  • What are the requirements for an application health monitoring system?

    - by Steven A. Lowe
    What, at a minimum, should an application health-monitoring system do for you (the developer) and/or your boss (the IT manager) and/or the operations (on-call) staff? What else should it do above the minimum requirements? Is monitoring the "infrastructure" applications (MS Exchange, Apache, etc.) sufficient, or do individual user applications, web sites, and databases also need to be monitored? If the latter, what do you need to know about them? ADDENDUM: Thanks for the input; I was really looking for application-level monitoring, not infrastructure monitoring, but it is good to know about both.

    Read the article

  • Can QuickGraph support these requirements? (includes database persistence support)

    - by Greg
    Hi, would QuickGraph be able to help me out with my requirements below?
    (a) I want to model a graph of nodes and directional relationships between nodes - for example, to model web pages/files linked under a URL, or to model IT infrastructure and the dependencies between hardware/software. The library would include methods such as:
      - Node.GetDirectParents() // i.e. there could be more than one direct parent for a node
      - Node.GetRootParents() // i.e. traverse the tree to the top root parent(s) for the given node
      - Node.GetDirectChildren()
      - Node.GetAllChildren()
    (b) I have to persist the data to a database - so it should support SQL Server, and ideally SQLite as well.
    If it does support these requirements, then I'd love to hear: any pointers to parts of QuickGraph to dig into? And what is the best usage concept with regard to database persistence - is it a simpler design to assume every search/method works directly on the database, or does QuickGraph have the smarts to work in memory and then "save" all changes to the database at an appropriate point in time (as ADO.NET does with DataTable, etc.)? Thanks in advance
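    On part (a): QuickGraph ships graph structures and traversal algorithms rather than the Node.* convenience methods named above, but its BidirectionalGraph (which indexes in-edges as well as out-edges) makes parent/child queries straightforward. A sketch of the idea - the extension methods here are hypothetical wrappers, not QuickGraph API:

        using System.Collections.Generic;
        using System.Linq;
        using QuickGraph;

        static class GraphQueries
        {
            // Direct parents = sources of the node's in-edges.
            public static IEnumerable<string> GetDirectParents(
                this BidirectionalGraph<string, Edge<string>> g, string node) =>
                g.InEdges(node).Select(e => e.Source);

            // Root parents = ancestors with no in-edges, found by walking upward.
            public static IEnumerable<string> GetRootParents(
                this BidirectionalGraph<string, Edge<string>> g, string node)
            {
                var seen = new HashSet<string>();
                var stack = new Stack<string>(g.GetDirectParents(node));
                while (stack.Count > 0)
                {
                    var v = stack.Pop();
                    if (!seen.Add(v)) continue;
                    if (g.InDegree(v) == 0) yield return v;
                    foreach (var p in g.GetDirectParents(v)) stack.Push(p);
                }
            }
        }

    As for (b): QuickGraph is an in-memory library (with GraphML serialization helpers), so persisting to SQL Server/SQLite would be a mapping layer of your own - typically load the graph into memory, work on it, and save changes back in a unit-of-work style.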

    Read the article

  • Would this union work if char had stricter alignment requirements than int?

    - by paxdiablo
    Recently I came across the following snippet, which is an attempt to ensure all bytes of i (and no more) are accessible as individual elements of c:

        union {
            int i;
            char c[sizeof(int)];
        };

    Now this seems a good idea, but I wonder if the standard allows for the case where the alignment requirements for char are more restrictive than those for int. In other words, is it possible to have a four-byte int which is required to be aligned on a four-byte boundary, with a one-byte char (it is one byte by definition, see below) required to be aligned on a sixteen-byte boundary? And would this break the use of the union above? Two things to note. I'm talking specifically about what the standard allows here, not what a sane implementor/architecture would provide. And I'm using the term "byte" in the ISO C sense, where it's the width of a char, not necessarily 8 bits.

    Read the article

  • Is there a way to only have StyleCop documentation requirement SA1600 on public methods/properties?

    - by AlSki
    I've been using XML doc comments for a few years now and have definitely grown into the mindset of supplying quality documentation for public methods and properties. However, under StyleCop (and particularly its ReSharper highlighting) I've noticed that the documentation requirements apply identically to public, internal, protected, and private methods. This seems a little counter-intuitive to me, so I would ideally like to downgrade it to a suggestion, at least for private methods. Unfortunately, the suppress setting does seem to apply across all of public, internal, private, etc. Am I missing something, or is this by design?
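    One workaround is the standard SuppressMessage attribute, which StyleCop honors. A sketch (it suppresses rather than downgrades, and has to be applied per member or per enclosing scope, so it's targeted rather than a global "privates exempt" switch):

        using System.Diagnostics.CodeAnalysis;

        public class Worker
        {
            // Suppresses SA1600 for just this private helper;
            // public members remain subject to the documentation rule.
            [SuppressMessage("StyleCop.CSharp.DocumentationRules",
                "SA1600:ElementsMustBeDocumented",
                Justification = "Private helper; self-explanatory.")]
            private void Helper()
            {
            }
        }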

    Read the article

  • Setting up a home server - what to use? (ZFS vs btrfs, BSD vs Linux, misc other requirements)

    - by monch1962
    I need to get all our home content off individual machines and onto a central server. What I'd like to have is the metaphorical "server under the stairs". Stuff we need:
      - Expandable storage. I want to be able to add extra discs as we go along, with minimal maintenance required. Currently we've got about 3 TB of files we need to host, and that's likely to grow by another TB every 6-12 months based on recent history, so I need to be able to add additional discs with minimal pain.
      - Storage for all the media (i.e. photos, video, music) we have, plus services to serve it to the various playback devices in the house (e.g. DAAP so we can play stuff through iTunes, ccxstream so we can play stuff over XBMC). DAAP and ccxstream are needed now, but we also need to support new standards as they emerge (so a closed-box solution isn't going to work).
      - RAID 5, or something broadly equivalent (e.g. RAID-Z)
      - BitTorrent client
      - ssh, NFS, Samba access
      - Snapshot capability (as in ZFS), so we can snapshot individual file systems regularly and roll back when my kids delete their school assignments the day before they're due...
      - Ability to recover quickly from power outages (it's not unusual for us to have power outages that last longer than our UPS's batteries)
      - FOSS software
      - A modern distributed version control system running on the box, such as Mercurial
    Stuff I'd like to have on the server, but can live without:
      - PVR capability, so I could record TV to the box
      - Web server. We currently run a small web server on a very old box, and I'd ideally like to turn the old box off and move the content to the new server, just to save some electricity.
      - Nagios + MRTG
    I've been looking at using an EEE Box as the server, primarily because I can get them cheap and they don't consume much power. The choice of OS and file system is more difficult. From what I've found:
      - I've got most experience with various Linux distros, but am happy to use another Unix
      - FreeBSD and OpenSolaris seem to be the best choices for hosting ZFS
      - OpenSolaris' hardware support is nowhere near as good as e.g. Ubuntu's
      - btrfs, while looking very good, doesn't seem ready for prime time yet
      - ZFS doesn't let you (easily?) add new discs to a RAID5 or RAID-Z
      - Reading around, it seems that ZFS is a bit short of tools for recovering lost data
    At the moment I'm leaning towards running FreeNAS+ZFS, but I'm concerned about the requirement to add new discs fairly regularly to an existing RAID-Z. Can anyone provide some recommendations, or share experiences? Thanks in advance

    Read the article
