Search Results

Search found 922 results on 37 pages for 'patrick skiba'.

Page 4 of 37

  • To ORM or Not to ORM. That is the question…

    - by Patrick Liekhus
    UPDATE: Thanks for the feedback and comments. I have adjusted my table below with your recommendations; I had missed a point or two.

    I wanted to do a series on creating an entire project using the EDMX XAF code generation and the SpecFlow BDD Easy Test tools discussed in my earlier posts, but I thought it would be appropriate to start with a simple comparison and the reasoning behind why I choose to use these tools.

    Let’s start by defining the term ORM, or Object-Relational Mapping. According to Wikipedia it is defined as the following: Object-relational mapping (ORM, O/RM, and O/R mapping) in computer software is a programming technique for converting data between incompatible type systems in object-oriented programming languages. This creates, in effect, a "virtual object database" that can be used from within the programming language.

    Why should you care? Basically it allows you to map the business objects in your code to the persistence layer behind them. And better yet, why would you want to do this? Let me outline it in the following points:

    - Development speed. No more repetitive mapping of query results to object members. Once the map is created, the code is rendered for you.
    - Persistence portability. The ORM knows how to map SQL-specific syntax for the persistence engine you choose. It does not matter if it is SQL Server, Oracle or another database of your choosing.
    - Standard/boilerplate code is simplified. The basic CRUD operations are consistent and can use database metadata for basic operations.

    So how does this help? Well, let’s compare some of the ORM tools that I have used and/or researched. I have been interested in ORM for some time now. My ORM of choice for a long time was NHibernate, and I still believe it has a strong case in some business situations. However, you have to take business considerations into account and the law of diminishing returns. Because of these two factors, my recent activity and experience has been around DevExpress eXpress Persistent Objects (XPO). The primary reason for this is that they have the DevExpress eXpress Application Framework (XAF) that sits on top of XPO. With this added value, the data model can be created (either database first or code first) and the Web and Windows clients can be created from these maps. While out of the box they provide some simple list and detail screens, you can very easily extend and modify these to your liking. DevExpress has done a tremendous job of providing enough framework while also staying out of the way when you need to extend it. This sounds worse than it really is. What I mean by this is that if you choose to follow the DevExpress coding style and recommendations, the hooks and extension points provided allow you to do some pretty heavy lifting while also not worrying about the basics.

    I have put together a list of the top features that I have used to compare the limited list of ORMs that I have exposure to. Again, the biggest selling point in my opinion is that XPO is just as solid as any of the other ORMs, but with the added layer of XAF they become unstoppable. And then couple that with the EDMX modeling tools and code generation, and it becomes a no-brainer.
    Designer Features                       | Entity Framework | NHibernate | Fluent NHibernate | Telerik OpenAccess | DevExpress XPO | DevExpress XPO/XAF plus Liekhus Tools
    Uses XML to map relationships           | -                | Yes        | -                 | -                  | -              | -
    Visual class designer interface         | Yes              | -          | -                 | -                  | -              | Yes
    Management integrated w/ Visual Studio  | Yes              | -          | -                 | Yes                | -              | Yes
    Supports schema first approach          | Yes              | -          | -                 | Yes                | -              | Yes
    Supports model first approach           | Yes              | -          | -                 | Yes                | Yes            | Yes
    Supports code first approach            | Yes              | Yes        | Yes               | Yes                | Yes            | Yes
    Attribute driven coding style           | Yes              | -          | Yes               | -                  | Yes            | Yes

    I have a very small team and limited resources with a lot of responsibilities. In order to keep up with our customers, we must rely on tools like these. We use the EDMX tool so that we can create a visual representation of the applications with our customers. Second, we rely on the code generation so that we can focus on the business problems at hand and not on whether a field is mapped correctly. This keeps us from requiring as many junior level developers on our team.

    I have also worked on multiple teams where they believed in writing their own "framework". In my experience and opinion this is not the route to take unless you have a team dedicated to supporting just the framework. Each time that I have worked on custom frameworks, the framework eventually becomes old, outdated and full of "performance" enhancements specific to one or two requirements. With an ORM, there are a lot of people smarter than me working on the bigger issue of persistence and performance. Again, my recommendation would be to use an available framework and get to working on your business domain problems. If your code is not making money for you, why are you working on it? Do you really need to be writing query-to-object-member code again and again? Thanks
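
    To give a feel for the attribute driven coding style mentioned in the table, here is a minimal sketch of an XPO persistent class. This is illustrative only: it assumes the DevExpress.Xpo assembly is referenced, and the class and property names are hypothetical rather than taken from the post.

        using System;
        using DevExpress.Xpo;

        // Minimal sketch of an attribute driven XPO class; attributes,
        // not XML files, describe how the object maps to the database.
        public class Album : XPObject
        {
            public Album(Session session) : base(session) { }

            private string title;
            [Size(100)]   // becomes a 100-character column in the store
            public string Title
            {
                get { return title; }
                set { SetPropertyValue("Title", ref title, value); }
            }

            private DateTime releaseDate;
            public DateTime ReleaseDate
            {
                get { return releaseDate; }
                set { SetPropertyValue("ReleaseDate", ref releaseDate, value); }
            }
        }

    With a class like this, XPO handles schema creation and the basic CRUD against whichever provider is configured, which is the persistence portability point made above.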

    Read the article

  • Setup and Use SpecFlow BDD with DevExpress XAF

    - by Patrick Liekhus
    Let’s get started with using the SpecFlow BDD syntax for writing tests with the DevExpress XAF EasyTest scripting syntax. In order for this to work you will need to download and install the prerequisites listed below. Once they are installed, follow the steps outlined below and enjoy.

    Prerequisites
    Install the following items:
    - DevExpress eXpress Application Framework (XAF), found here
    - SpecFlow, found here
    - Liekhus BDD/XAF Testing library, found here

    Assumptions
    I am going to assume at this point that you have created your XAF application and have your Module, Win.Module and Win projects ready for usage. You should have also set any attributes and/or settings as you see fit.

    Setup
    So where to start?
    1. Create a new testing project within your solution. I typically name this with a similar naming convention as used by XAF: my project name plus .FunctionalTests (i.e. AlbumManager.FunctionalTests).
    2. Add the following references to your project. It should look like the reference list below.
       - DevExpress.Data.v11.x
       - DevExpress.Persistent.Base.v11.x
       - DevExpress.Persistent.BaseImpl.v11.x
       - DevExpress.Xpo.v11.2
       - Liekhus.Testing.BDD.Core
       - Liekhus.Testing.BDD.DevExpress
       - TechTalk.SpecFlow
       - TestExecutor.v11.x (found in %Program Files%\DevExpress 2011.x\eXpressApp Framework\Tools\EasyTest)
    3. Right click the TestExecutor reference and set the "Copy Local" setting to True. This forces the TestExecutor executable to be available in the bin directory, which is where the EasyTest script will be executed further down in the process.
    4. Add an Application Configuration File (app.config) to your test project. You will need to make a few modifications to have SpecFlow generate Microsoft style unit tests. First add the section handler for SpecFlow and then set your choice of testing framework. I prefer MS Tests for my projects.
    5. Add the EasyTest configuration file to your project: add a new XML file and call it Config.xml. Open the properties window for the Config.xml file and set "Copy to Output Directory" to "Copy Always". You will set up the Config file according to the specifications of the EasyTest library, mapping to your executable and other settings. You can find the details for the configuration of EasyTest here. My file looks like this
    6. Create a new folder in your test project called "StepDefinitions". Add a new SpecFlow Step Definition file item under the StepDefinitions folder. I typically call this class StepDefinition.cs.
    7. Have your step definition inherit from the Liekhus.Testing.BDD.DevExpress.StepDefinition class. This will give you the default behaviors for your test in the next section.

    OK. Now that we have done this series of steps, we will work on simplifying this. This is an early preview of this new project and is not fully ready for consumption. If you would like to experiment with it, please feel free. Our goal is to make this an installable project on its own, with its own project templates and default settings. This will be coming in later versions. Currently this project is in Alpha release.

    Let’s write our first test
    1. Remove the basic test that is created for you. We will not use the default test but rather create our own SpecFlow "Feature" files.
    2. Add a new item to your project and select the SpecFlow Feature file item under C#.
    3. Name your feature file, as you do your class files, after the test it is performing.

    Writing a feature file uses the Cucumber syntax of Given… When… Then. Think of it in these terms: Givens are the pre-conditions for the test.
    The Whens are the actual steps for the test being performed. The Thens are the verification steps that confirm whether your test passed or failed. All of these steps are generated into an EasyTest format and executed against your XAF project. You can find more on the Cucumber syntax by using the Secret Ninja Cucumber Scrolls. This document has several good styles of tests, plus you can get your fill of Chuck Norris vs Ninjas. Pretty humorous document, but full of great content.

    My first test is going to test the entry of a new Album into the application and is outlined below. The Feature section at the top is more for your documentation purposes. Try to be descriptive of the test so that it makes sense to the next person behind you. The Scenario Outline is described in the Ninja Scrolls, but think of it as a test template. You can write one test outline and have multiple datasets (Scenarios) executed against that test. Here are the steps of my test and their descriptions:

    - Given I am starting a new test – tells our test to create a new EasyTest file
    - And (Given) the application is open – tells EasyTest to open our application defined in the Config.xml
    - When I am at the "Albums" screen – tells XAF to navigate to the Albums list view
    - And (When) I click the "New:Album" button – tells XAF to click the New Album button on the ribbon
    - And (When) I enter the following information – tells XAF to find the field on the screen and put the value in that field
    - And (When) I click the "Save and Close" button – tells XAF to click the "Save and Close" button on the detail window
    - Then I verify results as "user" – tells the testing framework to execute the EasyTest as your configured user

    Once you compile and prepare your tests you should see the following in your Test View. For each of your CreateNewAlbum lines in your scenarios, you will see a new test ready to execute. From here you will use your testing framework of choice to execute the test. This in turn will execute the EasyTest framework to call back into your XAF application and test your business application.

    Again, please remember that this is an early preview and we are still working out the details. Please let us know if you have any comments/questions/concerns. Thanks and happy testing.
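
    For readers who want to see what a binding looks like, here is a minimal sketch of a SpecFlow step definition class in C#. In the setup above these behaviors come from the Liekhus.Testing.BDD.DevExpress.StepDefinition base class, so this hand-rolled version is purely illustrative: the [Binding]/[Given]/[When]/[Then] attributes are from TechTalk.SpecFlow, while the step wording, the recorded strings and the class name are hypothetical.

        using System.Collections.Generic;
        using TechTalk.SpecFlow;

        [Binding]
        public class AlbumSteps
        {
            // Collects the scenario steps; the real base class turns them into
            // an EasyTest script and runs it through TestExecutor.
            private readonly List<string> recordedSteps = new List<string>();

            [Given(@"I am starting a new test")]
            public void GivenIAmStartingANewTest()
            {
                recordedSteps.Clear();   // start a fresh script for this scenario
            }

            [When(@"I am at the ""(.*)"" screen")]
            public void WhenIAmAtTheScreen(string screenName)
            {
                recordedSteps.Add("navigate: " + screenName);   // placeholder, not real EasyTest syntax
            }

            [When(@"I click the ""(.*)"" button")]
            public void WhenIClickTheButton(string buttonName)
            {
                recordedSteps.Add("click: " + buttonName);      // placeholder, not real EasyTest syntax
            }

            [Then(@"I verify results as ""(.*)""")]
            public void ThenIVerifyResultsAs(string user)
            {
                // Hand the recorded steps to the EasyTest executor and assert on the outcome here.
            }
        }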

    Read the article

  • /users/tags should contain scores

    - by Sean Patrick Floyd
    I am implementing some simple JavaScript/bookmarklet based apps that show some reputation info, including the score in the User's top tags (roughly based on this previous bookmarklet of mine). Now I can get a user's top tags (using the API), and I can also get the per-tag score if the user is logged in, by dynamically parsing the tag's top users page. But it costs me one AJAX request per tag and I have to download 10+k to extract a single numeric value.

    It would save a lot of traffic if the tags in <api>/users/<userid>/tags had a score field. The data seems to be there, after all the top users pages use it, so it would just be a question of exposing the data. Suggested structure:

    "tags": [
      {
        "name": {
          "description": "name of the tag",
          "values": "string",
          "optional": false,
          "suggested_buffer_size": 25
        },
        "score": {
          "description": "tag score, sum of up votes for answers on non-wiki questions",
          "values": "32-bit signed integer",
          "optional": false
        },
        "count": {
          "description": "tag count, exact meaning depends on context",
          "values": "32-bit signed integer",
          "optional": false
        },
        "restricted_to": {
          "description": "user types that can make use of this tag, lack of this field indicates it is useable by all",
          "values": "one of anonymous, unregistered, registered, or moderator",
          "optional": true
        },
        "fulfills_required": {
          "description": "indicates whether this tag is one of those that is required to be on a post",
          "values": "boolean",
          "optional": false
        },
        "user_id": {
          "description": "user associated with this tag, depends on context",
          "values": "32-bit signed integer",
          "optional": true
        }
      }
    ]

    Read the article

  • IASA Kansas City to host discussion on Google Fiber Project in Kansas City

    - by Patrick Liekhus
    One of the groups that I am currently President of (IASA Kansas City) is hosting an event by Rachel Hack (Google Community Manager) about the Google Fiber Project in Kansas City.  The event will be hosted at Balance Point’s office off 92nd and Ward Parkway on the Missouri side of the state line.  If you are interested, please check out further details here and get registered.  It is after work hours from 6 to 8 PM on the night of November 29, 2011.  It is free to attend and open to anyone who gets registered.  Come one, come all and bring your friends. Thanks

    Read the article

  • Useful SharePoint Goodies

    - by Patrick Olurotimi Ige
    I came across this list of very interesting stuff below (and it could save lots of time):

    1. Faceted Search: http://facetedsearch.codeplex.com/
    2. Podcasting Kit for SharePoint: http://pks.codeplex.com/
    3. Knowledge Base: http://spkb.codeplex.com/
    4. SharePoint Branding Tool: http://brandingtool.codeplex.com/
    5. SharePoint User Account Control: http://spuac.codeplex.com/
    6. SharePoint Enhanced Calendar: http://spenhancedcalendar.codeplex.com/
    7. Enhanced Discussion Board: http://edb.codeplex.com/
    8. Wildcard Search: http://spwildcardsearch.codeplex.com/
    9. SharePoint Usage Logging Kit: http://sulk.codeplex.com/
    10. SharePoint Zip: http://sharepointzip.codeplex.com/
    11. Facebook Kit for SharePoint: http://fks.codeplex.com/
    12. Short Messages: http://spmessaging.codeplex.com/
    13. Color coded calendar: http://planetwilson.codeplex.com/Release/ProjectReleases.aspx?ReleaseId=11814
    14. Most Popular Pages on SharePoint: http://popularpages.codeplex.com/

    Thanks to my two bits, he put the list together.

    Read the article

  • rsnapshot intervals in configuration file…

    - by Patrick
    A simple question about rsnapshot. In order to perform daily backups I'm going to add lines to cron on my Ubuntu box. Then why do I also have these lines in rsnapshot.conf?

    #########################################
    # BACKUP INTERVALS                      #
    # Must be unique and in ascending order #
    # i.e. hourly, daily, weekly, etc.      #
    #########################################
    interval    hourly    6
    interval    daily     7
    interval    weekly    4
    #interval   monthly   3

    If I use cron, should I disable them? Thanks.

    P.S. I've just realized that in the crontab I still have "hourly" and "daily". Should I then uncomment only the one I use in the crontab? And what's the point of specifying hourly if it is already specified in cron? I'm a bit confused.

    # crontab -e
    0 */4 * * *    /usr/local/bin/rsnapshot hourly
    30 23 * * *    /usr/local/bin/rsnapshot daily

    Read the article

  • DevExpress XAF, Behavior Driven Development (BDD), Domain Driven Development (DDD) and more – Introduction

    - by Patrick Liekhus
    OK. I admit it. I have been horrible at this blogging thing. However, I have made a commitment to get better at it, so here goes. I have many crazy ideas when it comes to coding and how to make my processes better, and now is the time to get them down on paper and get your feedback. Now, these ideas might not be nearly as wild and crazy as Charlie Sheen, but at least they help me get through my coding assignments.

    So let’s start by laying out the vision and objectives of this exercise. I have been trying to come up with the best set of tools, tips and practices so I can get a small team to be as productive as possible without burning out my resources. My thoughts tend to lean towards the coding practices first, as this is what I have been doing for years. However, as one looks at the process as a whole, we need to remember to keep the users in mind. If we don’t have a user to accept our application, do we really have an application in the first place?

    I have been using a commercial framework from DevExpress called eXpress Application Framework (XAF) with their eXpress Persistent Objects (XPO) behind the scenes for a few years. We have had tremendous success with it and even implemented a code generation layer to save us some time. Now we want more!!!

    My goals here are to create a technical stack that employs as many UIs as possible, while being true to the layers and documenting the process along the way. I will continue to have a series of these posts that will walk through each step as I work on it. Right now here is what I have planned:

    - Defining the solution
    - SCRUM/Agile Story Planning
    - Overview of Architectural Plan
    - Feature Driven Development
    - Domain Driven Development
    - Persistence Layer with XPO
    - Windows UI with XAF/XPO
    - Web UI with XAF/XPO
    - OData Services Layer
    - Windows Mobile UI
    - Android UI
    - iPhone UI
    - Blackberry UI
    - Excel UI
    - Outlook UI
    - Lessons Learned

    I will explain the solution that I plan to implement in the next post. Thanks again and let me know what you think.

    Read the article

  • Defining the Features we would like to see

    - by Patrick Liekhus
    OK, now that we have a very rough idea of what we are building, let’s get a list of the top features that this application needs to allow us to do. In this next list we are not prioritizing them yet, just getting on paper the high level backlog of items that this system must do:

    - Add a new task to my work queue
    - Change the status of the task
    - Print a hard copy of the task list by day for my records
    - Log a phone conversation
    - A manager should be able to assign tasks to another user
    - How do we login?
    - Change the Covey roles per user
    - Manage the statuses used
    - Manage the Covey quadrants
    - Can we make this available on the following user interfaces?
      - Windows Desktop
      - Web Browser
      - Silverlight (WPF)
      - Excel Add-in
      - Outlook Add-in
      - Android Devices
      - iPhone Devices
      - Windows Mobile Devices
      - Blackberry Devices

    While this looks like a simple spreadsheet, it can get pretty complex and busy quickly. Next time we will work on turning this into a Product Backlog and prioritizing the features we would like to see.

    Read the article

  • XAF DSL Tool Needs a new Team Lead

    - by Patrick Liekhus
    I have enjoyed my time on this project and have used it in several production projects. However, with the enhancements in Visual Studio 2010 and the Entity Framework, the DSL tool no longer makes sense for me to support. With that said, I am looking for someone who has an interest in continuing the project if they so desire. I have moved my attention to creating a new project, Entity Framework Extensions for XAF. We are converting the current DSL tool into the Entity Framework extensions. The same code generation and everything else still works; however, the visual design surface is so much easier to work with. If you have any questions, please let me know. Also, please take a moment to look at the new project. This is where all my effort going forward will be focused. Thanks again for all the support on my vision thus far, and enjoy.

    Read the article

  • What installation size should I use?

    - by Patrick
    I am installing Ubuntu with the Windows installer, however I am using Windows 8. I need to know if there will be problems installing it using Windows 8, and I need to know which installation size I should use (4 GB, 18 GB, etc.). Also, is it possible to actually install it on a USB drive? The option was available in the installation, but I was not sure whether it was safe or not. I would really appreciate some answers.

    Read the article

  • What are the benefits of closures, primarily for PHP?

    - by Patrick
    I am beginning the process of moving code over to PHP 5.3, and one of the most highly touted features of PHP 5.3 is the ability to use closures. My understanding of closures is that they allow anonymous functions, can be assigned to variable names, and have interesting scoping abilities. From my point of view the only apparent benefit in real world applications is the reduction of clutter in the namespace, because closures are anonymous. Am I wrong in this? Should I be trying to put closures wherever I code? EDIT: I have already read this post on JavaScript closures.

    Read the article

  • Best Books of C

    - by Patrick
    Hi, I really want to get strong skills in C programming and I know that the best and only way is hard work and lots of practice. Though there are so many tutorials and books available on the net about learning the C language, I'm just looking for one or two good books on C that I can learn from and build strong skills with. Does anyone know of such a great book or books for C programming, please? (Sorry for the duplication if the question already exists on the forum.) Regards!

    Read the article

  • NDepend 4 – First Steps

    - by Ricardo Peres
    Introduction
    Thanks to Patrick Smacchia I had the chance to test NDepend 4. I can only say: awesome! This will be the first of a series of posts on NDepend, where I will talk about my discoveries. Keep in mind that I am just starting to use it, so more experienced users may find these too basic; I just hope I don’t say anything foolish! I must say that I am in no way affiliated with NDepend and I never actually met Patrick.

    Installation
    No installation program - a curious decision, I’m not against it - just unzip the files to a folder and run the executable. It will optionally register itself with Visual Studio 2008, 2010 and 11 as well as RedGate’s Reflector; also, it automatically looks for updates. NDepend can either be used as a stand-alone program (with or without a GUI) or from within Visual Studio or Reflector.

    Getting Started
    One thing that really pleases me is the Getting Started section of the stand-alone program, with links to pages on NDepend’s web site featuring detailed explanations, which usually include screenshots and small videos (<5 minutes). There’s also a How do I section with hierarchical navigation that guides us through the major features so that we can easily find what we want.

    Usage
    There are two basic ways to use NDepend:

    - Analyze .NET solutions, projects or assemblies;
    - Compare two versions of the same assembly.

    I have so far not used NDepend to compare assemblies, so I will first talk about the first option. After selecting a solution and some of its projects, it generates a single HTML page with a highly detailed report of the analysis it produced. This includes metrics such as number of lines of code, IL instructions, comments, types, methods and properties, the calculation of the cyclomatic complexity, coupling and lots of other indicators, typically grouped by type, namespace and assembly. The HTML also includes some nice diagrams depicting assembly dependencies, type and method relative proportions (according to the number of IL instructions, I guess) and assembly analysis relating to abstractness and stability. Useful, I would say.

    Then there are the rules; NDepend tests the target assemblies against a set of more than 120 rules, grouped in the categories Code Quality, Object Oriented Design, Design, Architecture and Layering, Dead Code, Visibility, Naming Conventions, Source Files Organization and .NET Framework Usage. The full list can be configured in the application, and an explanation of each rule can be found on the web site. Rules can be validated, violated or violated in a critical manner, and the HTML will contain the violated rules, their queries - more on this later - and results. The HTML uses some nice JavaScript effects, which allow paging and sorting of tables, so it’s nice to use.

    Similar to the rules, there are some queries that display results for a number (about 200) of questions grouped as Object Oriented Design, API Breaking Changes (for assembly version comparison), Code Diff Summary (also for version comparison) and Dead Code. The difference between queries and rules is that queries are not classified as passed, violated or critically violated; they just present results. The queries and rules are expressed through CQLinq, which is a very powerful LINQ derivative specific to code analysis. All of the included rules and queries can be enabled or disabled, and new ones can be added, with IntelliSense to help.
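
    To give a flavor of what a CQLinq rule looks like, here is a sketch in the style of NDepend's default rules; the exact property names and thresholds should be checked against the CQLinq documentation linked in the references below.

        // Flag methods that are getting too big or too complex.
        // "warnif count > 0" turns the query into a rule that fails when any rows match.
        warnif count > 0
        from m in JustMyCode.Methods
        where m.NbLinesOfCode > 30 ||
              m.CyclomaticComplexity > 20
        select new { m, m.NbLinesOfCode, m.CyclomaticComplexity }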
    Besides the HTML report file, the NDepend application can be used to explore all analysis results, compare different versions of analysis reports and run custom queries.

    Comparison to Other Analysis Tools
    Unlike StyleCop, NDepend only works with assemblies, not source code, so you can’t expect it to be able to enforce bracket placement, for example. It is more similar to FxCop, but you don’t have the option to analyze at the IL level, other than the number of IL instructions and the complexity.

    What’s Next
    In the next days I’ll continue my exploration with a real-life test case.

    References
    The NDepend web site is http://www.ndepend.com/. Patrick keeps an updated blog at http://codebetter.com/patricksmacchia/ and he regularly monitors StackOverflow for questions tagged NDepend, which you can find at http://stackoverflow.com/questions/tagged/ndepend. The default list of CQLinq rules, queries and statistics can be found at http://www.ndepend.com/DefaultRules/webframe.html. The syntax itself is described at http://www.ndepend.com/Doc_CQLinq_Syntax.aspx and its features at http://www.ndepend.com/Doc_CQLinq_Features.aspx.

    Read the article

  • Installing APC on lighttpd + php 5.2 on Ubuntu 10

    - by Patrick
    I've found the following tutorial for installing APC on servers with lighttpd + PHP 5.2 on Ubuntu 10: http://www.assembla.com/wiki/show/socialinguatribe/Integrating_APC_Into_PHP5_And_Lighttpd However, when I run "sudo pecl install apc" the package is just downloaded and not installed (i.e. I'm not asked the next question, and the apc.ini file is not created at all). If I run only "pecl install apc" I get a warning (no permissions to write some files). Thanks

    Read the article

  • I Admit I Misspoke

    - by Patrick Liekhus
    OK. I admit it. In my last post I had mentioned that we moved the XAF DSL to the Entity Framework. This has caused a lot of confusion. I meant to say that we have used the ADO.NET Entity Data Model extensions. This is the design surface that can be tailored to create Entity Framework models. We leveraged the code generation within the ADO.NET Entity Data Model (EDMX) file to generate XAF/XPO classes. This allows you to visually create the entity model, set a few XAF properties and then generate the business objects from there. I am presenting all these topics at the Kansas City Developers Conference on June 19th and will post the presentation after the conference. I have a full presentation that will demonstrate the power of the ADO.NET Entity Data Model extensions, create a small project and then add the OData layer to XAF to connect to PowerPivot in Excel 2010. The latest code can be found at http://efxaf.codeplex.com. More details to come soon. Sorry for the confusion in the last post. Thanks again.

    Read the article

  • Designing a Content-Based ETL Process with .NET and SFDC

    - by Patrick
    As my firm makes the transition to using SFDC as our main operational system, we've spun together a couple of SFDC portals where we can post customer-specific documents to be viewed at will. As such, we've had the need for pseudo-ETL applications to be implemented that are able to extract metadata from the documents our analysts generate internally (most are industry-standard PDFs, XML, or MS Office formats) and place them in networked "queue" folders. From there, our applications scoop up the queued documents and upload them to the appropriate SFDC CRM Content Library along with some select pieces of metadata. I've mostly used DbAmp to broker communication with SFDC (DbAmp is a Linked Server provider that allows you to use SQL conventions to interact with your SFDC Org data).

    I've been able to create [console] applications in C# that work pretty well, and they're usually structured something like this:

        static void Main()
        {
            // Load parameters from app.config.

            // Get documents from queue.
            var files = someInterface.GetFiles(someFilterOrRegexPattern);

            foreach (var file in files)
            {
                // Extract metadata from the file.
                // Validate some attributes of the file; add any validation errors to an
                // in-memory structure (e.g. List<ValidationErrors>).
                if (isValid)
                {
                    var fileData = File.ReadAllBytes(file);

                    // Upload using some wrapper for an ORM or DAL.
                    someInterface.Upload(fileData, meta.Param1, meta.Param2, ...);
                }
                else
                {
                    // Bounce the file.
                }
            }

            // Report any validation errors (via message bus or SMTP or some such).
        }

    And that's pretty much it. Most of the time I wrap all these operations in a "Worker" class that takes the needed interfaces as constructor parameters. This approach has worked reasonably well, but I just get this feeling in my gut that there's something awful about it and would love some feedback. Is writing an ETL process as a C# console app a bad idea? I'm also wondering if there are some design patterns that would be useful in this scenario that I'm clearly overlooking. Thanks in advance!
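
    As a sketch of the "Worker" structure described in the question (a class that takes the needed interfaces as constructor parameters), the shape below is one way to express it; the interface and member names are hypothetical and not part of the original question.

        using System.Collections.Generic;

        // Hypothetical abstractions over the queue folder and the SFDC upload path.
        public interface IDocumentQueue
        {
            IEnumerable<string> GetFiles(string filterPattern);
            byte[] ReadAllBytes(string path);
        }

        public interface IContentUploader
        {
            void Upload(byte[] fileData, IDictionary<string, string> metadata);
        }

        public class EtlWorker
        {
            private readonly IDocumentQueue queue;
            private readonly IContentUploader uploader;

            // Dependencies come in through the constructor, so the worker can be
            // unit tested with fakes instead of a real queue folder or SFDC org.
            public EtlWorker(IDocumentQueue queue, IContentUploader uploader)
            {
                this.queue = queue;
                this.uploader = uploader;
            }

            public void Run(string filterPattern)
            {
                foreach (var file in queue.GetFiles(filterPattern))
                {
                    // Metadata extraction and validation would go here, as in the
                    // console app above; this sketch only shows the plumbing.
                    var fileData = queue.ReadAllBytes(file);
                    uploader.Upload(fileData, new Dictionary<string, string>());
                }
            }
        }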

    Read the article

  • Wolfram is out, any alternatives? Or how to go custom?

    - by Patrick
    We were originally planning on using the Wolfram Alpha API for a new project, but unfortunately the cost was entirely too high for what we were using it for. Essentially what we were doing is calculating the nutrition facts for food (http://www.wolframalpha.com/input/?i=chicken+breast+with+broccoli). Before taking the step of trying to build something that may work in its place for this use case: is there any open source code anywhere that can do this kind of analysis and compile the data? The hardest part, in my opinion, is what it has for assumptions and where it gets the data to power the calculations. Or another way to put it is, I cannot seem to wrap my head around building something that computes user input to return facts and knowledge. I know that if I can convert the user input into some standardized form, I can then compare that to a nutrition fact database to pull in the information I need. Does anyone know of any solutions to re-create this, or APIs that can provide this kind of analysis? Thanks for any advice. I am trying to figure out if this project is dead in the water before it even starts. This kind of programming is well beyond me, so I can only hope for an API, open source, or some kind of analysis engine to interpret user input when I know what kind of data they are entering (measurements and food).

    Read the article

  • Work Item Traceability in TFS 2010

    - by Sam Patrick
    I have created a Windows Forms project (VS solution) under a TFS 2010 project, and I may eventually add more solutions to the TFS project. My question: can we create a Use Case WIT for a specific solution within a TFS project? Furthermore, is it possible to create a "traceability matrix" that starts at the Use Case level and goes down to the code level (at least the namespace level) of that particular VS solution?

    Read the article

  • Error correlation ID when trying to create a BI Center site in SharePoint 2010

    - by Patrick Olurotimi Ige
    Before you get to see the template to install the BI Center site, you will have to activate the PerformancePoint Services Site Collection Features from Site Collection Administration > Site Collection Features. But after I activated it and tried to create a site, I got a correlation ID error. After looking at the ULS log files I saw something related to not being able to use SharePoint Server Publishing, so I went to the site collection features and activated the SharePoint Server Publishing Infrastructure. After that I could create a BI site.

    Read the article

  • How to install Ubiquity into a Live CD installation image?

    - by Patrick L
    I am trying to create a small Ubuntu installation ISO image using a tool called Ubuntu-Builder. To make the final ISO as small as possible, I have decided to use Ubuntu Mini Remix. It is a small Live CD without a GUI, and it does not come with any installer software like Ubiquity. I want to embed an installer into the ISO image so that the user can install it to the hard disk. In Ubuntu-Builder, I have tried the following:

    - Install the LXDE desktop, then install Ubiquity. But the final ISO boots into the command line.
    - Install the OpenBox desktop, then install Ubiquity. But the final ISO boots into the command line.
    - Do not install a DE, directly install Ubiquity. But the final ISO still boots into the command line.

    After booting up from the ISO, I checked the software in the OS. It seems that Ubiquity has been installed, but it didn't show up when I booted the ISO image. Does anyone know how to install Ubiquity into a Live CD ISO image? Or does anyone know of a text mode installer which can replace Ubiquity?

    Read the article

  • How I might think like a hacker so that I can anticipate security vulnerabilities in .NET or Java before a hacker hands me my hat [closed]

    - by Matthew Patrick Cashatt
    Premise
    I make a living developing web-based applications for all form factors (mobile, tablet, laptop, etc.). I make heavy use of SOA, and send and receive most data as JSON objects. Although most of my work is completed on the .NET or Java stacks, I am also recently delving into Node.js. This new stack has got me thinking that I know reasonably well how to secure applications using known facilities of .NET and Java, but I am woefully ignorant when it comes to best practices or, more importantly, the driving motivation behind the best practices. You see, as I gain more prominent clientele, I need to be able to assure them that their applications are secure and, in order to do that, I feel that I should learn to think like a malevolent hacker.

    1. What motivates a malevolent hacker: what is their prime mover? What is it that they are most after? Ultimately, the answer is money or notoriety I am sure, but I think it would be good to understand the nuanced motivators that lead to those ends: credit card numbers, damning information, corporate espionage, shutting down a highly visible site, etc.
    2. As an extension of question #1, but more specific: what are the things most likely to be sought out by a hacker in almost any application? Passwords? Financial info? Profile data that will gain them access to other applications a user has joined? Let me be clear here. This is not judgement for or against the aforementioned motivations, because that is not the goal of this post. I simply want to know what motivates a hacker regardless of our individual judgement.
    3. What are some heuristics followed to accomplish hacker goals? Ultimately specific processes would be great to know; however, in order to think like a hacker, I would really value your comments on the broader heuristics followed. For example: "A hacker always looks first for the low-hanging fruit such as HTTP spoofing" or "In the absence of a CAPTCHA or other deterrent, a hacker will likely run a cracking script against a login prompt and then go from there." Possibly, "A hacker will try to attack a site via Foo (browser) first, as it is known for the Bar vulnerability."
    4. What are the most common hacks employed when following the common heuristics? Specifics here: HTTP spoofing, password cracking, SQL injection, etc.

    Disclaimer
    I am not a hacker, nor am I judging hackers (heck, I even respect their ingenuity). I simply want to learn how I might think like a hacker so that I may begin to anticipate vulnerabilities before .NET or Java hands me a way to defend against them after the fact.

    Read the article

  • SharePoint 2010 Launch

    - by Patrick Olurotimi Ige
    Some great news for SharePoint developers, architects, consultants, etc.: May 12, 2010 is the official release date for SharePoint 2010 and Office 2010. Also, Microsoft announced their intent to RTM (Release to Manufacturing) in April 2010. Read more here.

    Read the article

  • rsnapshot for remote backups...

    - by Patrick
    I want to use rsnapshot to make backups from my production server to a remote backup server. I should install rsnapshot on the backup server rather than the production one, right? rsnapshot is going to pull the files to back up from the production server and store them locally on the backup server? I've just realized that I don't have sudo privileges on the backup server. Does this mean I cannot use rsnapshot for remote backups? Thanks

    Read the article

  • Is it just me or is this a baffling tech interview question

    - by Matthew Patrick Cashatt
    Background
    I was just asked in a tech interview to write an algorithm to traverse an "object" (notice the quotes) where A is equal to B, B is equal to C, and A is equal to C. That's it. That is all the information I was given. I asked the interviewer what the goal was, but apparently there wasn't one, just "traverse" the "object". I don't know about anyone else, but this seems like a silly question to me. I asked again, "Am I searching for a value?" Nope. Just "traverse" it. Why would I ever want to endlessly loop through this "object"?? To melt my processor maybe?? The answer according to the interviewer was that I should have written a recursive function. OK, so why not simply ask me to write a recursive function? And who would write a recursive function that never ends?

    My question: Is this a valid question to the rest of you and, if so, can you provide a hint as to what I might be missing? Perhaps I am thinking too hard about solving real world problems. I have been successfully coding for a long time, but this tech interview process makes me feel like I don't know anything.

    Final Answer: CLOWN TRAVERSAL!!! (See @Matt's answer below.)

    Thanks!
    Matt

    Read the article
