Search Results

Search found 927 results on 38 pages for 'patrick moriarty'.

Page 4/38

  • PayPal IPN sends two different notifications

    - by Patrick
    I've come across something a bit strange that I was hoping someone with more experience with PayPal, specifically the IPN feature, could explain. It seems I'm getting two very different hits to my IPN listener. The first one always fails; the second one passes. Now, I know PayPal tends to send duplicates, but what I've noticed is two very different $_POST arrays being received. Here are the responses:
    [2014-06-08 23:51:19] RAW POST DATA : Array ( [transaction] => Array ( [0] => ILS 20.00 ) [payment_request_date] => Sun Jun 08 13:52:12 PDT 2014 [return_url] => MY_URL [fees_payer] => EACHRECEIVER [ipn_notification_url] => MY_URL [sender_email] => patrick[email protected] //fake email [verify_sign] => ANp5TpLat3.2ylx.cECtVZ..5HejAsVcs05tdVC7RldmeYNJ91SKaqFJ [test_ipn] => 1 [cancel_url] => MY_URL [pay_key] => AP-04B74091M7083584A [action_type] => PAY [transaction_type] => Adaptive Payment PAY [tracking_id] => 13 // This is a number I passed, but it doesn't exist in the 2nd POST [status] => COMPLETED [log_default_shipping_address_in_transaction] => false [charset] => windows-1252 [notify_version] => UNVERSIONED [reverse_all_parallel_payments_on_error] => false )
    [2014-06-08 23:51:19] RAW POST DATA : Array ( [transaction_subject] => [payment_date] => 13:52:28 Jun 08, 2014 PDT [txn_type] => web_accept [last_name] => test [residence_country] => US [item_name] => .... (this continues for quite a bit more) .... [payment_fee] => [mc_fee] => 1.78 [mc_gross] => 20.00 [custom] => [charset] => windows-1252 [notify_version] => 3.8 [ipn_track_id] => f93ce8bdd4382 )
    My problem: the first IPN, with the juicy tracking_id, fails; the second IPN is verified, but once the IPN is verified I no longer have access to the tracking_id. My questions: Why does PayPal send two different IPNs? Why are they different? And why isn't any of this documented by PayPal? :(

    Read the article

  • Session state has been disabled for ASP.NET. The Report Viewer control requires that session state be enabled

    - by Patrick Olurotimi Ige
    While I was trying to set up some reports on an SPF 2010 site, after adding a SQL Reporting web part I got the error: Session state has been disabled for ASP.NET. The Report Viewer control requires that session state be enabled in local mode. I never came across that error before when using RS web parts in SharePoint 2007 :) But I think it is related to ASP.NET session state :( Anyway, to fix it you have to make sure the SharePoint Server ASP.NET Session State Service is enabled. Unfortunately, in SPF 2010 and SP 2010 you have to do this using PowerShell, by running Enable-SPSessionStateService -DefaultProvision:
    PS C:\> Get-PSSnapin -- shows you all the PS snapins
    PS C:\> Add-PSSnapin "Microsoft.SharePoint.PowerShell" -- adds the SharePoint PS snapin
    PS C:\> Get-Command -Noun SP* -- shows all SP cmdlets
    PS C:\> Add-SPShellAdmin -- (if you get an error regarding access to the farm, use this command to add an account that is able to run shell admin commands)
    PS C:\> Get-Help Enable-SPSessionStateService -- shows help for Enable-SPSessionStateService
    PS C:\> Enable-SPSessionStateService -DefaultProvision -- (enables/activates the SPSessionStateService)
    You have more options available when you run Get-Help Enable-SPSessionStateService. After running the cmdlet you should see the SharePoint Server ASP.NET Session State Service started in your service applications in the Central Admin site, and of course you should be able to add your RS web parts. Enjoy

    Read the article

  • ATI graphics card overriding onboard Intel graphics

    - by Patrick
    I have an ATI graphics card with an AGP connection and a DVI port, and an Intel graphics processor on the motherboard with a VGA port. I have two monitors. When the graphics card is unplugged, I get this from lspci:
        00:00.0 Host bridge: Intel Corporation 82G33/G31/P35/P31 Express DRAM Controller (rev 10)
        00:02.0 VGA compatible controller: Intel Corporation 82G33/G31 Express Integrated Graphics Controller (rev 10)
    When it is attached, I get this:
        00:00.0 Host bridge: Intel Corporation 82G33/G31/P35/P31 Express DRAM Controller (rev 10)
        01:00.0 VGA compatible controller: ATI Technologies Inc RV380 [Radeon X600 (PCIE)]
        01:00.1 Display controller: ATI Technologies Inc RV380 [Radeon X600]
    Note the absence of 00:02.0 - this is apparently causing the VGA monitor to say "No Signal" and only the DVI display to show up in the list. I checked for problems in xorg.conf... there isn't one. I have no idea how to build one.

    Read the article

  • This web part does not have a valid XSLT stylesheet: There is no XSL property available for presenting the data.

    - by Patrick Olurotimi Ige
    I have been thinking for a while about how I can reuse my code when building custom data view web parts in SharePoint Designer 2010. So I decided to use XslLink, which is one of the properties available when you edit a SharePoint web part. I started by creating an XSL file that I could reuse, but after adding the link to the file like so: <XslLink>sites/server/mycustomtemplate.xsl</XslLink> I got the error: This web part does not have a valid XSLT stylesheet: There is no XSL property available for presenting the data. After some debugging I noticed that the relative path in the link to the XSL style sheet was getting broken, so I changed it to the full URL http://mysite/sites/server/mycustomtemplate.xsl and it works. Enjoy

    Read the article

  • Error: The base type 'System.Web.UI.MasterPage' is not allowed for this page

    - by Patrick Olurotimi Ige
    I came across this error when I was trying to AJAX-ify my SharePoint site, after adding AjaxifyMoss from CodePlex, developed by Richard Finn. When I tried loading my site I got the error: The base type 'System.Web.UI.MasterPage' is not allowed for this page. So I decided to check the web.config, and I noticed the SafeControl section didn't have the .NET 2.0 assembly included, despite the fact that I had added both versions 2.0 and 3.5. It's possible the .NET 3.5 assembly entry overwrote the 2.0 one. Anyway, after I added the entry below, which is the 2.0 version, and refreshed my page, it worked:
        <SafeControl Assembly="System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" Namespace="System.Web.UI" TypeName="*" Safe="True" AllowRemoteDesigner="True" />

    Read the article

  • To ORM or Not to ORM. That is the question…

    - by Patrick Liekhus
    UPDATE: Thanks for the feedback and comments. I have adjusted my table below with your recommendations; I had missed a point or two. I wanted to do a series on creating an entire project using the EDMX XAF code generation and the SpecFlow BDD Easy Test tools discussed in my earlier posts, but I thought it would be appropriate to start with a simple comparison and my reasoning for choosing these tools. Let's start by defining the term ORM, or Object-Relational Mapping. According to Wikipedia it is defined as follows: Object-relational mapping (ORM, O/RM, and O/R mapping) in computer software is a programming technique for converting data between incompatible type systems in object-oriented programming languages. This creates, in effect, a "virtual object database" that can be used from within the programming language. Why should you care? Basically it allows you to map your business objects in code to the persistence layer behind them. And better yet, why would you want to do this? Let me outline it in the following points:
    Development speed. No more repetitive work mapping query results to object members. Once the map is created, the code is rendered for you.
    Persistence portability. The ORM knows how to map SQL-specific syntax for the persistence engine you choose. It does not matter if it is SQL Server, Oracle or another database of your choosing.
    Standard/boilerplate code is simplified. The basic CRUD operations are consistent and can use database metadata for basic operations.
    So how does this help? Well, let's compare some of the ORM tools that I have used and/or researched. I have been interested in ORM for some time now. My ORM of choice for a long time was NHibernate, and I still believe it has a strong case in some business situations. However, you have to take business considerations into account, along with the law of diminishing returns. Because of these two factors, my recent activity and experience has been around DevExpress eXpress Persistent Objects (XPO). The primary reason is that they have the DevExpress eXpress Application Framework (XAF) that sits on top of XPO. With this added value, the data model can be created (either database first or code first) and the Web and Windows clients can be created from these maps. While out of the box they provide some simple list and detail screens, you can easily extend and modify these to your liking. DevExpress has done a tremendous job of providing enough framework while also staying out of the way when you need to extend it. This sounds worse than it really is. What I mean is that if you choose to follow the DevExpress coding style and recommendations, the hooks and extension points provided allow you to do some pretty heavy lifting while not worrying about the basics. I have put together a list of the top features that I have used to compare the limited list of ORMs that I have exposure to. Again, the biggest selling point in my opinion is that XPO is just as solid as any of the other ORMs, but with the added layer of XAF they become unstoppable. Couple that with the EDMX modeling tools and code generation, and it becomes a no-brainer.
    Designer Features | Entity Framework | NHibernate | Fluent w/ NHibernate | Telerik OpenAccess | DevExpress XPO | DevExpress XPO/XAF plus Liekhus Tools
    Uses XML to map relationships | - | Yes | - | - | - |
    Visual class designer interface | Yes | - | - | - | - | Yes
    Management integrated w/ Visual Studio | Yes | - | - | Yes | - | Yes
    Supports schema first approach | Yes | - | - | Yes | - | Yes
    Supports model first approach | Yes | - | - | Yes | Yes | Yes
    Supports code first approach | Yes | Yes | Yes | Yes | Yes | Yes
    Attribute driven coding style | Yes | - | Yes | - | Yes | Yes

    I have a very small team and limited resources with a lot of responsibilities. In order to keep up with our customers, we must rely on tools like these. We use the EDMX tool so that we can create a visual representation of the applications with our customers. Second, we rely on the code generation so that we can focus on the business problems at hand and not on whether a field is mapped correctly. This keeps us from requiring as many junior-level developers on our team. I have also worked on multiple teams that believed in writing their own "framework". In my experience and opinion, this is not the route to take unless you have a team dedicated to supporting just the framework. Each time I have worked on a custom framework, the framework eventually becomes old, outdated and full of "performance" enhancements specific to one or two requirements. With an ORM, there are a lot of people smarter than me working on the bigger issue of persistence and performance. Again, my recommendation would be to use an available framework and get to work on your business domain problems. If your coding is not making money for you, why are you working on it? Do you really need to be writing query-to-object-member code again and again? Thanks
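    To make the "code first" row above concrete, here is a minimal sketch of what a persistent business object can look like in DevExpress XPO. The class and property names (Album, Title, ReleaseDate) are made up for illustration, and the exact attributes you need depend on your XPO/XAF version; the XPObject base class and the SetPropertyValue pattern are the usual XPO conventions as I understand them.

        using System;
        using DevExpress.Xpo;

        // Hypothetical domain class; XPO maps it to a table by convention.
        public class Album : XPObject
        {
            public Album(Session session) : base(session) { }

            private string title;
            [Size(100)] // limit the mapped column length
            public string Title
            {
                get { return title; }
                set { SetPropertyValue("Title", ref title, value); }
            }

            private DateTime releaseDate;
            public DateTime ReleaseDate
            {
                get { return releaseDate; }
                set { SetPropertyValue("ReleaseDate", ref releaseDate, value); }
            }
        }

    Saving an instance is then a matter of creating it inside a UnitOfWork (or Session) and committing the changes, which is exactly the boilerplate the table is comparing across tools.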

    Read the article

  • Setup and Use SpecFlow BDD with DevExpress XAF

    - by Patrick Liekhus
    Let's get started with using the SpecFlow BDD syntax for writing tests with the DevExpress XAF EasyTest scripting syntax. In order for this to work you will need to download and install the prerequisites listed below. Once they are installed, follow the steps outlined below and enjoy.
    Prerequisites - Install the following items: DevExpress eXpress Application Framework (XAF) found here; SpecFlow found here; Liekhus BDD/XAF Testing library found here.
    Assumptions - I am going to assume at this point that you have created your XAF application and have your Module, Win.Module and Win projects ready for use. You should also have set any attributes and/or settings as you see fit.
    Setup - So, where to start?
    Create a new testing project within your solution. I typically name this with a naming convention similar to the one used by XAF: my project name plus .FunctionalTests (i.e. AlbumManager.FunctionalTests).
    Add the following references to your project. It should look like the reference list below:
        DevExpress.Data.v11.x
        DevExpress.Persistent.Base.v11.x
        DevExpress.Persistent.BaseImpl.v11.x
        DevExpress.Xpo.v11.2
        Liekhus.Testing.BDD.Core
        Liekhus.Testing.BDD.DevExpress
        TechTalk.SpecFlow
        TestExecutor.v11.x (found in %Program Files%\DevExpress 2011.x\eXpressApp Framework\Tools\EasyTest)
    Right-click the TestExecutor reference and set the "Copy Local" setting to True. This forces the TestExecutor executable to be available in the bin directory, which is where the EasyTest script will be executed further down in the process.
    Add an Application Configuration File (app.config) to your test application. You will need to make a few modifications to have SpecFlow generate Microsoft-style unit tests. First add the section handler for SpecFlow and then set your choice of testing framework. I prefer MS Test for my projects.
    Add the EasyTest configuration file to your project: add a new XML file and call it Config.xml.
    Open the properties window for the Config.xml file and set "Copy to Output Directory" to "Copy Always".
    You set up the Config file according to the specifications of the EasyTest library by mapping it to your executable and other settings. You can find the details for the configuration of EasyTest here. My file looks like this.
    Create a new folder in your test project called "StepDefinitions". Add a new SpecFlow Step Definition file item under the StepDefinitions folder. I typically call this class StepDefinition.cs.
    Have your step definition class inherit from the Liekhus.Testing.BDD.DevExpress.StepDefinition class. This will give you the default behaviors for your tests in the next section.
    OK. Now that we have done this series of steps, we will work on simplifying it. This is an early preview of this new project and it is not fully ready for consumption. If you would like to experiment with it, please feel free. Our goal is to make this an installable project on its own, with its own project templates and default settings. This will be coming in later versions. Currently this project is an alpha release.
    Let's write our first test. Remove the basic test that is created for you; we will not use the default test but rather create our own SpecFlow "Feature" files. Add a new item to your project and select the SpecFlow Feature file item under C#. Name your feature files, as you do your class files, after the test they are performing. Writing a feature file uses the Cucumber syntax of Given… When… Then. Think of it in these terms: the Givens are the pre-conditions for the test.
    The Whens are the actual steps for the test being performed. The Thens are the verification steps that confirm whether your test passed or failed. All of these steps are generated into an EasyTest format and executed against your XAF project. You can find more on the Cucumber syntax by using the Secret Ninja Cucumber Scrolls. That document has several good styles of tests, plus you can get your fill of Chuck Norris vs. Ninjas. A pretty humorous document, but full of great content.
    My first test is going to test the entry of a new Album into the application and is outlined below. The Feature section at the top is more for your documentation purposes. Try to be descriptive of the test so that it makes sense to the next person behind you. The Scenario Outline is described in the Ninja Scrolls, but think of it as a test template: you can write one test outline and have multiple data sets (Scenarios) executed against that test. Here are the steps of my test and their descriptions:
    Given I am starting a new test – tells our test to create a new EasyTest file
    And (Given) the application is open – tells EasyTest to open our application defined in the Config.xml
    When I am at the "Albums" screen – tells XAF to navigate to the Albums list view
    And (When) I click the "New:Album" button – tells XAF to click the New Album button on the ribbon
    And (When) I enter the following information – tells XAF to find each field on the screen and put the value in that field
    And (When) I click the "Save and Close" button – tells XAF to click the "Save and Close" button on the detail window
    Then I verify results as "user" – tells the testing framework to execute the EasyTest as your configured user
    Once you compile and prepare your tests you should see the following in your Test View: for each of the CreateNewAlbum lines in your scenarios, you will see a new test ready to execute. From here you will use your testing framework of choice to execute the test. This in turn will execute the EasyTest framework to call back into your XAF application and test your business application.
    Again, please remember that this is an early preview and we are still working out the details. Please let us know if you have any comments/questions/concerns. Thanks and happy testing.
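    For readers who have not used SpecFlow before, here is a rough sketch of what the step bindings behind a feature like the one above generally look like. In the article, the Liekhus.Testing.BDD.DevExpress.StepDefinition base class supplies the real implementations that emit the EasyTest script; the class name, regexes and method bodies below are purely illustrative, while the [Binding]/[Given]/[When]/[Then] attributes and the Table type come from TechTalk.SpecFlow.

        using TechTalk.SpecFlow;

        [Binding]
        public class AlbumSteps
        {
            [Given(@"I am starting a new test")]
            public void GivenIAmStartingANewTest()
            {
                // Illustrative placeholder: start a new EasyTest script.
            }

            [When(@"I am at the ""(.*)"" screen")]
            public void WhenIAmAtTheScreen(string screenName)
            {
                // Illustrative placeholder: emit a navigation action for screenName.
            }

            [When(@"I enter the following information")]
            public void WhenIEnterTheFollowingInformation(Table fields)
            {
                // Illustrative placeholder: emit one fill-field action per table row.
            }

            [Then(@"I verify results as ""(.*)""")]
            public void ThenIVerifyResultsAs(string user)
            {
                // Illustrative placeholder: run TestExecutor against the generated script.
            }
        }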

    Read the article

  • /users/tags should contain scores

    - by Sean Patrick Floyd
    I am implementing some simple JavaScript/bookmarklet based apps that show some reputation info, including the score in the User's top tags (roughly based on this previous bookmarklet of mine). Now I can get a user's top tags (using the API), and I can also get the per-tag score if the user is logged in, by dynamically parsing the tag's top users page. But it costs me one AJAX request per tag and I have to download 10+k to extract a single numeric value. It would save a lot of traffic if the tags in <api>/users/<userid>/tags had a score field. The data seems to be there, after all the top users pages use it, so it would just be a question of exposing the data. Suggested structure: "tags": [ { "name": { "description": "name of the tag", "values": "string", "optional": false, "suggested_buffer_size": 25 }, "score": { "description": "tag score, sum of up votes for answers on non-wiki questions", "values": "32-bit signed integer", "optional": false }, "count": { "description": "tag count, exact meaning depends on context", "values": "32-bit signed integer", "optional": false }, "restricted_to": { "description": "user types that can make use of this tag, lack of this field indicates it is useable by all", "values": "one of anonymous, unregistered, registered, or moderator", "optional": true }, "fulfills_required": { "description": "indicates whether this tag is one of those that is required to be on a post", "values": "boolean", "optional": false }, "user_id": { "description": "user associated with this tag, depends on context", "values": "32-bit signed integer", "optional": true } } ]

    Read the article

  • IASA Kansas City to host discussion on Google Fiber Project in Kansas City

    - by Patrick Liekhus
    One of the groups that I am currently President of (IASA Kansas City) is hosting an event by Rachel Hack (Google Community Manager) about the Google Fiber Project in Kansas City.  The event will be hosted at Balance Point’s office off 92nd and Ward Parkway on the Missouri side of the state line.  If you are interested, please check out further details here and get registered.  It is after work hours from 6 to 8 PM on the night of November 29, 2011.  It is free to attend and open to anyone who gets registered.  Come one, come all and bring your friends. Thanks

    Read the article

  • Useful Sharepoint Goodies

    - by Patrick Olurotimi Ige
    I came across this list of very interesting stuff below (and it could save lots of time):
    1. Faceted Search: http://facetedsearch.codeplex.com/
    2. Podcasting Kit for SharePoint: http://pks.codeplex.com/
    3. Knowledge Base: http://spkb.codeplex.com/
    4. SharePoint Branding Tool: http://brandingtool.codeplex.com/
    5. SharePoint User Account Control: http://spuac.codeplex.com/
    6. SharePoint Enhanced Calendar: http://spenhancedcalendar.codeplex.com/
    7. Enhanced Discussion Board: http://edb.codeplex.com/
    8. Wildcard Search: http://spwildcardsearch.codeplex.com/
    9. SharePoint Usage Logging Kit: http://sulk.codeplex.com/
    10. SharePoint Zip: http://sharepointzip.codeplex.com/
    11. Facebook Kit for SharePoint: http://fks.codeplex.com/
    12. Short Messages: http://spmessaging.codeplex.com/
    13. Color coded calendar: http://planetwilson.codeplex.com/Release/ProjectReleases.aspx?ReleaseId=11814
    14. Most Popular Pages on SharePoint: http://popularpages.codeplex.com/
    Thanks to My Two Bits, who put the list together.

    Read the article

  • rsnapshot intervals in configuration file…

    - by Patrick
    A simple question about rsnapshot. In order to perform daily backups I'm going to add lines to cron on my Ubuntu machine. Then why do I also have these lines in rsnapshot.conf?
        #########################################
        # BACKUP INTERVALS #
        # Must be unique and in ascending order #
        # i.e. hourly, daily, weekly, etc. #
        #########################################
        interval hourly 6
        interval daily 7
        interval weekly 4
        #interval monthly 3
    If I use cron, should I disable them? Thanks.
    P.S. I've just realized that in the crontab I still have "hourly" and "daily". Should I then uncomment only the ones I use in the crontab? And what's the point of specifying hourly if it is already specified in cron? I'm a bit confused.
        # crontab -e
        0 */4 * * * /usr/local/bin/rsnapshot hourly
        30 23 * * * /usr/local/bin/rsnapshot daily

    Read the article

  • DevExpress XAF, Behavior Driven Development (BDD), Domain Driven Development (DDD) and more – Introduction

    - by Patrick Liekhus
    OK. I admit it. I have been horrible at this blogging thing. However, I have made a commitment to get better at it, so here goes. I have many crazy ideas when it comes to coding and how to make my processes better, and now is the time to get them down on paper and get your feedback. Now, these ideas might not be nearly as wild and crazy as Charlie Sheen, but at least they help me get through my coding assignments.
    So let's start by laying out the vision and objectives of this exercise. I have been trying to come up with the best set of tools, tips and practices so I can get a small team to be as productive as possible without burning out my resources. My thoughts tend to lean towards the coding practices first, as this is what I have been doing for years. However, as one looks at the process as a whole, we need to remember to keep the users in mind. If we don't have a user to accept our application, do we really have an application in the first place?
    I have been using a commercial framework from DevExpress called eXpress Application Framework (XAF), with their eXpress Persistent Objects (XPO) behind the scenes, for a few years. We have had tremendous success with it and even implemented a code generation layer to save us some time. Now we want more!!!
    My goals here are to create a technical stack that employs as many UIs as possible, while being true to the layers and documenting the process along the way. I will continue this series of posts as I work through each step. Right now here is what I have planned:
    Defining the solution
    SCRUM/Agile Story Planning
    Overview of Architectural Plan
    Feature Driven Development
    Domain Driven Development
    Persistence Layer with XPO
    Windows UI with XAF/XPO
    Web UI with XAF/XPO
    OData Services Layer
    Windows Mobile UI
    Android UI
    iPhone UI
    Blackberry UI
    Excel UI
    Outlook UI
    Lessons Learned
    I will explain the solution that I plan to implement in the next post. Thanks again and let me know what you think.

    Read the article

  • Defining the Features we would like to see

    - by Patrick Liekhus
    OK, now that we have a very rough idea of what we are building, let's get a list of the top features that this application needs to provide. In this next list we are not prioritizing them yet, just getting on paper the high-level backlog of items that this system must do.
    Add a new task to my work queue
    Change the status of the task
    Print a hard copy of the task list by day for my records
    Log a phone conversation
    A manager should be able to assign tasks to another user
    How do we log in?
    Change the Covey roles per user
    Manage the statuses used
    Manage the Covey quadrants
    Can we make this available on the following user interfaces?
    Windows Desktop
    Web Browser
    Silverlight (WPF)
    Excel Add-in
    Outlook Add-in
    Android Devices
    iPhone Devices
    Windows Mobile Devices
    Blackberry Devices
    While this looks like a simple spreadsheet, it can get pretty complex and busy quickly. Next time we will work on making this into a Product Backlog and prioritizing the features we would like to see.

    Read the article

  • XAF DSL Tool Needs a new Team Lead

    - by Patrick Liekhus
    I have enjoyed my time on this project and have used it in several production projects. However, with the enhancements in Visual Studio 2010 and the Entity Framework, the DSL tool doesn't make sense for me to support at this time. With that said, I am looking for someone who is interested in continuing the project if they so desire. I have moved my attention to creating a new project, Entity Framework Extensions for XAF. We are converting the current DSL tool into the Entity Framework extensions. The same code generation and everything else still works; however, the visual design surface is so much easier to work with. If you have any questions, please let me know. Also, please take a moment to look at the new project; this is where all my effort going forward will be focused. Thanks again for all the support on my vision thus far, and enjoy.

    Read the article

  • What installation size should I use?

    - by Patrick
    I am installing Ubuntu with the Windows installer; however, I am using Windows 8. I need to know if there will be problems installing it from Windows 8, and I need to know which installation size I should use (4 GB, 18 GB, etc.). Also, is it possible to actually install it on a USB drive? The option was available in the installer, but I was not sure whether it was safe or not. I would really appreciate some answers.

    Read the article

  • What are the benefits of closures, primarily for PHP?

    - by Patrick
    I am beginning the process of moving code over to PHP 5.3, and one of the most highly touted features of PHP 5.3 is the ability to use closures. My understanding of closures is that they allow anonymous functions, can be assigned to variable names, and have interesting scoping abilities. From my point of view, the only apparent benefit in real-world applications is the reduction of clutter in the namespace, because closures are anonymous. Am I wrong in this? Should I be trying to use closures wherever I code? EDIT: I have already read this post on JavaScript closures.
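    The question is about PHP, but the scoping benefit is easiest to see with a small language-neutral sketch. Below is the same idea expressed in C# (the language used elsewhere on this page): a factory returns an anonymous function that captures a local variable, so the state lives with the function rather than with a named global helper.

        using System;

        class ClosureDemo
        {
            // Returns a function that remembers (closes over) the local 'count' variable.
            static Func<int> MakeCounter()
            {
                int count = 0;
                return () => ++count; // no named helper or global state needed
            }

            static void Main()
            {
                var next = MakeCounter();
                Console.WriteLine(next()); // 1
                Console.WriteLine(next()); // 2
            }
        }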

    Read the article

  • Best Books on C

    - by Patrick
    Hi, I really want to develop strong skills in C programming, and I know that the best and only way is hard work and lots of practice. However, I have found so many tutorials and books about learning the C language available on the net. I'm just looking for one or two good books on C that I can learn from and build strong C skills with. Does anyone know of such a great book or books for C programming, please? (Sorry for duplication if the question already exists in the forum.) Regards!

    Read the article

  • NDepend 4 – First Steps

    - by Ricardo Peres
    Introduction
    Thanks to Patrick Smacchia I had the chance to test NDepend 4. I can only say: awesome! This will be the first of a series of posts on NDepend, where I will talk about my discoveries. Keep in mind that I am just starting to use it, so more experienced users may find these posts too basic; I just hope I don't say anything foolish! I must say that I am in no way affiliated with NDepend and I never actually met Patrick.
    Installation
    No installation program (a curious decision; I'm not against it) - just unzip the files to a folder and run the executable. It will optionally register itself with Visual Studio 2008, 2010 and 11 as well as RedGate's Reflector; it also automatically looks for updates. NDepend can either be used as a stand-alone program (with or without a GUI) or from within Visual Studio or Reflector.
    Getting Started
    One thing that really pleases me is the Getting Started section of the stand-alone program, with links to pages on NDepend's web site featuring detailed explanations, which usually include screenshots and small videos (<5 minutes). There's also a How do I section with hierarchical navigation that guides us through the major features so that we can easily find what we want.
    Usage
    There are two basic ways to use NDepend: analyze .NET solutions, projects or assemblies; or compare two versions of the same assembly. I have so far not used NDepend to compare assemblies, so I will first talk about the first option. After selecting a solution and some of its projects, it generates a single HTML page with a highly detailed report of the analysis it produced. This includes metrics such as the number of lines of code, IL instructions, comments, types, methods and properties, the calculation of cyclomatic complexity, coupling and lots of other indicators, typically grouped by type, namespace and assembly. The HTML also includes some nice diagrams depicting assembly dependencies, type and method relative proportions (according to the number of IL instructions, I guess) and assembly analysis relating to abstractness and stability. Useful, I would say.
    Then there are the rules; NDepend tests the target assemblies against a set of more than 120 rules, grouped in the categories Code Quality, Object Oriented Design, Design, Architecture and Layering, Dead Code, Visibility, Naming Conventions, Source Files Organization and .NET Framework Usage. The full list can be configured in the application, and an explanation of each rule can be found on the web site. Rules can be validated, violated or violated in a critical manner, and the HTML will contain the violated rules, their queries - more on this later - and results. The HTML uses some nice JavaScript effects, which allow paging and sorting of tables, so it's nice to use.
    Similar to the rules, there are queries that display results for a number of questions (about 200), grouped as Object Oriented Design, API Breaking Changes (for assembly version comparison), Code Diff Summary (also for version comparison) and Dead Code. The difference between queries and rules is that queries are not classified as passed, violated or critically violated; they just present results. The queries and rules are expressed through CQLinq, a very powerful LINQ derivative specific to code analysis. All of the included rules and queries can be enabled or disabled, and new ones can be added, with IntelliSense to help.
    Besides the HTML report file, the NDepend application can be used to explore all the analysis results, compare different versions of analysis reports and run custom queries.
    Comparison to Other Analysis Tools
    Unlike StyleCop, NDepend only works with assemblies, not source code, so you can't expect it to enforce bracket placement, for example. It is more similar to FxCop, but you don't have the option to analyze at the IL level - that is, other than the number of IL instructions and the complexity.
    What's Next
    In the next few days I'll continue my exploration with a real-life test case.
    References
    The NDepend web site is http://www.ndepend.com/. Patrick keeps an updated blog at http://codebetter.com/patricksmacchia/ and he regularly monitors StackOverflow for questions tagged NDepend, which you can find at http://stackoverflow.com/questions/tagged/ndepend. The default list of CQLinq rules, queries and statistics can be found at http://www.ndepend.com/DefaultRules/webframe.html. The syntax itself is described at http://www.ndepend.com/Doc_CQLinq_Syntax.aspx and its features at http://www.ndepend.com/Doc_CQLinq_Features.aspx.
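    To give a flavour of CQLinq, here is a small illustrative rule. The threshold and the exact metrics selected are my own choices for the example, but the warnif count > 0 prefix and the LINQ-style from/where/select over JustMyCode.Methods follow the CQLinq syntax documented at the links above.

        // <Name>Avoid overly complex methods (illustrative)</Name>
        warnif count > 0
        from m in JustMyCode.Methods
        where m.CyclomaticComplexity > 15
        orderby m.CyclomaticComplexity descending
        select new { m, m.CyclomaticComplexity, m.NbLinesOfCode }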

    Read the article

  • Installing APC on lighttpd + php 5.2 on Ubuntu 10

    - by Patrick
    I've found the following tutorial for installing APC on servers with lighttpd + PHP 5.2 on Ubuntu 10: http://www.assembla.com/wiki/show/socialinguatribe/Integrating_APC_Into_PHP5_And_Lighttpd However, when I run "sudo pecl install apc" the package is just downloaded and is not installed (i.e. I'm not asked the next question) and the apc.ini file is not created at all. If I run only "pecl install apc" I get a warning (no permission to write some files). Thanks

    Read the article

  • I Admit I Misspoke

    - by Patrick Liekhus
    OK. I admit it. In the last post I mentioned that we had moved the XAF DSL to the Entity Framework. This has caused a lot of confusion. I meant to say that we have used the ADO.NET Entity Data Model extensions. This is the design surface that can be tailored to create Entity Framework models. We leveraged the code generation within the ADO.NET Entity Data Model (EDMX) file to generate XAF/XPO classes. This allows you to visually create the entity model, set a few XAF properties and then generate the business objects from there. I am presenting all these topics at the Kansas City Developers Conference on June 19th; I will post the presentation after the conference. I have a full presentation that will demonstrate the power of the ADO.NET Entity Data Model extensions, create a small project and then add the OData layer to XAF to connect to PowerPivot in Excel 2010. The latest code can be found at http://efxaf.codeplex.com. More details to come soon. Sorry for the confusion in the last post. Thanks again.

    Read the article

  • Designing a Content-Based ETL Process with .NET and SFDC

    - by Patrick
    As my firm makes the transition to using SFDC as our main operational system, we've spun together a couple of SFDC portals where we can post customer-specific documents to be viewed at will. As such, we've needed pseudo-ETL applications that are able to extract metadata from the documents our analysts generate internally (most are industry-standard PDFs, XML, or MS Office formats) and place them in networked "queue" folders. From there, our applications scoop up the queued documents and upload them to the appropriate SFDC CRM Content Library along with some select pieces of metadata. I've mostly used DbAmp to broker communication with SFDC (DbAmp is a Linked Server provider that allows you to use SQL conventions to interact with your SFDC Org data). I've been able to create [console] applications in C# that work pretty well, and they're usually structured something like this:
        static void Main()
        {
            // Load parameters from app.config.
            // Get documents from queue.
            var files = someInterface.GetFiles(someFilterOrRegexPattern);
            foreach (var file in files)
            {
                // Extract metadata from the file.
                // Validate some attributes of the file; add any validation errors to an
                // in-memory structure (e.g. List<ValidationErrors>).
                if (isValid)
                {
                    var fileData = File.ReadAllBytes(file);
                    // Upload using some wrapper for an ORM or DAL
                    someInterface.Upload(fileData, meta.Param1, meta.Param2, ...);
                }
                else
                {
                    // Bounce the file
                }
            }
            // Report any validation errors (via message bus or SMTP or some such).
        }
    And that's pretty much it. Most of the time I wrap all these operations in a "Worker" class that takes the needed interfaces as constructor parameters. This approach has worked reasonably well, but I just get this feeling in my gut that there's something awful about it and would love some feedback. Is writing an ETL process as a C# console app a bad idea? I'm also wondering if there are some design patterns that would be useful in this scenario that I'm clearly overlooking. Thanks in advance!
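    For what it's worth, here is a rough sketch of the "Worker" arrangement the question describes, with the needed interfaces arriving as constructor parameters. The interface and member names (IDocumentQueue, IContentUploader, EtlWorker) are invented for illustration and are not from the original post.

        using System.Collections.Generic;
        using System.IO;

        // Hypothetical abstraction over the networked queue folder.
        public interface IDocumentQueue
        {
            IEnumerable<string> GetFiles(string pattern);
        }

        // Hypothetical abstraction over the DbAmp/SFDC Content Library upload.
        public interface IContentUploader
        {
            void Upload(byte[] fileData, string libraryName, string title);
        }

        public class EtlWorker
        {
            private readonly IDocumentQueue queue;
            private readonly IContentUploader uploader;

            // The worker receives its dependencies through the constructor,
            // which keeps the console Main method down to wiring and a single call.
            public EtlWorker(IDocumentQueue queue, IContentUploader uploader)
            {
                this.queue = queue;
                this.uploader = uploader;
            }

            public void Run(string pattern)
            {
                foreach (var file in queue.GetFiles(pattern))
                {
                    // Metadata extraction and validation would happen here.
                    var data = File.ReadAllBytes(file);
                    uploader.Upload(data, "CustomerLibrary", Path.GetFileName(file));
                }
            }
        }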

    Read the article

  • Wolfram is out, any alternatives? Or how to go custom?

    - by Patrick
    We were originally planning on using the Wolfram Alpha API for a new project, but unfortunately the cost was entirely too high for what we were using it for. Essentially what we were doing is calculating the nutrition facts for food (http://www.wolframalpha.com/input/?i=chicken+breast+with+broccoli). Before taking the step of trying to build something that may work in its place for this use case, is there any open source code anywhere that can do this kind of analysis and compile the data? The hardest part, in my opinion, is what it uses for assumptions and where it gets the data to power the calculations. Another way to put it: I cannot seem to wrap my head around building something that computes user input to return facts and knowledge. I know that if I can convert the user input into some standardized form, I can then compare it to a nutrition-fact database to pull in the information I need. Does anyone know of any solutions to re-create this, or APIs that can provide this kind of analysis? Thanks for any advice. I am trying to figure out if this project is dead in the water before it even starts. This kind of programming is well beyond me, so I can only hope for an API, open source code, or some kind of analysis engine to interpret user input when I know what kind of data they are entering (measurements and food).

    Read the article

  • Work Item Traceability in TFS 2010

    - by Sam Patrick
    I have created a Windows Forms project (VS solution) under a TFS 2010 project. I may eventually add more solutions to the TFS project. My question: can we create a Use Case WIT for a specific solution within a TFS project? Furthermore, is it possible to create a "traceability matrix" that starts at the Use Case level and goes down to the code level (at least the namespace level) of that particular VS solution?

    Read the article
