Search Results

Search found 9975 results on 399 pages for 'enterprise architecture'.

  • DAL, Session, Cache architecture

    - by subt13
    I'm attempting to create a data access layer (DAL) for my web application. Currently, all DataTables are stored in the session. When it's finished, the DAL will populate and return DataTables. Is it a good idea to store the returned DataTables in the session? In a distributed/shared cache? Or should I just hit the database each time? Note: the number of rows in a DataTable will generally be small (< 2000). Thanks
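
    A minimal sketch of the cache-aside option mentioned above, using ASP.NET's built-in per-server cache; the class, key and 10-minute expiry are illustrative assumptions, not from the post:

        using System;
        using System.Data;
        using System.Web;
        using System.Web.Caching;

        public class CustomerDal
        {
            public DataTable GetCustomers()
            {
                const string key = "Customers";
                // Serve from cache when present...
                var cached = HttpRuntime.Cache[key] as DataTable;
                if (cached != null)
                    return cached;

                // ...otherwise hit the database once and cache the result.
                DataTable fresh = LoadCustomersFromDatabase();
                HttpRuntime.Cache.Insert(key, fresh, null,
                    DateTime.Now.AddMinutes(10), Cache.NoSlidingExpiration);
                return fresh;
            }

            private DataTable LoadCustomersFromDatabase()
            {
                return new DataTable("Customers"); // placeholder for the real query
            }
        }

    Note that HttpRuntime.Cache lives in one app domain: on a web farm a genuinely distributed cache would be needed, while the session ties the data to one user and inflates memory per visitor.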

    Read the article

  • Design/Architecture Advice Needed

    - by Rachel
    Summary: I have different components on the homepage, and each component shows some promotion to the user. I have the cart as one component, and promotions are shown depending on the content of the cart. I have to track the user's online activities and send that information to Omniture for report generation.

    Now, my components are loaded asynchronously (when an Ajax request fires), so there is no fixed pattern, or rather no information on when a component will appear on the page. To pass information to Omniture, I need to call the track function on $(document).ready() and append information for each component (Omniture requires 7 parameters per component). So in the init:config function of each component I call Omniture and pass the parameters, but now the number of Omniture calls is directly proportional to the number of components on the page. This is not acceptable, because each call to Omniture is very expensive. I am looking for a way to club the information about the 7 parameters together and then make one call to Omniture passing all of it. Note that I do not know when the components are loaded, so there is no predefined time or number of components. The catch is that I call the track function when the document is ready, but components are loaded after the call to Omniture has been made.

    So my question is: how can I collect the information for all the components and then make just one call to Omniture to send it? As mentioned, I do not know when the components are loaded, since that happens on Ajax requests. I hope I've been able to explain my challenge, and I would appreciate it if someone could provide design/architecture solutions for it.

    Read the article

  • Sharp Architecture mapping error

    - by fez
    This is the error I get when I load the Product entity. My code is:

        Configuration cfg = new NHibernate.Cfg.Configuration();
        ISessionFactory sessions;

        public MedicineController() // constructor
        {
            cfg.Configure();
            sessions = cfg.BuildSessionFactory();
        }

        using (var session = sessions.OpenSession())
        {
            var pGet = session.Get<Product>(0);
        }

    The error is:

        Unable to locate persister for the entity named 'SharpArchitecture.Domain.Product'.
        The persister define the persistence strategy for an entity.
        Possible causes:
        - The mapping for 'SharpArchitecture.Domain.Product' was not added to the NHibernate configuration.

    Thanks in advance
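
    The usual cause is that the session factory was built without the Product mapping. One common fix, assuming classic *.hbm.xml mappings embedded in the domain assembly (the assembly choice here is an assumption), is to register that assembly before building the factory:

        Configuration cfg = new NHibernate.Cfg.Configuration();
        cfg.Configure();
        cfg.AddAssembly(typeof(Product).Assembly); // pulls in the embedded Product mapping
        ISessionFactory sessions = cfg.BuildSessionFactory();

    Also worth noting: S#arp Architecture normally bootstraps NHibernate through its own initializer (with Fluent NHibernate auto-mapping), so building a raw Configuration by hand may itself be why the mapping never gets registered.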

    Read the article

  • [Architecture] Roles for white-label service access.

    - by saurabhj
    Okay, I know I'm doing something wrong, but I can't figure out a better way. I am developing a website which is going to allow users to set up their own mini-websites, something like Ning. Also, I have only one basic login, and access to each mini-website is provided (right now) via roles. The way I am doing this right now: every time a new mini-website is created, say "blah", I create two roles in my application, blah_users and blah_admin. The user creating the mini-website is given the role blah_admin, and every other user wanting to join this mini-website (or network) is given the role blah_user. Anyone can view data from any mini-website; however, to add data, one must be a member of that site (must have the blah_user role assigned). The problem I am facing is that with this role-based system I'm having to do loads of things manually. The ASP.NET 2 controls which work on the User.IsAuthenticated property are basically useless to me now, because along with the IsAuthenticated property I must also check whether the user has the proper role. I'm guessing there is a better way to architect the system, but I am not sure how. Any ideas? This website is being developed in ASP.NET 2 on IIS 6. Thanks a tonne!
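
    A minimal sketch of one common alternative: keep the single login, and model site membership as data instead of minting a role pair per site. The SiteMembership shape and method names below are assumptions for illustration, not from the post:

        using System.Collections.Generic;

        public class SiteMembership
        {
            public string UserName;
            public string SiteName;
            public bool IsAdmin;
        }

        public class MembershipService
        {
            // Stand-in for a SiteMembership table keyed by (UserName, SiteName).
            private readonly List<SiteMembership> _memberships = new List<SiteMembership>();

            public void Join(string user, string site, bool asAdmin)
            {
                _memberships.Add(new SiteMembership { UserName = user, SiteName = site, IsAdmin = asAdmin });
            }

            public bool CanAddData(string user, string site)
            {
                return _memberships.Exists(m => m.UserName == user && m.SiteName == site);
            }

            public bool CanAdminister(string user, string site)
            {
                return _memberships.Exists(m => m.UserName == user && m.SiteName == site && m.IsAdmin);
            }
        }

    This keeps User.IsAuthenticated meaningful on its own, and each per-site check becomes one query instead of a role list that grows with every site created.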

    Read the article

  • N-Tier Architecture - Structure with multiple projects in VB.NET

    - by focus.nz
    I would like some advice on the best approach to use in the following situation. I will have a Windows application and a web application (presentation layers); these will both access a common business layer. The business layer will look at a configuration file to find the name of the DLL (data layer), to which it will create a reference at runtime (is this the best approach?). The reason for creating the reference to the data access layer at runtime is that the application will interface with a different third-party accounting system depending on what the client is using. So I would have a separate data access layer to support each accounting system. These could be separate setup projects; each client would use one or the other, and they wouldn't need to switch between the two.

    Projects:

      MyCompany.Common.dll - contains interfaces; all other projects have a reference to this one
      MyCompany.Windows.dll - Windows Forms project, references MyCompany.Business.dll
      MyCompany.Web.dll - website project, references MyCompany.Business.dll
      MyCompany.Business.dll - business layer, references MyCompany.Data.* (at runtime)
      MyCompany.Data.AccountingSys1.dll - data layer for accounting system 1
      MyCompany.Data.AccountingSys2.dll - data layer for accounting system 2

    The project MyCompany.Common.dll would contain all the interfaces; each other project would have a reference to it:

        Public Interface ICompany
            ReadOnly Property Id() As Integer
            Property Name() As String
            Sub Save()
        End Interface

        Public Interface ICompanyFactory
            Function CreateCompany() As ICompany
        End Interface

    The projects MyCompany.Data.AccountingSys1.dll and MyCompany.Data.AccountingSys2.dll would contain classes like the following:

        Public Class Company
            Implements ICompany

            Protected _id As Integer
            Protected _name As String

            Public ReadOnly Property Id() As Integer Implements MyCompany.Common.ICompany.Id
                Get
                    Return _id
                End Get
            End Property

            Public Property Name() As String Implements MyCompany.Common.ICompany.Name
                Get
                    Return _name
                End Get
                Set(ByVal value As String)
                    _name = value
                End Set
            End Property

            Public Sub Save() Implements MyCompany.Common.ICompany.Save
                Throw New NotImplementedException()
            End Sub
        End Class

        Public Class CompanyFactory
            Implements ICompanyFactory

            Public Function CreateCompany() As ICompany Implements MyCompany.Common.ICompanyFactory.CreateCompany
                Return New Company()
            End Function
        End Class

    The project MyCompany.Business.dll would provide the business rules and retrieve data from the data layer:

        Public Class Companies
            Public Shared Function CreateCompany() As ICompany
                Dim factory As New MyCompany.Data.CompanyFactory()
                Return factory.CreateCompany()
            End Function
        End Class

    Any opinions/suggestions would be greatly appreciated.
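
    For the runtime-binding step the poster asks about, here is a minimal sketch (in C#, though the VB.NET equivalent is direct) of resolving the configured factory via reflection; the appSettings key and type name are illustrative assumptions:

        using System;
        using System.Configuration;
        using MyCompany.Common;

        public static class DataLayerLocator
        {
            public static ICompanyFactory CreateCompanyFactory()
            {
                // e.g. <add key="DataLayerFactory"
                //           value="MyCompany.Data.CompanyFactory, MyCompany.Data.AccountingSys1" />
                string typeName = ConfigurationManager.AppSettings["DataLayerFactory"];
                Type factoryType = Type.GetType(typeName, true); // throw if the DLL or type is missing
                return (ICompanyFactory)Activator.CreateInstance(factoryType);
            }
        }

    The business layer then depends only on ICompanyFactory from MyCompany.Common, and swapping accounting systems means shipping a different DLL and config line, not recompiling.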

    Read the article

  • ASP.NET Program Architecture

    - by Pino
    I've just taken on a new ASP.NET MVC application, and after opening it up I find the following projects:

      [Project].Web
      [Project].Models
      [Project].BLL
      [Project].DAL

    Now, something that has become clear is that the data has to go through a hell of a lot before it makes it to the view (Database -> DAL -> Repo -> BLL -> ConvertToModel -> Controller -> View). The DAL is SubSonic; the repositories in the DAL return the SubSonic entities to the BLL, which processes them, does crazy things, and converts them into a model (from .Models), sometimes with methods that look like this:

        public DataModel GetDataModel(EntityObject src)
        {
            var returnData = new DataModel();
            returnData.ID = src.ID;
            returnData.Name = src.Name;
            // etc. etc.
            return returnData;
        }

    Now, the question is: is this complete overkill? OK, the project is of a decent size and can only get bigger, but is it worth carrying on with all this? I don't want to use AutoMapper, as it just seems to make the complication worse. Can anyone shed any light on this?

    Read the article

  • Architecture of an image hosting site

    - by kamziro
    I'm sure many here are aware of image hosting sites like imgur, min.us, photobucket, etc. Not that I want to develop one, but beyond just uploading the file and organising it in some directory somewhere, what architectural considerations are involved in these sites, especially when there are millions of page views a day (as I'd imagine imgur gets)? I'm curious about this because it seems that a lot of sites (say, dating websites) are pretty image-intensive. Even if it's not for millions of page views, what are some basic architectural requirements of efficient image delivery online?

    Read the article

  • C# (WCF) architecture: file and directory structure (and instantiation)

    - by stevenrosscampbell
    Hello, and thanks for any assistance. I have a WCF service that I'm trying to properly modularize, and I'm interested in finding out whether there is a better way of laying out the files and directories, along with instantiation. Is there a more appropriate abstraction I may be missing? Is this the best approach, especially regarding performance and the ability to handle thousands of simultaneous requests? Currently I have the following structure:

        // Root\Service.cs
        public class Service : IService
        {
            public void CreateCustomer(Customer customer)
            {
                CustomerService customerService = new CustomerService();
                customerService.Create(customer);
            }

            public void UpdateCustomer(Customer customer)
            {
                CustomerService customerService = new CustomerService();
                customerService.Update(customer);
            }
        }

        // Root\Customer\CustomerService.cs
        public class CustomerService
        {
            public void Create(Customer customer)
            {
                // DO SOMETHING
            }

            public void Update(Customer customer)
            {
                // DO SOMETHING
            }

            public void Delete(int customerId)
            {
                // DO SOMETHING
            }

            public Customer Retrieve(int customerId)
            {
                // DO SOMETHING
                return null;
            }
        }

    Note: I do not include the Customer object or the data access libraries in this example, as I am only concerned with the service. If you could let me know what you think, suggest a better way, or point to a resource that could help out, that would be great. Thanks kindly. Steven
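
    One sketch of a more testable arrangement, assuming an interface plus constructor injection (both are my additions, not from the post); note that WCF needs a custom instance provider or a DI container's WCF integration to call a non-default constructor:

        public interface ICustomerService
        {
            void Create(Customer customer);
            void Update(Customer customer);
        }

        public class Service : IService
        {
            private readonly ICustomerService _customers;

            // The concrete CustomerService is supplied from outside rather than
            // constructed inside every operation.
            public Service(ICustomerService customers)
            {
                _customers = customers;
            }

            public void CreateCustomer(Customer customer)
            {
                _customers.Create(customer);
            }

            public void UpdateCustomer(Customer customer)
            {
                _customers.Update(customer);
            }
        }

    On the performance worry: constructing a small stateless class per call is cheap; throughput will usually be dominated by the WCF instancing and concurrency modes and by the data access, not by these allocations.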

    Read the article

  • Reporting System architecture for better performance

    - by pauloya
    Hi, we have a product that runs SQL Server Express 2005 and uses mainly ASP.NET. The database has around 200 tables, with a few (4 or 5) that can grow by 300 to 5000 rows per day and keep a history of 5 years, so they can grow to 10 million rows. We have built a reporting platform that allows customers to build reports based on templates, fields and filters. We have faced performance problems almost from the beginning. We try to keep report display times under 10 seconds, but some go up to 25 seconds (especially for customers with a long history). We keep checking indexes and trying to improve the queries, but we get the feeling there's only so much we can do. Of course, the fact that the queries are generated dynamically doesn't help with the optimization. We also added a few tables that keep redundant data, but then we have the added problem of keeping this data up to date, and SQL Express has a limit on database size. We are now at a point where we have to decide whether to give up real-time reports, or perhaps cut the history, to get better performance. I would like to ask what the recommended approach is for this kind of system. Also, should we start looking at third-party tools/platforms? I know OLAP can be an option, but can we make it work on SQL Server Express, or at least with a license cheap enough to distribute to thousands of deployments? Thanks

    Read the article

  • Project Architecture for Enhancing a Legacy Project

    - by vijay.shad
    I am working on a legacy project, and the situation now demands that the project be divided into parts. What strategy should I follow for this task?

    Description: The legacy project (A) is a fully functional web application with fairly well-defined layers. Now I need to extend the project with a further enhancement. The project uses Maven as its build tool, but only for dependency management (the project is exported to a WAR from inside Eclipse). The new enhancement requires me to add new data tables and new UI (JSP, CSS, JS and images). What should my strategy be for enhancing the application?

    My proposed design: I am planning to create two new projects.

      Project B: The main enhancement work will be done here. It will have all the layers (service layer, DAO layer and UI layer) in itself, and will be a web application in its own right.
      Project C: Extract some common model and service code from project A into this project, which will be added as a dependency to both of the other projects.

    If this approach is okay, then I presume there will be a problem with deployment. The two projects will want to be deployed separately (Tomcat is currently used), but I must deploy them as one WAR, so I need a plan for changing the web.xml entries to hold the configuration for both projects. This brings some more complexity to the project. What should my design be? Does my plan sound good?

    Read the article

  • What architecture for implementing a rich-text editor?

    - by genesys
    Hi! Can someone give me some hints on what a clean implementation (design-wise) of a rich-text editor could look like, one that allows for things like setting fonts, setting character colors and so on? And when and how are characters rendered: are characters rendered only once, with the bitmap representation cached? Is there any article or book covering what software design would be appropriate for that? The background is that we're working on text-editing software for a language that cannot be displayed with Unicode. Any hint is appreciated! Thanks!
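
    A minimal sketch of one common document model for this: text as a sequence of styled "runs", so formatting a selection means splitting runs at its boundaries. The type and field names are illustrative assumptions, not from any particular editor:

        using System.Collections.Generic;

        public class TextStyle
        {
            public string FontFamily;
            public float FontSizePoints;
            public int ColorArgb;
            public bool Bold;
        }

        public class TextRun
        {
            public string Text;     // contiguous characters sharing one style
            public TextStyle Style;
        }

        public class Paragraph
        {
            // "Make this selection red" = split the runs at the selection's
            // start and end offsets, then restyle the runs in between.
            public List<TextRun> Runs = new List<TextRun>();
        }

    On rendering: editors commonly rasterize each distinct (glyph, font, size) combination once into a cached glyph bitmap or atlas and blit from it, re-running layout only for the damaged lines; that matters even more when the script isn't covered by Unicode and the shaping is custom.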

    Read the article

  • Questions about the MVC architecture

    - by ah123
    I started coding a considerably complicated web application, and it became quite a mess, so I figured I'd try to organize it in a better way. MVC seemed appropriate. I've never used MVC before, and by researching it I'm trying to build a better perception of it (my questions obviously reflect what I think I've learned so far). My questions are slightly JavaScript-oriented:

      1. Which object should make "AJAX" requests, the controller or the model? (Separation: should the model just store/manipulate the data and not care/know where the data came from, or should it be the one fetching it?)
      2. Should the model call view functions, providing them with data as arguments, or should the view query (reference) the model within itself? (With separation principles in mind, "the view shouldn't care/know where it gets the data from" - is that correct?)
      3. In general, should the view "know" of the model's existence, and vice versa? Is the controller the only thing gluing them together, or is that simply incorrect? (I really doubt that statement is generally correct.)

    There's a good chance I'll want to port this into a desktop/mobile application, so I would like to separate components in a way that allows me to achieve that: replacing the current source of the data (HTTP requests) with DB access, and replacing the view. Maybe every approach I've asked about is still "valid" MVC and it's just up to me to choose. I understand that nothing is set in stone; I'm just trying to get a (better) general idea in my head.

    Read the article

  • Getting started with modern software architecture and design using a book

    - by bitbonk
    I am a rather old-school developer with some basic knowledge of software design principles and a good background in the classic (GoF) design patterns. While I continue my life as such, I see lots of strange buzzwords emerge: aspect-oriented design, component-oriented design, domain-driven design, domain-specific languages, service-oriented (SOA) design, test-driven development, extreme programming, agile development, continuous integration, dependency injection, software factories... Is there a good book around that I can take with me on a road trip that would take me through all (or most) of the above, delivering a 10,000-foot view of modern software architecture and design principles and approaches?

    Read the article

  • Architecture Guidance Needed?

    - by vijay
    We are about to automate a number of processes for our reporting team (daily reports, weekly reports, monthly reports, etc.). Mostly the process amounts to pulling some data from Oracle and then filling in particular Excel template files. Each report, and so each template, is different from the others. Apart from the Excel file manipulation, there is hardly any business logic behind these. The client wants an integrated tool in which all the automated processes appear as menus/submenus. Right now roughly 30 processes are waiting to be automated, and we are expecting more new reports in the next quarter.

    I am nowhere near having any practical experience when it comes to architecting. I have already been maintaining two or three systems (all more than 4 years old) for this prestigious client, and the tool mentioned above will very likely be maintained for another 3 years. From my past experience I've been through the pain of implementing change requests against a rigid and undocumented code base, resulting in the breakdown of the system and, eventually, myself. So my main and topmost concern is maintainability. While searching around this I came across the link Smart Clients Using CAB and SCSF - is that appropriate for my requirement? Also, should I place each automated process in a separate form under a single project, or place them in separate projects under a single solution? Please correct me if I have missed any other important information. Thx.
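
    Whatever shell is chosen, a hedged sketch of the maintainability angle: put every automated report behind one small interface, so the menu is built by discovery and a new report never touches existing code. The interface and the reflection scan below are illustrative assumptions, not part of CAB/SCSF:

        using System;
        using System.Collections.Generic;
        using System.Reflection;

        public interface IReportTask
        {
            string MenuPath { get; }   // e.g. "Reports/Daily/Sales" (hypothetical)
            void Run();                // pull from Oracle, fill the Excel template
        }

        public static class ReportCatalog
        {
            // Find every concrete IReportTask in an assembly and instantiate it.
            public static IEnumerable<IReportTask> LoadAll(Assembly assembly)
            {
                foreach (Type t in assembly.GetTypes())
                {
                    if (typeof(IReportTask).IsAssignableFrom(t) && t.IsClass && !t.IsAbstract)
                        yield return (IReportTask)Activator.CreateInstance(t);
                }
            }
        }

    With this shape, "separate projects under one solution" versus "separate forms in one project" becomes mostly a packaging decision; each report stays an isolated class either way.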

    Read the article

  • Objective-C Plugin Architecture Security (Mac, not iphone)

    - by Tom Dalling
    I'm possibly writing a plugin system for a Cocoa application (Mac, not iPhone). A common approach is to make each plugin a bundle, then inject the bundle into the main application. I'm concerned about the security implications of doing this, as the bundle will have complete access to the Objective-C runtime. I am especially concerned about a plugin having access to the code that handles registration and serial keys. Another plugin system we are considering is based on distributed notifications: each plugin would be a separate process, and they would communicate via distributed notifications only. Is there a way to load bundles securely (e.g. sandboxing)? If not, do you see any problems with using distributed notifications? Are there any other plugin architectures that would be better?

    Read the article

  • Python architecture question

    - by tom smith
    Hi. I'm creating a distributed crawling Python app. It consists of a master server and associated client apps that will run on client servers. The purpose of a client app is to run across a targeted site to extract specific data. The clients need to go "deep" within the site, behind multiple levels of forms, so each client is specifically geared towards a given site. Each client app looks something like:

        main:
            parse initial url
            call function level1(data1)

        function level1(data):
            parse the url for data1
            use the required XPath to get the DOM elements
            call the next function: level2(data)

        function level2(data):
            parse the url for data2
            use the required XPath to get the DOM elements
            call the next function: level3(data)

        function level3(data):
            parse the url for data3
            use the required XPath to get the DOM elements
            call the next function: level4(data)

        function level4(data):
            parse the url for data4
            use the required XPath to get the DOM elements
            -- at the final function, all the data is output and eventually returned to the server
            -- at this point the data has elements from each function

    My question: given that the number of calls the current function makes to the child function varies, I'm trying to figure out the best approach. Each function essentially fetches a page of content and then parses the page using a number of different XPath expressions, combined with different regular expressions depending on the site/page. If I run a client on a single box as a sequential process, it'll take a while, but the load on the box is rather small. I've thought of implementing the child functions as threads spawned from the current function, but that could be a nightmare, as well as quickly bring the box to its knees! I've also thought of breaking the app up so that the master essentially passes packets to the client boxes, allowing each client/function to be run directly from the master. This requires a bit of a rewrite, but it has a number of advantages: a bunch of redundancy, and speed. It would detect if a section of the process crashed and restart from that point. But I'm not sure it would be any faster... I'm writing the parsing scripts in Python. So... any thoughts/comments would be appreciated. I can go into a great deal more detail, but didn't want to bore anyone! Thanks! tom

    Read the article

  • Codeigniter MVC controller architecture

    - by justinbach
    I'm building a site using CodeIgniter that largely consists of static content (although there will be a relatively small CMS backend, and there's code to handle localization/internationalization based on the domain used to access it). Typically, in a situation like this, I'd use a Pages controller that is in charge of rendering static content, but as there are a fair number of pages on the site (30+) it'd quickly end up containing lots of methods (assuming one per page). Should I break my Pages controller into multiple controllers (that perhaps inherit from it) according to different sections of the site? Should I organize methods differently in the Pages controller? What's the best practice here? Thanks! Justin

    Read the article

  • Best Practices / Patterns for Enterprise Protection/Remediation of SSNs (Social Security Numbers)

    - by Erik Neu
    I am interested in hearing about enterprise solutions for SSN handling. (I looked pretty hard for any pre-existing post on SO, including reviewing the terrific SO automated "Related Questions" list, and did not find anything, so hopefully this is not a repeat.)

    First, I think it is important to enumerate the reasons systems/databases use SSNs (note: these are reasons for the de facto current state; I understand that many of them are not good reasons):

      1. Required for interaction with external entities. This is the most valid case, where external entities your system interfaces with require an SSN. These would typically be government, tax and financial.
      2. SSN is used to ensure system-wide uniqueness.
      3. SSN has become the default foreign key used internally within the enterprise to perform cross-system joins.
      4. SSN is used for user authentication (e.g., log-on).

    The enterprise solution that seems optimal to me is to create a single SSN repository that is accessed by all applications needing to look up SSN info. This repository substitutes a globally unique, random 9-digit number (ASN) for the true SSN. I see many benefits to this approach. First of all, it is obviously highly backwards-compatible; all your systems "just" have to go through a major, synchronized, one-time data-cleansing exercise where they replace the real SSN with the alternate ASN. Also, it is centralized, so it minimizes the scope for inspection and compliance. (Obviously, as a negative, it also creates a single point of failure.)

    This approach would solve issues 2 and 3 without ever requiring lookups to get the real SSN. For issue 1, authorized systems could provide an ASN and be returned the real SSN. This would of course be done over secure connections, and the requesting systems would never persist the full SSN. Also, if the requesting system only needs the last 4 digits of the SSN, then that is all that would ever be passed. Issue 4 could be handled the same way as issue 1, though obviously the best thing would be to move away from having users supply an SSN for log-on.

    There are a couple of papers on this:

      UC Berkeley: http://bit.ly/bdZPjQ
      Oracle Vault: bit.ly/cikbi1
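
    A minimal sketch of the repository's surface as described above; the interface and method names are illustrative assumptions, not from the post:

        public interface ISsnVault
        {
            // One-time cleansing and new records: store the real SSN,
            // receive the random 9-digit ASN that all systems keep instead.
            string Tokenize(string ssn);

            // Reverse lookup for issue 1: only authorized systems, only over
            // secure connections, and callers never persist the result.
            string GetSsn(string asn);

            // Partial disclosure when the caller needs just the tail.
            string GetLastFourDigits(string asn);
        }

    Access control, auditing and rate limiting would all live behind this one surface, which is what makes the centralized design easier to inspect for compliance.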

    Read the article

  • .NET Architecture challenge: The Change-prone Frankenstein Model

    - by SDReyes
    Good morning, SO! We've been scratching our heads over this interesting scenario at the office, and we're anxious to hear your ideas and approaches.

    We have a database whose schema is prone to changes; let's call it Prony. (It is used to store configuration parameters for embedded devices, so if the embedded-devices guy needs a new table, property or relationship in the model, he should be able to adapt the schema in an easy way, which happens quite often.) Prony needs a web interface for creating/editing its data. We have another database containing data that also needs to be loaded onto the devices after some transformations; let's call this one Oddy (its data is generated by an already-existing administrative web application). Finally we have Tracy, a server that connects our DBs and our embedded devices. She should auto-adapt herself to our DBs' schema changes and serialize the data to the devices. Nice puzzle, don't you think? :)

    Our current candidate, Rady (the fast one): create some views in Prony that perform the data transformation from Oddy, then use Dynamic Data (or some RAD tool) to create/update a simple web interface for Prony (so the transformed data can even be consulted there :) ). As for Tracy, she will need to be recompiled to update her DB schema (Entity Framework should work) and use reflection to explore the schema recursively and serialize the data.

    Cons: we would have to recompile Tracy and Prony's web interface. What do you think of the candidate(s)? What would you do?
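
    For the reflection step, a minimal sketch of what "explore the schema and serialize" might look like: walk an entity's public properties at runtime so a new column doesn't require new serialization code. The output format and names are illustrative assumptions:

        using System;
        using System.Reflection;
        using System.Text;

        public static class ReflectiveSerializer
        {
            public static string Serialize(object entity)
            {
                Type type = entity.GetType();
                var sb = new StringBuilder();
                sb.AppendLine(type.Name);
                foreach (PropertyInfo p in type.GetProperties(BindingFlags.Public | BindingFlags.Instance))
                {
                    // Skip indexers; emit name=value for everything readable.
                    if (p.CanRead && p.GetIndexParameters().Length == 0)
                        sb.AppendLine(p.Name + "=" + (p.GetValue(entity, null) ?? ""));
                }
                return sb.ToString();
            }
        }

    Nested entities would need recursion plus cycle detection, and Tracy still has to be recompiled whenever the Entity Framework model is regenerated; reflection only removes the hand-written serializer from the change surface.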

    Read the article

  • N-tier architecture and unit tests (using Java)

    - by Alexandre FILLATRE
    Hi there, I'd like your expert explanations on an architectural question. Imagine a Spring MVC webapp with the validation API (JSR 303). For a request, I have a controller that handles it, then passes it to the service layer, which passes it to the DAO layer. Here's my question: at which layer should validation occur, and how? My thought is that the controller has to handle basic validation (are mandatory fields empty? is the field length OK? etc.). Then the service layer can do some trickier stuff that involves other objects. The DAO does no validation at all. BUT, if I want to implement some unit tests (i.e. test the layers below the services, not the controllers), I'll end up with unexpected behavior, because some validations should have been done in the controller layer, and since we don't exercise it in unit tests, there is a problem. What is the best way to deal with this? I know there is no universal answer, but your personal experience is very welcome. Thanks a lot. Regards.

    Read the article

  • ASP.NET plug-in architecture, settings problem

    - by Xaqron
    I want to divide the business layer (BLL) of an ASP.NET application into multiple components. Each component is a .NET class library compiled as a standalone DLL. These components should have their own configuration files. For example, "MyNameSpace.Users.dll" contains classes about the users of the website, and there's a password policy that checks whether a password's length is at least x characters. When the webmaster edits the config file of this DLL and sets x to y, the component (DLL) should use the new value (y) from then on and enforce passwords of at least y characters. I want each component to be a single project, compiled separately (rather than putting all projects in one Visual Studio solution), and to drop the DLLs of these libraries into the "Bin" folder of my ASP.NET application. Is this possible? Where should I put those config files?
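
    A minimal sketch of one way a class library can read its own side-by-side config file (requires a reference to System.Configuration; the file-naming convention and the default value are assumptions, not from the post):

        using System;
        using System.Configuration;
        using System.IO;
        using System.Reflection;

        public static class ComponentConfig
        {
            public static int GetMinPasswordLength()
            {
                // ASP.NET shadow-copies assemblies, so look for the file next to
                // the original DLL in Bin, e.g. "MyNameSpace.Users.dll.config".
                string fileName = Assembly.GetExecutingAssembly().GetName().Name + ".dll.config";
                var map = new ExeConfigurationFileMap
                {
                    ExeConfigFilename = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "bin", fileName)
                };
                Configuration config =
                    ConfigurationManager.OpenMappedExeConfiguration(map, ConfigurationUserLevel.None);

                KeyValueConfigurationElement setting = config.AppSettings.Settings["MinPasswordLength"];
                int value;
                if (setting != null && int.TryParse(setting.Value, out value))
                    return value;
                return 8; // assumed fallback when the file or key is absent
            }
        }

    Re-reading on every call (or watching the file for changes) is what makes the webmaster's edit take effect without an application restart; caching the parsed value behind a file-change check is a common middle ground.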

    Read the article
