Search Results

Search found 40 results on 2 pages for 'fabiano francesconi'.


  • SQL Server Intellisense VS. Red Gate SQL Prompt

    Fabiano Amorim is hooked on today's Integrated Development Environments with built-in Intellisense, so he looked forward keenly to SQL Server 2008's native intellisense. He was disappointed at how it turned out, so he turned instead to SQL Prompt. Fabiano explains why he prefers SQL Prompt, why he reckons it fits in with the way that database developers work, and goes on to describe some of the features he'd like to see in it.

    Read the article

  • Showplan Operator of the Week - Compute Scalar

    The third part of Fabiano's mission to describe the major Showplan Operators used by SQL Server's Query Optimiser continues with the 'Compute Scalar' operator. Fabiano shows how a tweak to SQL to avoid a 'Compute Scalar' step can improve its performance.

    Read the article

  • Showplan Operator of the Week - Lazy Spool

    Continuing to illuminate the depths of SQL Server's Query Optimizer, Fabiano shines a light on the sixth major Showplan Operator on his list: the Lazy Spool. What does the Lazy Spool do that's so special, how does the Query Optimizer use it, and why is it so Lazy? Fabiano explains all...

    Read the article

  • Showplan Operator of the Week - Merge Interval

    When Fabiano agreed to undertake the epic task of describing each showplan operator, none of us quite predicted the interesting ways that the series helps to understand how the query optimizer works. With the Merge Interval, Fabiano comes up with some insights about the way that the Query Optimizer handles overlapping ranges efficiently.

    Read the article

  • Operator of the week - Assert

    - by Fabiano Amorim
    Well my friends, I was wondering how to help you in a practical way to understand execution plans, so I think I'll talk about the Showplan Operators. Showplan Operators are used by the Query Optimizer (QO) to build the query plan in order to perform a specified operation. A query plan will consist of many physical operators. The Query Optimizer uses a simple language that represents each physical operation by an operator, and each operator is represented in the graphical execution plan by an icon. I'll try to talk about one operator every week, but so as to avoid having to continue to write about these operators for years, I'll mention only those that are more common, the first being the Assert.

    The Assert is used to verify a certain condition: it validates a constraint on every row to ensure that the condition was met. If, for example, our DDL includes a check constraint which specifies only two valid values for a column, the Assert will, for every row, validate the value passed to the column to ensure that the input is consistent with the check constraint.

    Assert and Check Constraints
    Let's see where SQL Server uses that information in practice. Take the following T-SQL:

      IF OBJECT_ID('Tab1') IS NOT NULL
        DROP TABLE Tab1
      GO
      CREATE TABLE Tab1(ID Integer, Gender CHAR(1))
      GO
      ALTER TABLE TAB1 ADD CONSTRAINT ck_Gender_M_F CHECK(Gender IN('M','F'))
      GO
      INSERT INTO Tab1(ID, Gender) VALUES(1,'X')
      GO

    For the command above, SQL Server generated the following execution plan. As we can see, the execution plan uses the Assert operator to check that the inserted value doesn't violate the check constraint. In this specific case, the Assert applies the rule 'if the value is different from "F" and different from "M" then return 0, otherwise return NULL'. The Assert operator is programmed to show an error if the returned value is not NULL; in other words, if the value being inserted is not an "M" or an "F".

    Assert checking Foreign Keys
    Now let's take a look at an example where the Assert is used to validate a foreign key constraint. Suppose we have this query:

      ALTER TABLE Tab1 ADD ID_Genders INT
      GO
      IF OBJECT_ID('Tab2') IS NOT NULL
        DROP TABLE Tab2
      GO
      CREATE TABLE Tab2(ID Integer PRIMARY KEY, Gender CHAR(1))
      GO
      INSERT INTO Tab2(ID, Gender) VALUES(1, 'F')
      INSERT INTO Tab2(ID, Gender) VALUES(2, 'M')
      INSERT INTO Tab2(ID, Gender) VALUES(3, 'N')
      GO
      ALTER TABLE Tab1 ADD CONSTRAINT fk_Tab2 FOREIGN KEY (ID_Genders) REFERENCES Tab2(ID)
      GO
      INSERT INTO Tab1(ID, ID_Genders, Gender) VALUES(1, 4, 'X')

    Let's look at the text execution plan to see what these Assert operators are doing. To see the text execution plan, just execute SET SHOWPLAN_TEXT ON before running the insert command.

      |--Assert(WHERE:(CASE WHEN NOT [Pass1008] AND [Expr1007] IS NULL THEN (0) ELSE NULL END))
           |--Nested Loops(Left Semi Join, PASSTHRU:([Tab1].[ID_Genders] IS NULL), OUTER REFERENCES:([Tab1].[ID_Genders]), DEFINE:([Expr1007] = [PROBE VALUE]))
                |--Assert(WHERE:(CASE WHEN [Tab1].[Gender]<>'F' AND [Tab1].[Gender]<>'M' THEN (0) ELSE NULL END))
                |    |--Clustered Index Insert(OBJECT:([Tab1].[PK]), SET:([Tab1].[ID] = RaiseIfNullInsert([@1]),[Tab1].[ID_Genders] = [@2],[Tab1].[Gender] = [Expr1003]), DEFINE:([Expr1003]=CONVERT_IMPLICIT(char(1),[@3],0)))
                |--Clustered Index Seek(OBJECT:([Tab2].[PK]), SEEK:([Tab2].[ID]=[Tab1].[ID_Genders]) ORDERED FORWARD)

    Here we can see the Assert operator twice: the first (reading from bottom to top in the text plan, and right to left in the graphical plan) is validating the check constraint. The same concept shown above applies: when the Assert expression returns NULL the query keeps running, but when it returns a non-NULL value an exception is raised. The second Assert is validating the result of the join between Tab1 and Tab2. It is interesting to see the "[Expr1007] IS NULL". To understand that, you need to know what this Expr1007 is: look at the Probe Value in the text plan and you will see that it is the result of the join. If the value passed to the INSERT at the column ID_Genders exists in the table Tab2, then that probe will return the join value; otherwise it will return NULL. So the Assert is checking the result of the lookup against Tab2; if the value passed to the INSERT is not found, the Assert will raise an exception. If the value passed to the column ID_Genders is NULL, the constraint is not violated, so the Assert lets the row through and the query keeps running.

    If you run the INSERT above, SQL Server will raise an exception because of the "X" value; if you change the "X" to "F" and run it again, it will raise an exception because of the value "4". If you change the value "4" to NULL, 1, 2 or 3, the insert will be executed without any error.

    Assert checking a SubQuery
    The Assert operator is also used to check a subquery. As we know, a scalar subquery can't validly return more than one value; sometimes, however, a mistake happens and a subquery attempts to return more than one value. Here the Assert comes into play by validating the condition that a scalar subquery returns just one value. Take the following query:

      INSERT INTO Tab1(ID_TipoSexo, Sexo) VALUES((SELECT ID_TipoSexo FROM Tab1), 'F')

      |--Assert(WHERE:(CASE WHEN NOT [Pass1016] AND [Expr1015] IS NULL THEN (0) ELSE NULL END))
           |--Nested Loops(Left Semi Join, PASSTHRU:([tempdb].[dbo].[Tab1].[ID_TipoSexo] IS NULL), OUTER REFERENCES:([tempdb].[dbo].[Tab1].[ID_TipoSexo]), DEFINE:([Expr1015] = [PROBE VALUE]))
                |--Assert(WHERE:([Expr1017]))
                |    |--Compute Scalar(DEFINE:([Expr1017]=CASE WHEN [tempdb].[dbo].[Tab1].[Sexo]<>'F' AND [tempdb].[dbo].[Tab1].[Sexo]<>'M' THEN (0) ELSE NULL END))
                |         |--Clustered Index Insert(OBJECT:([tempdb].[dbo].[Tab1].[PK__Tab1__3214EC277097A3C8]), SET:([tempdb].[dbo].[Tab1].[ID_TipoSexo] = [Expr1008],[tempdb].[dbo].[Tab1].[Sexo] = [Expr1009],[tempdb].[dbo].[Tab1].[ID] = [Expr1003]))
                |              |--Top(TOP EXPRESSION:((1)))
                |                   |--Compute Scalar(DEFINE:([Expr1008]=[Expr1014], [Expr1009]='F'))
                |                        |--Nested Loops(Left Outer Join)
                |                             |--Compute Scalar(DEFINE:([Expr1003]=getidentity((1856985942),(2),NULL)))
                |                             |    |--Constant Scan
                |                             |--Assert(WHERE:(CASE WHEN [Expr1013]>(1) THEN (0) ELSE NULL END))
                |                                  |--Stream Aggregate(DEFINE:([Expr1013]=Count(*), [Expr1014]=ANY([tempdb].[dbo].[Tab1].[ID_TipoSexo])))
                |                                       |--Clustered Index Scan(OBJECT:([tempdb].[dbo].[Tab1].[PK__Tab1__3214EC277097A3C8]))
                |--Clustered Index Seek(OBJECT:([tempdb].[dbo].[Tab2].[PK__Tab2__3214EC27755C58E5]), SEEK:([tempdb].[dbo].[Tab2].[ID]=[tempdb].[dbo].[Tab1].[ID_TipoSexo]) ORDERED FORWARD)

    You can see from this text showplan that SQL Server has generated a Stream Aggregate to count how many rows the subquery will return. This value is then passed to the Assert, which then does its job by checking its validity. It is very interesting to see that the Query Optimizer is smart enough to avoid using Assert operators when they are not necessary. For instance:

      INSERT INTO Tab1(ID_TipoSexo, Sexo) VALUES((SELECT ID_TipoSexo FROM Tab1 WHERE ID = 1), 'F')
      INSERT INTO Tab1(ID_TipoSexo, Sexo) VALUES((SELECT TOP 1 ID_TipoSexo FROM Tab1), 'F')

    For both these INSERTs, the Query Optimizer is smart enough to know that only one row will ever be returned, so there is no need to use the Assert. Well, that's all folks; I'll see you next week with more "Operators". Cheers, Fabiano

    Read the article

  • Stereo images rectification and disparity: which algorithms?

    - by alessandro.francesconi
    I'm trying to figure out which are currently the two most efficient algorithms that, starting from an L/R pair of stereo images created with a traditional camera (so affected by some epipolar-line misalignment), produce a pair of adjusted images plus their depth information by looking at their disparity. Actually I've found lots of papers about these two methods, like:

      "Computing Rectifying Homographies for Stereo Vision" (Zhang - seems one of the best for rectification only)
      "Three-step image rectification" (Monasse)
      "Rectification and Disparity" (slideshow by Navab)
      "A fast area-based stereo matching algorithm" (Di Stefano - seems a bit inaccurate)
      "Computing Visual Correspondence with Occlusions via Graph Cuts" (Kolmogorov - this one produces a very good disparity map, with occlusion information too, but is it efficient?)
      "Dense Disparity Map Estimation Respecting Image Discontinuities" (Alvarez - toooo long for a first review)

    Could anyone please give me some advice on orienting myself in this wide topic? What kind of algorithm/method should I look at first, considering that I'll work on a very simple input: a pair of left and right images and nothing else, no other information (some papers are based on additional, pre-taken calibration info)? Speaking of working implementations, the only interesting results I've seen so far belong to this piece of software, but only for automatic rectification, not disparity: http://stereo.jpn.org/eng/stphmkr/index.html I tried the "auto-adjustment" feature and it seems really effective. Too bad there is no source code...

    Read the article

  • SQL Server Intellisense VS. Red Gate SQL Prompt

    Fabiano Amorim is hooked on today's Integrated Development Environments with built-in Intellisense, so he looked forward keenly to SQL Server 2008's native intellisense. He was disappointed at how it turned out, so he turned instead to SQL Prompt. Fabiano explains why he prefers SQL Prompt, why he reckons it fits in with the way that database developers work, and goes on to describe some of the features he'd like to see in it.

    Read the article

  • How to correctly handle HTTP Digest Authentication on iPhone

    - by Fabiano Francesconi
    Hello guys. I'm trying to upload a file onto my personal server. I've written a small PHP page that works flawlessly so far. The slightly weird thing is that I generate the whole body of the HTTP message I'm going to send (let's say it amounts to ~4 MB) and then send the request to my server. The server then asks for an HTTP challenge, and my delegate connection:didReceiveAuthenticationChallenge:challenge replies to the server with the proper credentials and the data. But what happens? The data gets sent twice! In fact, I noticed it when I added the progress bar: the app sends the data (4 MB), the server asks for authentication, and the app re-sends the data with the authentication (another 4 MB). So, in the end, I've sent 8 MB. That's wrong. I started googling and searching for a solution but I can't figure out how to fix this. The possible scenarios are two (my guess): share the realm for the whole session (a minimal HTTP request, then the challenge, then the data), or use the synchronous way to perform an HTTP connection (something I do not want to do, since it seems an ugly way to handle this kind of stuff). Thank you

    Read the article

  • Find a process by name and kill it

    - by Fabiano PS
    So, I want to send a kill to a process whose name I know:

      ps -ef | grep '_rails master'
      root   2388    1  0 19:46 ?      00:00:04 unicorn_rails master -c /web/hero/config/unicorn.rb -E production -D
      root   2582 2172  0 20:28 pts/0  00:00:00 grep --color=auto _rails master

    It is unicorn_rails master [..]. How do I kill it? So far I tried sed and expr, but I can't pass the result as a param to kill.
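    Since most of the surrounding questions are Ruby-flavoured, here is a minimal Ruby sketch of the same idea (my own assumption, not from the original post): pick the PID column out of ps -ef for the matching process and signal it.

      # Find the unicorn_rails master in `ps -ef` output and send it SIGTERM.
      line = `ps -ef`.lines.find { |l| l.include?('unicorn_rails master') }
      if line
        pid = line.split[1].to_i   # PID is the second column of `ps -ef`
        Process.kill('TERM', pid)  # use 'KILL' only if TERM is ignored
      end

    From a plain shell, kill $(pgrep -f 'unicorn_rails master') achieves the same thing.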

    Read the article

  • A query for date within a year

    - by Fabiano PS
    My table is like this on Postgres; note that every date is the first of the month, and there is only one entry per month+year.

      SELECT * FROM "fis_historico_receita"
      +----+------------+---------------+
      | id | data       | receita_bruta |
      +----+------------+---------------+
      |  1 | 2010-02-01 |      100000.0 |
      |  2 | 2010-01-01 |      100000.0 |
      |  3 | 2009-12-01 |      100000.0 |
      |  4 | 2009-11-01 |      100000.0 |
      |  5 | 2009-10-01 |      100000.0 |
      |  6 | 2009-09-01 |      100000.0 |
      |  7 | 2009-08-01 |      100000.0 |
      |  8 | 2009-07-01 |      100000.0 |
      |  9 | 2009-06-01 |      100000.0 |
      | 10 | 2009-05-01 |      100000.0 |
      | 11 | 2009-04-01 |      100000.0 |
      | 12 | 2009-03-01 |      100000.0 |
      | 13 | 2009-02-01 |      100000.0 |
      | 14 | 2009-01-01 |      100000.0 |
      | 15 | 2008-12-01 |      100000.0 |
      +----+------------+---------------+

    What I want is to find the 12 months right before the current one. I tried this:

      select * from fis_historico_receita where data in interval '1 year'

    I would really like an answer using interval; +1 goes to every answer that runs on Postgres.

    Read the article

  • How to trace WCF serialization issues / exceptions

    - by Fabiano
    Hi. I occasionally run into the problem that an application exception is thrown during WCF serialization (after returning a DataContract from my OperationContract). The only (and not very meaningful) message I get is

      System.ServiceModel.CommunicationException : The underlying connection was closed: The connection was closed unexpectedly.

    without any insight into the inner exception, which makes it really hard to find out what caused the error during serialization. Does someone know a good way to trace, log, and debug these exceptions? Or, even better, can I catch the exception, handle it, and send a defined FaultMessage to the client? Thank you

    Read the article

  • Ruby String accent error: More than meets the eye

    - by Fabiano PS
    I'm having real trouble getting accents right, and I believe this may happen with most Latin languages; in my case, Portuguese. I have a string that comes in as a parameter, and I must get the first letter and upcase it! That should be trivial in Ruby, but here is the catch:

      s1 = 'alow'; s1.size #=> 4
      s2 = 'álow'; s2.size #=> 5
      s1[0,1]              #=> "a"
      s2[0,1]              #=> "\303"
      s1[0,1].upcase       #=> 'A'
      s2[0,1].upcase       #=> '\303' !!!
      s1[0,1].upcase + s1[1,100]  #=> "Alow"  OK
      s2[0,1].upcase + s2[1,100]  #=> "álow"  NOT OK

    I'd like to make it generic. Any help?

    [EDIT] I found that Rails Strings can be cast to Multibytes, as seen in ../active_support/core_ext/string/multibyte.rb, just by using:

      s2.mb_chars[0,1].upcase.to_s #=> "Á"

    Still, @nsdk's approach is easier to use =)
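    A small sketch of how the mb_chars call from the edit could be wrapped into a generic helper (my own example; it assumes this runs inside the Rails app mentioned above, so String#mb_chars is available, and UTF-8 source data):

      # Upcase only the first character of a UTF-8 string, byte-safely.
      def capitalize_first(s)
        s.mb_chars[0, 1].upcase.to_s + s.mb_chars[1..-1].to_s
      end

      capitalize_first('álow')  # => "Álow"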

    Read the article

  • WCF: Serializing and Deserializing generic collections

    - by Fabiano
    I have a class Team that holds a generic list:

      [DataContract(Name = "TeamDTO", IsReference = true)]
      public class Team
      {
          [DataMember]
          private IList<Person> members = new List<Person>();

          public Team()
          {
              Init();
          }

          private void Init()
          {
              members = new List<Person>();
          }

          [System.Runtime.Serialization.OnDeserializing]
          protected void OnDeserializing(StreamingContext ctx)
          {
              Log("OnDeserializing of Team called");
              Init();
              if (members != null) Log(members.ToString());
          }

          [System.Runtime.Serialization.OnSerializing]
          private void OnSerializing(StreamingContext ctx)
          {
              Log("OnSerializing of Team called");
              if (members != null) Log(members.ToString());
          }

          [System.Runtime.Serialization.OnDeserialized]
          protected void OnDeserialized(StreamingContext ctx)
          {
              Log("OnDeserialized of Team called");
              if (members != null) Log(members.ToString());
          }

          [System.Runtime.Serialization.OnSerialized]
          private void OnSerialized(StreamingContext ctx)
          {
              Log("OnSerialized of Team called");
              Log(members.ToString());
          }
      }

    When I use this class in a WCF service, I get the following log output:

      OnSerializing of Team called
      System.Collections.Generic.List`1[Person]
      OnSerialized of Team called
      System.Collections.Generic.List`1[Person]
      OnDeserializing of Team called
      System.Collections.Generic.List`1[ENetLogic.ENetPerson.Model.FirstPartyPerson]
      OnDeserialized of Team called
      ENetLogic.ENetPerson.Model.Person[]

    After the deserialization, members is an array and no longer a generic list, although the field type is IList<Person> (?!). When I try to send this object back over the WCF service I get the log output

      OnSerializing of Team called
      ENetLogic.ENetPerson.Model.FirstPartyPerson[]

    After this my unit test crashes with a System.ExecutionEngineException, which means the WCF service is not able to serialize the array (maybe because it expected an IList<Person>). So, my question is: does anybody know why the type of my IList<Person> is an array after deserializing, and why I can't serialize my Team object any longer after that? Thanks

    Read the article

  • Breaking a captcha

    - by Fabiano PS
    What I'm looking for is a way to break this captcha: https://www.nfe.fazenda.gov.br/PORTAL/FormularioDePesquisa.aspx?tipoConsulta=completa Notice that the image alternates between 2 types of images. Does anyone know an OCR (optical character recognition) tool that is damn good for this? Other solutions are also accepted (except proxying). I'm not looking to spam; I just want to automate my software instead of proxying the image to the user. Nowhere does the site say it is forbidden to automatically parse the image.

    Read the article

  • performance of linq extension method ElementAt

    - by Fabiano
    Hi. The MSDN library entry for the Enumerable.ElementAt<TSource> method says "If the type of source implements IList<T>, that implementation is used to obtain the element at the specified index. Otherwise, this method obtains the specified element." Let's say we have the following example:

      ICollection<int> col = new List<int>() { /* fill with items */ };
      IList<int> list = new List<int>() { /* fill with items */ };

      col.ElementAt(10000000);
      list.ElementAt(10000000);

    Is there any difference in execution? Or does ElementAt recognize that col also implements IList<int>, although it's only declared as ICollection<int>? Thanks

    Read the article

  • Business Logic Layer Pattern on Rails? MVCL

    - by Fabiano PS
    This is a broad question, and I would appreciate no short/dumb answers like: "Oh, that is the model's job, this question is retarded (period)".

    PROBLEM: Where I work, people created a system over 2 years for managing the manufacturing process, on demand, in the most simplified yet broad way possible, involving selling, buying, and assembling. The system is coded in Ruby on Rails. The app has been changed lots of times and the result is a mess of callbacks (some are called several times), 200+ models, and fat controllers: total bad.

    The QUESTION is: is there a gem, or a pattern, designed to handle the logic of large Rails apps? The logic would be able to fully talk to models (whose only concern would be data format handling and validation). What I EXPECT is to reduce complexity, moving it out of various controllers and hard-to-track callbacks into files whose responsibility is to handle one business operation's logic. In some cases there is the need to wait for a response; in others, only validation of the input is enough and a background process would take place. For example -- sell some products (need to wait for the operation to finish):

      1. Set up a view able to get the products input
      2. The controller gets the product list input by the employee and calls the logic:

         Logic::ExecuteWithResponse('sell', 'products',
           :prods    => @product_list_with_qtt,
           :when     => @date,
           :employee => current_user() )

    This logic would handle the buying order, assembly order, machine schedule, warehouse reservation, and others. Bear in mind that a callback on SalesOrder is not enough, since it depends on where it is called (there is no field for that) and on the class of the user, among other things not visible to the model; in some cases it would also take too long for the model to process.
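    For what it's worth, the usual answer here is a plain-Ruby service (or "interactor") layer. A minimal sketch, with hypothetical names, of the kind of class Logic::ExecuteWithResponse could dispatch to:

      # app/logic/sell_products.rb -- one class per business operation.
      class SellProducts
        def initialize(products_with_qty, date, employee)
          @products = products_with_qty
          @date     = date
          @employee = employee
        end

        # Runs the whole operation atomically and returns the order,
        # so the controller only renders or enqueues the result.
        def execute
          ActiveRecord::Base.transaction do
            order = SalesOrder.create!(:employee => @employee, :date => @date)
            @products.each { |product, qty| order.reserve_stock(product, qty) }
            order
          end
        end
      end

      # In the controller:
      #   order = SellProducts.new(@product_list_with_qtt, @date, current_user).execute

    Models keep validations and data concerns; callbacks that only matter for one operation move into classes like this, which also makes them testable in isolation.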

    Read the article

  • Rails has_many with dynamic :conditions

    - by Fabiano PS
    Hello! What I want is to create a model that connects to another through a has_many in a dynamic way, without the foreign key, like this:

      has_many :faixas_aliquotas,
               :class_name => 'Fiscal::FaixaAliquota',
               :conditions => ["regra_fiscal = ?", ( lambda { return self.regra_fiscal } ) ]

    But I get the error:

      SELECT * FROM "fis_faixa_aliquota"
      WHERE ("fis_faixa_aliquota".situacao_fiscal_id = 1
        AND (regra_fiscal = E'--- !ruby/object:Proc {}'))

    Is what I want possible? I don't want a black-magic solution.
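    A hedged sketch of the classic Rails 2.x workaround (assuming regra_fiscal should be evaluated against the owning record): put the condition in a single-quoted string, so the #{...} is interpolated in the owner's context when the association is loaded, instead of passing a Proc, which, as the error shows, gets serialized literally:

      has_many :faixas_aliquotas,
               :class_name => 'Fiscal::FaixaAliquota',
               # single quotes matter: the #{} is evaluated per owning record at
               # load time; wrap the interpolation in quotes if regra_fiscal is
               # a string column
               :conditions => 'regra_fiscal = #{regra_fiscal}'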

    Read the article

  • Rails has_many with dynamic conditions

    - by Fabiano PS
    Hello! What I want is to create a model that connects with another using a has_many association in a dynamic way, without the foreign key, like this:

      has_many :faixas_aliquotas,
               :class_name => 'Fiscal::FaixaAliquota',
               :conditions => ["regra_fiscal = ?", ( lambda { return self.regra_fiscal } ) ]

    But I get the error:

      SELECT * FROM "fis_faixa_aliquota"
      WHERE ("fis_faixa_aliquota".situacao_fiscal_id = 1
        AND (regra_fiscal = E'--- !ruby/object:Proc {}'))

    Is this possible?

    Read the article

  • Regular expression that contains in it...

    - by Fabiano PS
    I need my regexp to match all the highlighted words, in order, in the following:

      Fri/Feb/10 - *Nesta* manh\303\203\302\243,* @*anestesya *entrou* \303\240s 08:18AM
      Fri/Feb/10 - *Nesta* tarde,* @*pernas *saiu* \303\240s 08:18AM

    I was trying something like:

      /[?=Nesta.?=@.?=(entrou|saiu)]/

    So I don't care what is in between, as long as it has: ' Nesta ' AND ' @' AND (' entrou ' OR ' saiu ').
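    A hedged guess at what was intended (the bracketed form above defines a character class, not a sequence of lookaheads). A small Ruby sketch:

      pattern = /Nesta.*@.*(?:entrou|saiu)/

      line = "Fri/Feb/10 - *Nesta* manha, *@*anestesya *entrou* as 08:18AM"
      puts "matched" if line =~ pattern

    If the three pieces may appear in any order, anchored lookaheads such as /(?=.*Nesta)(?=.*@)(?=.*(?:entrou|saiu))/ would be the usual alternative.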

    Read the article

  • How to test an ASP.NET frontend?

    - by Fabiano
    Hi. Does someone know a good way to automate GUI testing of an ASP.NET frontend (instead of always running the pages and testing the inputs and outputs of a control by hand)? Are there any references or frameworks to support these tests? Thanks

    Read the article

  • Execute a Rake task from within migration?

    - by Fabiano PS
    I have a Rake task that loads configuration data into the DB from a file. Is there a correct Ruby/Rails way to call it from a migration's up method? My objective is to sync my team's DB configs without having to tell everyone to run the task after migrating.

      def self.up
        change_table :fis_situacao_fiscal do |t|
          t.remove :mostrar_endereco
          t.rename :serie, :modelo
        end
        Faturamento::Cfop.destroy_all()
        # perform rake here!
      end

    BTW: could admins clean up some tags? There is 'migrations' and 'migration', same as 'ruby-on-rails' and 'rails'.
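    One common approach, as a sketch only: the task and file names below are hypothetical, and it assumes a Rake version whose task/namespace DSL is available at the top level (typical for the Rake releases of that era).

      def self.up
        change_table :fis_situacao_fiscal do |t|
          t.remove :mostrar_endereco
          t.rename :serie, :modelo
        end

        Faturamento::Cfop.destroy_all

        require 'rake'
        # Load the .rake file that defines the task, then invoke it.
        load File.join(Rails.root, 'lib', 'tasks', 'load_configs.rake')
        Rake::Task['db:load_configs'].invoke
      end

    An alternative that avoids Rake entirely is to move the loading code into a plain method (e.g. under lib/) and call it from both the Rake task and the migration.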

    Read the article

  • in which namespace / package to put exceptions

    - by Fabiano
    Hi. What is the common or best practice for structuring the location of your exception classes? Let's say you have the packages/namespaces myproject.person (models and DAOs for persons) and myproject.order (models and DAOs for orders), and the exceptions PersonException and OrderException. Should I put the exceptions in their corresponding packages or in a separate package for exceptions (e.g. myproject.exceptions)? The first approach seems more reasonable (because it's sorted by functionality), but then the question arises: where should you put exceptions that are related to both, e.g. a ConstraintViolationException? Thanks

    Read the article

  • NOT IN statement for Visual Studio's Query Builder for TableAdapter

    - by Fabiano
    Hi. I want to build a query with Visual Studio 2008's built-in Query Builder for a TableAdapter, similar to the following (MSSQL 2008):

      select * from [MyDB].[dbo].[MyView]
      where UNIQUE_ID NOT IN ('MyUniqueID1','MyUniqueID2')

    How do I have to set the filter in my query in order to call it with the myTableAdapter.GetDataExceptUniqueIds(...) function? I tried to set the filter to NOT IN (@ids) and called it with

      string[] uniqueIds = ...;
      myTableAdapter.GetDataExceptUniqueIds(String.Join("','", uniqueIds));

    and with

      StringBuilder sb = new StringBuilder("'");
      sb.Append(String.Join("','", uniqueIds));
      sb.Append("'");
      return myTableAdapter.GetDataExceptUniqueIds(sb.ToString());

    but both failed.

    Read the article

  • Business Layer Pattern on Rails? MVCL

    - by Fabiano PS
    This is a broad question, and I would appreciate no short/dumb answers like: "Oh, that is the model's job, this question is retarded (period)".

    PROBLEM: Where I work, people created a system over 2 years for managing the manufacturing process, on demand, in the most simplified yet broad way possible, involving selling, buying, and assembling. The system is coded in Ruby on Rails. It has been changed lots of times and the result is a mess of callbacks (some are called several times), 200+ models, and fat controllers: total bad.

    The QUESTION is: is there a gem, or a pattern, designed to handle the logic of large Rails apps? The logic would be able to fully talk to models (whose only concern would be data format handling and validation). What I EXPECT is to reduce complexity, moving it out of various controllers and hard-to-track callbacks into files whose responsibility is to handle one business operation's logic. In some cases there is the need to wait for a response; in others, only validation of the input is enough and a background process would take place. For example -- sell some products (need to wait for the operation to finish):

      1. Set up a view able to get the products input
      2. The controller gets the product list input by the employee and calls the logic:

         Logic::ExecuteWithResponse('sell', 'products',
           :prods    => @product_list_with_qtt,
           :when     => @date,
           :employee => current_user() )

    This logic would handle the buying order, assembly order, machine schedule, warehouse reservation, and others.

    Read the article
