Search Results

Search found 2628 results on 106 pages for 'lexical analysis'.

Page 20/106 | < Previous Page | 16 17 18 19 20 21 22 23 24 25 26 27  | Next Page >

  • Why is code quality not popular?

    - by Peter Kofler
    I like my code to be in order, i.e. properly formatted, readable, designed, tested, checked for bugs, etc. In fact I am fanatical about it. (Maybe even more than fanatical...) But in my experience, actions that help code quality are hardly ever implemented. (By code quality I mean the quality of the code you produce day to day. The whole topic of software quality, with development processes and such, is much broader and not the scope of this question.) Code quality does not seem popular. Some examples from my experience:

    Probably every Java developer knows JUnit, and almost all languages have xUnit frameworks, but in all the companies I know, only very few proper unit tests existed (if at all). I know that it's not always possible to write unit tests due to technical limitations or pressing deadlines, but in the cases I saw, unit testing would have been an option. If a developer wanted to write some tests for his/her new code, he/she could have done so. My conclusion is that developers do not want to write tests.

    Static code analysis is often played around with in small projects, but not really used to enforce coding conventions or find possible errors in enterprise projects. Usually even compiler warnings like potential null pointer access are ignored.

    Conference speakers and magazines talk a lot about EJB 3.1, OSGi, cloud and other new technologies, but hardly about new testing technologies or tools, new static code analysis approaches (e.g. SAT solving), development processes that help maintain higher quality, or how some nasty beast of legacy code was brought under test. (I did not attend many conferences, and it probably looks different for conferences on agile topics, as unit testing, CI and such have a higher value there.)

    So why is code quality so unpopular/considered boring?

    EDIT: Thank you for your answers. Most of them concern unit testing (which has been discussed in a related question). But there are lots of other things that can be used to keep code quality high (see related question). Even if you are not able to use unit tests, you could use a daily build, add some static code analysis to your IDE or development process, try pair programming, or enforce reviews of critical code.

    Read the article

  • /analyze flag in Visual Studio 2010 Professional

    - by Martin
    Running Visual Studio 2008 Professional, it is possible to enable static code analysis using the /analyze flag (even though this is not supported for the Professional edition according to the documentation). In Visual Studio 2010 Professional this no longer works. Instead there is a default /analyze- flag added (one I can't find a GUI setting for). This does not work as well as the VS2008 version (or at all). Can anyone shed some light on this? What does the new /analyze- flag do, and is there any way to enable the old analysis?
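    One low-tech check (a sketch; the file name is arbitrary) is to bypass the GUI and hand the flag to the compiler directly from a Visual Studio 2010 command prompt:

        rem Compile-only, with code analysis requested explicitly
        cl /analyze /c foo.cpp

    If the Professional edition's toolset refuses the option, cl should emit a command-line warning saying it is being ignored, which at least distinguishes "the compiler no longer supports it" from "the IDE just stopped exposing it".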

    Read the article

  • EXTEND_MODEL_CASES SQL 2005 workaround

    - by user282382
    Hi, I have a time-series-based mining model in SQL 2005 Analysis Services. I understand that in 2008 you can do what-if analysis by using EXTEND_MODEL_CASES with a natural PREDICTION JOIN. I'm looking for a workaround or some method of doing the same thing with 2005. My time series has 3 inputs and one predict_only column. I'd like to use the prediction function to see what types of predictions it makes for 6-12 time intervals in the future, with inputs in a separate table. Is there any way to do this or something similar? Thanks
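    For reference, plain extrapolation (without the new inputs) does work in 2005 through PredictTimeSeries; a minimal DMX sketch, with the model name illustrative:

        // Predict time steps 6 through 12 for the Amount column
        SELECT FLATTENED
          PredictTimeSeries([My Time Series Model].[Amount], 6, 12)
        FROM [My Time Series Model]

    The missing piece in 2005 is exactly the ability to join new input rows to a time-series model, so the usual workaround is to append the hypothetical input rows to the training data and reprocess (possibly as a session mining model) rather than feed them at prediction time.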

    Read the article

  • assistance with fxcop

    - by amateur
    I am at present developing an MVC4 project that communicates with a set of WCF services, and I am setting this up in TFS Build for a team of developers. I am very much a newbie to FxCop and code analysis in general. I am currently researching it and have some questions: Is it recommended to use the rules that come with FxCop? Should it be included as a build task during builds? What is the value of it? Are there guidelines on which rules to abide by, or is it best to go with the defaults? Is it correct to run the analysis as a post-build event? I would appreciate some feedback; I am already integrating StyleCop into my build.
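    On the post-build question, one common pattern (a sketch; the install path and file names are illustrative) is to run the command-line driver against the compiled assembly and publish the XML report from the build:

        rem Post-build or TFS build step
        "%ProgramFiles%\Microsoft FxCop 1.36\FxCopCmd.exe" /f:MyMvcApp.dll /o:FxCopReport.xml /s

    Here /f: names the assembly to analyze, /o: the XML report file, and /s prints a summary; with no /r: argument, the rule assemblies that ship with FxCop are used.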

    Read the article

  • Query Execution Failed in Reporting Services reports

    - by Chris Herring
    I have some Reporting Services reports that talk to Analysis Services, and at times they fail with the following error:

        An error occurred during client rendering. An error has occurred during report processing. Query execution failed for dataset 'AccountManagerAccountManager'. The connection cannot be used while an XmlReader object is open.

    This occurs sometimes when I change selections in the filter. It also occurs when the machine has been under heavy load, and it will then consistently error until SSAS is restarted. The log file contains the following error:

        processing!ReportServer_0-18!738!04/06/2010-11:01:14:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: Query execution failed for dataset 'AccountManagerAccountManager'., ;
        Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: Query execution failed for dataset 'AccountManagerAccountManager'.
        ---> System.InvalidOperationException: The connection cannot be used while an XmlReader object is open.
          at Microsoft.AnalysisServices.AdomdClient.XmlaClient.CheckConnection()
          at Microsoft.AnalysisServices.AdomdClient.XmlaClient.ExecuteStatement(String statement, IDictionary connectionProperties, IDictionary commandProperties, IDataParameterCollection parameters, Boolean isMdx)
          at Microsoft.AnalysisServices.AdomdClient.AdomdConnection.XmlaClientProvider.Microsoft.AnalysisServices.AdomdClient.IExecuteProvider.ExecuteTabular(CommandBehavior behavior, ICommandContentProvider contentProvider, AdomdPropertyCollection commandProperties, IDataParameterCollection parameters)
          at Microsoft.AnalysisServices.AdomdClient.AdomdCommand.ExecuteReader(CommandBehavior behavior)
          at Microsoft.AnalysisServices.AdomdClient.AdomdCommand.System.Data.IDbCommand.ExecuteReader(CommandBehavior behavior)
          at Microsoft.ReportingServices.DataExtensions.AdoMdCommand.ExecuteReader(CommandBehavior behavior)
          at Microsoft.ReportingServices.OnDemandProcessing.RuntimeDataSet.RunDataSetQuery()

    Can anyone shed light on this issue?

    Read the article

  • schedule backup and restore of SSAS 2008 database

    - by Manjot
    Hi, I can back up and restore databases on Microsoft SQL Server Analysis Services 2008 using the GUI, as described in Backup SSAS. I want to schedule a backup and restore it to another server every night, so what I did is: I scripted out the backup and restore process from the GUI, created a new SQL Server Agent job in the database engine, added a "Run SSAS query" step, and copied the scripts into this step. But it fails. The scripts that the GUI produced look like:

        <Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <Object>
            <DatabaseID>DB</DatabaseID>
          </Object>
          <File>C:\Backup\DB.abf</File>
          <AllowOverwrite>true</AllowOverwrite>
        </Backup>

        <Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <File>\\server\C$\Backup\DB.abf</File>
          <DatabaseName>DB</DatabaseName>
          <AllowOverwrite>true</AllowOverwrite>
        </Restore>

    Any help please?
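    One thing worth double-checking (an assumption based on the symptoms, not a verified fix): SQL Server Agent has two SSAS step types, "SQL Server Analysis Services Query" (MDX) and "SQL Server Analysis Services Command" (XMLA). Backup/Restore scripts are XMLA, so they need the Command step type, and each step runs against the server named in the step's connection, so the backup and the restore likely belong in two separate steps pointing at the two different servers. If two XMLA statements ever need to share one step, they can be wrapped in a Batch element, roughly like this:

        <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <Backup>
            <Object>
              <DatabaseID>DB</DatabaseID>
            </Object>
            <File>C:\Backup\DB.abf</File>
            <AllowOverwrite>true</AllowOverwrite>
          </Backup>
        </Batch>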

    Read the article

  • What is a name for a job where you do system analysis, project management and data diagramming?

    - by David Archer
    In the last 4 months I've been able to manage a team and step away from the coding for a bit. I've been planning the system in full (both systems analysis and project management, alongside action and data diagramming), writing the technical documentation and the code's architecture, keeping track of the other guys doing the actual coding, QA, bug reports, and dealing with clients. I had to take two days' training on node.js just to see if it would be suitable for a project we were considering. Is there a name for this job? Project Manager and Systems Architect don't quite seem to cover the same ground, and IT Manager seems way off. I only want to know so that I can get some qualification towards it and try to move into this kind of work full-time.

    Read the article

  • Where do I find scripts generated by SharePoint MCMS Migration Profiles

    - by HipCzeck
    I am attempting to migrate data from a Microsoft Content Management Server (MCMS) 2002 instance into a new Microsoft Office SharePoint Server (MOSS) 2007 installation, using the Manage Microsoft Content Management Server Migration Profiles tool in the Operations section of MOSS Central Administration. When analyzing the profile, I receive 4 warnings, all of which may be safely ignored, but when I actually execute the migration profile, I get the same warnings and an additional error with the description:

        Line 6: Incorrect syntax near ';'.

    I have seen this error numerous times when mucking about in SQL Server and recognize it as a Transact-SQL error message, but I can't find the actual SQL statement being executed, so I cannot determine the source of the error.

    EDIT: After enabling verbose logging on the MCMS 2002 Migration category, and poring through the Unified Logging Service (ULS) logs, I received a more complete stack trace at the point of the error, and a couple more anomalies, listed below.

    Anomalies: The following is an abbreviated listing from the ULS logs around the time of the pre-migration analysis.

        01 MCMS 2002 Migration Verbose Start ConnectionCheck
        02 MCMS 2002 Migration Verbose End ConnectionCheck
        03 MCMS 2002 Migration Verbose Start DatabaseCheck
        04 MCMS 2002 Migration High Extra table SiteDeployLock will not be migrated
        05 MCMS 2002 Migration High Analysis: Extra index PK__SiteDeployLock__05D8E0BE
        06 MCMS 2002 Migration Verbose End DatabaseCheck
        07 MCMS 2002 Migration Medium Pre-migration analysis: RootCheckTask is skipped because database check is blocked.
        08 MCMS 2002 Migration Medium Pre-migration analysis: RightsGroupNameCheckTask is skipped because database check is blocked.
        09 MCMS 2002 Migration Medium Pre-migration analysis: InvalidNameCheckTask is skipped because database check is blocked.
        10 MCMS 2002 Migration Medium Pre-migration analysis: LeafNameCheckTask is skipped because database check is blocked.
        11 MCMS 2002 Migration Medium Pre-migration analysis: LeafLengthCheckTask is skipped because database check is blocked.
        12 MCMS 2002 Migration Medium Pre-migration analysis: TemplateNameCheckTask is skipped because database check is blocked.
        13 MCMS 2002 Migration Medium Pre-migration analysis: TemplateCollisionCheckTask is skipped because database check is blocked.
        14 MCMS 2002 Migration Medium Pre-migration analysis: PlaceholderCheckTask is skipped because database check is blocked.
        15 MCMS 2002 Migration Medium Pre-migration analysis: CheckedOutItemsCheckTask is skipped because database check is blocked.
        16 MCMS 2002 Migration Medium Pre-migration analysis: SubmittedItemsCheckTask is skipped because database check is blocked.
        17 MCMS 2002 Migration Medium Pre-migration analysis: DeletedItemsCheckTask is skipped because database check is blocked.
        18 MCMS 2002 Migration Medium Pre-migration analysis: UserCheckTask is skipped because database check is blocked.
        19 MCMS 2002 Migration Medium Pre-migration analysis: FileSizeCheckTask is skipped because database check is blocked.
        20 MCMS 2002 Migration Medium Pre-migration analysis: HostHeaderMapCheckTask is skipped because database check is blocked.
        21 MCMS 2002 Migration Verbose Start Server check
        22 MCMS 2002 Migration Verbose End Server check
        23 MCMS 2002 Migration Verbose Start Server emptyness check
        24 MCMS 2002 Migration Verbose End Server emptyness check
        25 MCMS 2002 Migration Medium PreMigrationAnalyzer: Dry run starts
        26 MCMS 2002 Migration Verbose CleanLockProcedure: start.
        27 MCMS 2002 Migration High CleanLockProcedure: connection system lock is null
        28 MCMS 2002 Migration Verbose Finished all tasks
        29 MCMS 2002 Migration High PreMigrationAnalyzer ends with True
        30 MCMS 2002 Migration Verbose Migration profile status is changed to AnalysisPassed

    Specifically, the two High level alerts on lines 4 and 5 are reflected in the migration report as warnings when running pre-migration analysis or running the migration profile. In addition, two other warnings appear in the migration report indicating two tables containing data (LayoutProperty and NodeLayout) that should be empty. According to the documentation, warnings are not sufficient cause to stop migration from occurring. Other anomalies are on lines 7-20, indicating a series of tests that are skipped because the database check is blocked. The ULS doesn't give any additional warnings to indicate that the database check was blocked or exited in exceptional circumstances. After switching the profile from pre-migration analysis to exporting, there is one Medium level warning that LastChangeTime is not set or incorrect. (null). As with all the skipped test names and SQL table names from the warnings, the major search engines are unable (with the exception of LayoutProperty) to find any reference to these objects or tests. Finally, the section of the log covering the actual live migration attempt is appended below:

        01 MCMS 2002 Migration Medium LastChangeTime is not set or incorrect. (null)
        02 MCMS 2002 Migration Verbose Set export lock
        03 MCMS 2002 Migration Verbose CleanLockProcedure: start.
        04 MCMS 2002 Migration Verbose CleanLockProcedure: end.
        05 MCMS 2002 Migration Verbose Prepare for export
        06 MCMS 2002 Migration Verbose Open connection...
        07 MCMS 2002 Migration Verbose Create temporary stored procedures
        08 MCMS 2002 Migration Verbose Create temporary tables...
        09 MCMS 2002 Migration Verbose Initialize temporary tables...
        10 MCMS 2002 Migration Verbose InitializeTemporaryTables: start
        11 MCMS 2002 Migration Verbose Initialize export table...
        12 MCMS 2002 Migration Verbose InitializeExportTable: start
        13 MCMS 2002 Migration Verbose CleanLockProcedure: start.
        14 MCMS 2002 Migration Verbose CleanLockProcedure: end.
        15 MCMS 2002 Migration High Migration throws exception: Line 6: Incorrect syntax near ';'.. Stacktrace: at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async) at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe) at System.Data.SqlClient.SqlCommand.ExecuteNonQuery() at Microsoft.SharePoint.Publishing.Internal.Administration...
        16 MCMS 2002 Migration High ....MigrationBatchCommand.ExecuteImmediate(String command) at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationBatchCommand.ExecuteWaitingCommands() at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationDBSerializer.SerializeSelectedExportObject(StringCollection objectAttribs) at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationDataAccess.InitializeExportTable(ScopeType scopeType) at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationDataAccess.InitializeTemporaryTables(DateTime lastChangeTime) at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationDataAccess.InitializeDatabase(DateTime lastChangeTime, Boolean isAnalysis, SqlConnection connection) at Microsoft.SharePoint.Publishing.Internal.Admin...
        17 MCMS 2002 Migration High ...stration.MigrationDataAccess.InitializeDatabase(DateTime lastChangeTime, Boolean isAnalysis) at Microsoft.SharePoint.Publishing.Administration.ContentMigration.Export(MigrationDataAccess dataAccess) at Microsoft.SharePoint.Publishing.Administration.ContentMigration.MigrateInternal().
        18 MCMS 2002 Migration Verbose MigrationProfile: GetInstance. Start.
        19 MCMS 2002 Migration Verbose MigrationProfile: GetInstance. End.
        20 MCMS 2002 Migration Verbose Migration profile status is changed to Failed

    The stack trace of the failed parsing of the SQL command appears on lines 15-17. A cleaner version of the stack trace is appended below.

    Full Stack Trace:

        Migration throws exception: Line 6: Incorrect syntax near ';'..
          at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
          at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
          at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
          at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async)
          at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)
          at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
          at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationBatchCommand.ExecuteImmediate(String command)
          at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationBatchCommand.ExecuteWaitingCommands()
          at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationDBSerializer.SerializeSelectedExportObject(StringCollection objectAttribs)
          at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationDataAccess.InitializeExportTable(ScopeType scopeType)
          at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationDataAccess.InitializeTemporaryTables(DateTime lastChangeTime)
          at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationDataAccess.InitializeDatabase(DateTime lastChangeTime, Boolean isAnalysis, SqlConnection connection)
          at Microsoft.SharePoint.Publishing.Internal.Administration.MigrationDataAccess.InitializeDatabase(DateTime lastChangeTime, Boolean isAnalysis)
          at Microsoft.SharePoint.Publishing.Administration.ContentMigration.Export(MigrationDataAccess dataAccess)
          at Microsoft.SharePoint.Publishing.Administration.ContentMigration.MigrateInternal()

    None of this log information indicates the SQL command that is failing the parser check. I've checked the SQL servers hosting the source and destination databases for a trace of the query, but neither seems to have triggered the parse failure condition. That appears to have happened on the SharePoint server. Are there any other locations I should investigate that might tell me where to find the source of the error?

    Read the article

  • Findbugs and comparing

    - by Rob Goodwin
    I recently started using the FindBugs static analysis tool in a Java build I was doing. The first report came back with loads of High priority warnings. Being the obsessive type of person, I was ready to go knock them all out. However, I must be missing something. I get most of the warnings when comparing things, such as in the following code:

        public void setSpacesPerLevel(int value)
        {
            if (value >= 0)
            {
                ...

    This produces a High priority warning at the if statement that reads:

        File: Indenter.java, Line: 60, Type: BIT_AND_ZZ, Priority: High, Category: CORRECTNESS
        Check to see if ((...) & 0) == 0 in sample.Indenter.setSpacesPerLevel(int)

    I am comparing an int to an int, which seems like a common thing to do, and I get quite a few errors of that type from similarly simple comparisons. I have a lot of other High priority warnings on what appear to be simple code blocks. Am I missing something here? I realize that static analysis can produce false positives, but the errors I am seeing seem too trivial a case to be false positives. This one has me scratching my head as well:

        for (int spaces = 0; spaces < spacesPerLevel; spaces++) { ...

    This gives the following FindBugs warning:

        File: Indenter.java, Line: 160, Type: IL_INFINITE_LOOP, Priority: High, Category: CORRECTNESS
        There is an apparent infinite loop in sample.Indenter.indent()
        This loop doesn't seem to have a way to terminate (other than by perhaps throwing an exception).

    Any ideas? So basically I have a handful of files and 50-60 High priority warnings similar to the ones above. I am using FindBugs 1.3.9 and calling it from the FindBugs Ant task.
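    For what it's worth, nonsensical warnings on code this trivial often mean the analyzed .class files do not match the source: stale build output, or bytecode from a compiler newer than the FindBugs release understands. A minimal Ant invocation against freshly compiled classes, for comparison (a sketch; the paths and the taskdef classpath are illustrative):

        <taskdef name="findbugs"
                 classname="edu.umd.cs.findbugs.anttask.FindBugsTask"
                 classpath="${findbugs.home}/lib/findbugs-ant.jar"/>

        <findbugs home="${findbugs.home}" output="xml" outputFile="findbugs.xml">
          <sourcePath path="src"/>
          <class location="build/classes"/>
        </findbugs>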

    Read the article

  • Should I amortize scripting cost via bytecode analysis or multithreading?

    - by user18983
    I'm working on a game sort of thing where users can write arbitrary code for individual agents, and I'm trying to decide the best way to divide up computation time. The simplest option would be to give each agent a set amount of time and skip its turn if that elapses without an action being decided upon, but I would like people to be able to write their agents' decision functions without having to think too much about how long they take unless they really want to. The two approaches I'm considering are giving each agent a set number of bytecode instructions (taking cost into account) each timestep and making players deal with the consequences of the game state changing between blocks of computation (as with Battlecode), or giving each agent its own thread and giving each thread equal time on the processor. I'm about equally knowledgeable on both concurrency and bytecode stuff, which is to say not very, so I'm wondering which approach would be best. I have a clearer idea of how I'd structure things if I used bytecode, but less certainty about how to actually implement the analysis. I'm pretty sure I can work up a concurrency-based system without much trouble, but I worry it will be messier, with more overhead, and will add unnecessary complexity to the project.
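    For the bytecode route, the accounting itself is straightforward once the interpreter is instrumented; a minimal sketch of the per-tick budget idea, with all names hypothetical:

        # Each agent gets a fixed instruction budget per game tick; an agent
        # that exhausts its budget simply resumes from the same point next tick.
        BUDGET_PER_TICK = 10_000

        def step_agent(agent):
            budget = BUDGET_PER_TICK
            while budget > 0 and not agent.has_decided():
                op = agent.fetch_next_op()   # next bytecode instruction
                budget -= op.cost            # weighted cost for this opcode
                agent.execute(op)
            return agent.decision()          # None if the budget ran out

    One property this buys over per-agent threads is determinism: charging costs instruction by instruction gives the same result on every run, while "equal time on the processor" depends on OS scheduling and varies between runs.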

    Read the article

  • Performance comparison of Dictionaries

    - by Hun1Ahpu
    I'm interested in the performance characteristics (big-O analysis) of the Lookup and Insert operations for the .NET dictionary types: Hashtable, SortedList, StringDictionary, ListDictionary, HybridDictionary, NameValueCollection. A link to a web page with the answer works for me too.
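    From memory of the MSDN class library documentation (worth double-checking there, since this is a summary rather than a citation):

        Hashtable, StringDictionary, NameValueCollection (per key): O(1) average lookup and insert (hash-based)
        SortedList: O(log n) lookup (binary search), O(n) insert (elements shift in the underlying array)
        ListDictionary: O(n) lookup and insert (a linked list, recommended only for roughly 10 items or fewer)
        HybridDictionary: acts as a ListDictionary while small, then switches to a Hashtable as it grows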

    Read the article

  • Non-linear regression models in PostgreSQL using R

    - by Dave Jarvis
    Background

    I have climate data (temperature, precipitation, snow depth) for all of Canada between 1900 and 2009. I have written a basic website, and the simplest page allows users to choose a category and city. They then get back a very simple report (without the parameters and calculations section). The primary purpose of the web application is to provide a simple user interface so that the general public can explore the data in meaningful ways. (A list of numbers is not meaningful to the general public, nor is a website that provides too many inputs.) The secondary purpose of the application is to provide climatologists and other scientists with deeper ways to view the data. (Using too many inputs, of course.)

    Tool Set

    The database is PostgreSQL with R (mostly) installed. The reports are written using iReport and generated using JasperReports.

    Poor Model Choice

    Currently, a linear regression model is applied against annual averages of daily data. The linear regression model is calculated within a PostgreSQL function as follows:

        SELECT
          regr_slope( amount, year_taken ),
          regr_intercept( amount, year_taken ),
          corr( amount, year_taken )
        FROM temp_regression
        INTO STRICT slope, intercept, correlation;

    The results are returned to JasperReports using:

        SELECT
          year_taken, amount, year_taken * slope + intercept,
          slope, intercept, correlation, total_measurements
        INTO result;

    JasperReports calls into PostgreSQL using the following parameterized analysis function:

        SELECT
          year_taken, amount, measurements, regression_line, slope,
          intercept, correlation, total_measurements, execute_time
        FROM climate.analysis(
          $P{CityId}, $P{Elevation1}, $P{Elevation2}, $P{Radius},
          $P{CategoryId}, $P{Year1}, $P{Year2} )
        ORDER BY year_taken

    This is not an optimal solution because it gives the false impression that the climate is changing at a slow but steady rate.

    Questions

    Using functions that take two parameters (e.g., year [X] and amount [Y]), such as PostgreSQL's regr_slope: What is a better regression model to apply? What CRAN packages provide such models? (Installable, ideally, using apt-get.) How can the R functions be called within a PostgreSQL function? If no such functions exist: What parameters should I try to obtain for functions that will produce the desired fit? How would you recommend showing the best-fit curve? Keep in mind that this is a web app for use by the general public. If the only way to analyse the data is from an R shell, then the purpose has been defeated. (I know this is not the case for most R functions I have looked at so far.) Thank you!
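    On the "calling R from a PostgreSQL function" part: the usual route is the PL/R procedural language, which maps PostgreSQL arrays to R vectors. A minimal sketch (assumes the plr extension is installed; the function and column names are illustrative, and loess is just one candidate for a smooth non-linear fit):

        -- Fit a loess curve to (year, amount) pairs and return the fitted values
        CREATE OR REPLACE FUNCTION climate.loess_fit(years float8[], amounts float8[])
        RETURNS float8[] AS $$
          df  <- data.frame(year = years, amount = amounts)
          fit <- loess(amount ~ year, data = df)   # local polynomial regression
          predict(fit, newdata = df)               # one fitted value per input year
        $$ LANGUAGE 'plr';

    For parametric alternatives, nls() in base R or gam() from the mgcv package (packaged on Debian/Ubuntu as r-cran-mgcv, so apt-get works) are common choices; JasperReports can then plot the returned fitted values alongside the raw series just as it plots the regression line today.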

    Read the article

  • Critiquing PHP-code / PerlCritic for PHP?

    - by jeekl
    I'm looking for an equivalent of PerlCritic for PHP. PerlCritic is a static source code analyzer that critiques code and warns about everything from unused variables, to unsafe ways of handling data, to almost anything. Is there such a thing for PHP that could (preferably) be run outside of an IDE, so that source code analysis could be automated?

    Read the article

  • Base 128 or 256 Encoding for the Binary Lexical Octet Adhoc Transport Protocol?

    - by Randolpho
    I'm in the process of implementing a network driver for the Binary Lexical Octet Adhoc Transport (BLOAT) protocols in the hopes of replacing the TCP/UDP/IP stack with a much more flexible XML structure. BLOAT is detailed in RFC 3252, so if you're unfamiliar with the protocol I highly recommend you read the entire RFC before providing any comments. Don't worry, it's short and sweet; you might even enjoy it. Anyway, my problem is this: BLOAT requires that the payload be Base64 encoded which doesn't make sense to me. I mean, sure, it's the internet standard for binary payloads, but there are better, more efficient encodings available: Base128 and Base256, for example. That the RFC requires Base64 and doesn't allow for any other payload encoding really bothers me. To that end, I'm considering a small optional change to the protocol. Embrace and extend, right? Anyway, I'd like to modify the payload element to accept an encoding attribute, which can extend the encoding to Base128 or Base256, or even to other encodings I can't conceive of at the moment. If the encoding attribute isn't present, Base64 would be assumed. So my question is this: should I? I mean, BLOAT is an accepted standard, even if it isn't exactly omnipresent. If I make this change, will there be compatibility issues? I don't foresee any, but perhaps you, oh great Stack Overflow Community, can? If I do implement this change, should I contact the original RFC author? Should I offer a supplemental RFC?
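    For what it's worth, the proposed extension would presumably look something like this on the wire (the element and attribute spelling here is illustrative, not taken from the RFC's schema):

        <payload encoding="base256">...raw octets here...</payload>
        <payload>QmFzZTY0IGJ5IGRlZmF1bHQ=</payload>  <!-- no attribute: Base64 assumed -->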

    Read the article

  • Analysis Services (SSAS) - Unexpected Internal Error when processing (ProcessUpdate). Workaround/Resolution

    - by James Rogers
    Many implementations require the use of ProcessUpdate to support Type 1 slowly changing dimensions. ProcessUpdate drops all of the affected indexes and aggregations in partitions affected by data that changes in the dimension on which the ProcessUpdate is being performed. Twice now I have had situations where the processing fails with "Internal error: An unexpected exception occurred." Any subsequent ProcessUpdate processing will also fail with the same error. In talking with Microsoft, the issue is corrupt indexes for the dimension(s) being processed in the partitions of the affected measure group. I cannot guarantee that the following will correct your problem, but it did in my case and saved us quite a bit of downtime.

    Workaround: ProcessIndexes on the entire cube that is being processed and throwing the error. This corrected the problem on both 2008 and 2008 R2.

    Pros: Does not require a complete rebuild of the data (ProcessFull) for either the dimension or the cube. User access can continue while the ProcessIndexes is underway.

    Cons: Can take a long time, especially on large cubes with many partitions, dimensions and/or aggregations. Query performance is usually severely impacted due to the memory and CPU requirements for aggregation and index building.

        <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <Parallel>
            <Process xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                     xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2"
                     xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2"
                     xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100"
                     xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200"
                     xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200">
              <Object>
                <DatabaseID>MyDatabase</DatabaseID>
                <CubeID>MyCube</CubeID>
              </Object>
              <Type>ProcessIndexes</Type>
              <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
            </Process>
          </Parallel>
        </Batch>

    The cube where the corruption exists can be found by having Profiler running while the ProcessUpdate is executing. The first partition that displays the "The Job has ended in failure." message in the TextData column will be part of the cube/measure group that has the corruption. You can try to run ProcessIndexes on just that measure group. This may correct the problem and save additional time if you have other large measure groups in the cube that are not affected by the corruption.

    Remember to execute your normal ProcessUpdate batch after the successful completion of the ProcessIndexes. The ProcessIndexes does not pick up data changes.

    Things that did not work: ProcessClearIndexes (why this doesn't work while ProcessIndexes does is unclear at this point), and ProcessFull on the partition in question. In my latest case, ProcessFull would clear up the problem for that partition; however, the next partition the ProcessUpdate touched that had data in it would generate an error. This leads me to believe the corruption will exist in all partitions in the affected measure group that have data in them.

    NOTE: I experienced this problem in both a SQL 2008 and a SQL 2008 R2 Analysis Services environment, on separate builds from the same relational database. This leads me to believe that some data condition in the tables used for the dimension processing caused the corruption, since the two environments were on physically separate hardware. I am waiting on Microsoft to analyze the dumps to give us more insight into what actually caused the corruption, and will update this post accordingly.

    Read the article

  • Can you authenticate into SSAS with AD LDS (ADAM) accounts?

    - by Jaxidian
    I'm very new to AD LDS and experienced but not qualified with SSAS, so my apologies for my ignorance with these. We have a couple of implementations where we expose SSAS via an HTTPS proxy (msmdpump.dll), and currently we have a temporary domain set up to handle this (where our end-users have a second account and credentials to manage because of it, which is non-ideal). I want to move us towards a more permanent solution, which I'm thinking means moving all authentication to AD LDS for our web apps, SSAS, and the rest. However, SSAS is where I'm concerned about this. I know SSAS requires Windows Authentication to play nicely, and that this ultimately means Active Directory will be involved. Is there a way to get this done with AD LDS instead of having to use a full AD DS implementation? If so, how? (Note: My question over at StackOverflow had a suggestion that I post this question here on ServerFault instead. My apologies if I'm not asking in the right forum.)

    Read the article

  • What consequences to take from what i read in logfiles?

    - by Helene Bilbo
    For some weeks now I have been managing my first web server, a Seaside application behind an Apache proxy on Linode, and I installed logwatch to send me daily logs. Where can I get information on when I have to act on what I read in these logwatch reports? For example, I read that all kinds of people try to log in to funny nonexistent accounts, all kinds of webcrawlers test for nonexistent CMS login pages, and some IP addresses get banned and unbanned by fail2ban... I assume that's normal? Is it? But how do I know when I actually have to do something? What do I look for in the logs?

    Read the article

  • What is the best/easiest way to use scripts to analyze network traffic?

    - by yungin
    I'm looking to analyze packets via scripts. I'd like to use something high-level. I'm in a Mac/Linux environment. I'm currently looking at different Python+libpcap libraries, perhaps Lua+Wireshark too. Maybe tcpdump+bash (but I'm not sure that gives a lot of info I can use). I also heard good things about Scapy. Not sure. I'm wondering if you have any recommendations? There are quite a few options out there. What have you found that works best? I'd definitely want something scriptable, not something that I need to compile (like C/C++, etc.).
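    Of the options listed, Scapy is probably the highest-level; a minimal sketch covering both live capture and pcap-file analysis (assumes Scapy is installed and you have capture privileges, e.g. via sudo):

        from scapy.all import sniff, rdpcap

        def show(pkt):
            print(pkt.summary())                  # one-line summary per packet

        sniff(count=10, filter="tcp", prn=show)   # live capture with a BPF filter
        pkts = rdpcap("capture.pcap")             # or load a tcpdump/Wireshark capture
        print(len(pkts), "packets in file")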

    Read the article

  • Log application changes made to the system

    - by Maxim Veksler
    Hello, Windows 7, 64-bit. I have an application which I don't trust but still need to run. I would like to run the installer of this application, and later on the installed executable, under some kind of "strace" for Windows which will record what the application did to the system. Mainly: What files have been created or edited? What registry changes have been made? To what network hosts did the application try to communicate? Ideally I would also be able to generate an "UNDO" action to undo all the changes. Please don't suggest full virtualization solutions such as VirtualBox, VMware and co., because the application should run on the host system (a "sandbox" approach will OTOH be accepted, IMHO). Do you know of any such utility I can use? Thank you, Maxim.

    Read the article

  • How expensive to run PC 24/7 or how to figure out how to determine it?

    - by jasondavis
    I realize this question is difficult to answer, as it would differ based on the user's location, what their PC is doing, and what hardware it consists of, along with other factors, but I am hoping someone could give me a very rough estimate. I have always run many PCs in my home 24/7, and I am just now looking at it from a cost-of-electricity point of view. 1) I live in Central Florida. Can anyone guesstimate the average monthly or daily cost of running your average PC? Intel quad-core processor, 1 SSD for OS and programs, and 4-5 1-2 TB hard drives in a RAID setup for data. 750 W PSU. What would your guess be? 2) Also, is there an accurate way to figure this out (nothing super-technical and confusing to a non-math person, please)? I have seen those Kill A Watt devices; do they figure this kind of stuff out for you? 3) Does a larger PSU make your PC consume more power? Thanks for any help; you can most likely tell I am somewhat lost about this!
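    For question 2, the arithmetic is just watts × hours × rate; a worked example under assumed numbers (say the machine averages 250 W at the wall, a guess since actual draw depends on load, and electricity at roughly $0.12 per kWh):

        0.25 kW × 24 h/day = 6 kWh/day
        6 kWh/day × $0.12/kWh ≈ $0.72/day, or about $22/month

    A Kill A Watt sits between the plug and the wall outlet and reads watts and accumulated kWh directly, which removes the guesswork; and note that a 750 W PSU rating is a maximum capacity, not what the PC actually draws.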

    Read the article
