Search Results

Search found 24731 results on 990 pages for 'corner case'.


  • ASP.NET MVC The new ActionLink with MvcHtmlString and request URL generated

    - by mare
    This piece of code used to work in MVC 1 but it does not since I upgraded to MVC 2: <%=Html.ActionLink(Resources.Localize.Routes_WidgetsCreate, "Create" + "?modal=true", "Widget", null, new { rel = "shadowbox;height=600;width=700", title = Resources.Localize.Routes_WidgetsCreate })%> I am aware it has something to do with the way ActionLink encodes things, so the result that comes out is something like this: "http://localhost:53704/Widget/Create%3fmodal%3dtrue" The problem is, when clicked, the Shadowbox modal opens up and inside, where the request View should be rendered is this exception: Server Error in '/' Application. A potentially dangerous Request.Path value was detected from the client (?). What can I do to get past that? Do you recommend another way of sending params to the view besides in QueryString (in this case I need "modal" because in the view I select CSS styles based on whether we are rendering modal or not)?
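
    A note on the usual fix (a minimal sketch, assuming the default {controller}/{action} route): query-string values can go through the routeValues argument instead of being appended to the action name, so ActionLink never has to encode a literal "?" or "=".

        <%= Html.ActionLink(
                Resources.Localize.Routes_WidgetsCreate,
                "Create",                          // action
                "Widget",                          // controller
                new { modal = "true" },            // not a route token, so it becomes ?modal=true
                new { rel = "shadowbox;height=600;width=700",
                      title = Resources.Localize.Routes_WidgetsCreate }) %>

    In the view, Request.QueryString["modal"] (or a bool parameter on the Create action pushed into ViewData) can then drive the CSS choice.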

    Read the article

  • JAVA Classes in Game programming.

    - by Gabriel A. Zorrilla
    I'm writing a little strategy game to help me learn Java in a fun way. I envisioned the units as objects that would draw themselves on the game map (using images and buffering) and react to mouse actions through listeners attached to them. However, in the tutorials I've been reading on basic game programming, everything seems to be drawn inside the paint method (the one that receives the Graphics object) of my Map class. If a new unit appears, I have to update that Map drawing code; it isn't as simple as creating a new Unit object that draws itself. In that case I'd end up with a pile of Map methods instead of classes that render new things. So my question is: is it possible to use classes to render units, interface objects, and so on, or will I have to create methods and end up with structured rather than object-oriented code? I'm a little confused and would like a mental blueprint of how things should be organized. Thanks!
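
    For comparison, a minimal sketch of that object-oriented layout (class and field names are made up): each Unit draws itself, and the Map component only loops over its units in paintComponent.

        import java.awt.Graphics;
        import java.awt.image.BufferedImage;
        import java.util.List;
        import javax.swing.JPanel;

        class Unit {
            private final int x, y;
            private final BufferedImage sprite;

            Unit(int x, int y, BufferedImage sprite) { this.x = x; this.y = y; this.sprite = sprite; }

            void draw(Graphics g) { g.drawImage(sprite, x, y, null); }   // the unit renders itself

            boolean contains(int px, int py) {                           // hit test for mouse listeners
                return px >= x && px < x + sprite.getWidth()
                    && py >= y && py < y + sprite.getHeight();
            }
        }

        class MapPanel extends JPanel {
            private final List<Unit> units;

            MapPanel(List<Unit> units) { this.units = units; }

            @Override protected void paintComponent(Graphics g) {
                super.paintComponent(g);
                for (Unit u : units) u.draw(g);   // adding a unit = adding an object, not a new Map method
            }
        }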

    Read the article

  • zlib memory usage / performance. With 500kb of data.

    - by unixman83
    Is zlib worth it? Are there other, better-suited compressors? I am working on an embedded system where I frequently have only 3 MB of RAM or less available to my application, so I am considering using zlib to compress my buffers, but I am concerned about the overhead. The buffers average about 30 KB, which I suspect zlib won't compress well, and I occasionally see maximum buffer sizes of 700 KB, with 500 KB much more common. Is zlib worth it in this case, or is the overhead too much to justify? Does anyone know of a good compressor for extremely memory-limited environments? My only considerations are the algorithm's RAM overhead and performance at least as good as zlib's.
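
    For what it's worth, deflate's memory use is tunable: per zlib's own documentation it needs roughly (1 << (windowBits+2)) + (1 << (memLevel+9)) bytes of internal state, so a small window and memLevel keep the compressor to a few kilobytes at the cost of ratio (30 KB buffers still compress, just less well). A sketch, with illustrative parameter choices:

        #include <string.h>
        #include <zlib.h>

        /* Hypothetical helper: compress buf into out with a small deflate state.
         * windowBits=9, memLevel=1 -> about (1<<11) + (1<<10) = 3 KB of state
         * plus fixed struct overhead, versus ~256 KB+ at the defaults. */
        int compress_small(const unsigned char *buf, size_t len,
                           unsigned char *out, size_t *out_len)
        {
            z_stream strm;
            memset(&strm, 0, sizeof(strm));
            if (deflateInit2(&strm, Z_BEST_SPEED, Z_DEFLATED, 9, 1,
                             Z_DEFAULT_STRATEGY) != Z_OK)
                return -1;

            strm.next_in   = (Bytef *)buf;
            strm.avail_in  = (uInt)len;
            strm.next_out  = out;
            strm.avail_out = (uInt)*out_len;

            int rc = deflate(&strm, Z_FINISH);   /* single-shot; out must be large enough */
            *out_len = strm.total_out;
            deflateEnd(&strm);
            return (rc == Z_STREAM_END) ? 0 : -1;
        }

    Lighter-weight options such as miniLZO also target very small work buffers, if ratio matters less than footprint.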

    Read the article

  • Floating point arithmetic is too reliable.

    - by mcoolbeth
    I understand that floating point arithmetic as performed in modern computer systems is not always consistent with real arithmetic. I am trying to contrive a small C# program to demonstrate this. eg: static void Main(string[] args) { double x = 0, y = 0; x += 20013.8; x += 20012.7; y += 10016.4; y += 30010.1; Console.WriteLine("Result: "+ x + " " + y + " " + (x==y)); Console.Write("Press any key to continue . . . "); Console.ReadKey(true); } However, in this case, x and y are equal in the end. Is it possible for me to demonstrate the inconsistency of floating point arithmetic using a program of similar complexity, and without using any really crazy numbers? I would like, if possible, to avoid mathematically correct values that go more than a few places beyond the decimal point.
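
    A sketch of a small change that makes the inequality appear without exotic values: ordinary decimal fractions such as 0.1 and 0.2 have no exact binary representation, so the two sums land on different doubles even though both print as 0.3 by default.

        static void Main(string[] args)
        {
            double x = 0, y = 0;
            x += 0.1; x += 0.2;      // accumulates to 0.30000000000000004...
            y += 0.3;                // nearest double to 0.3 is slightly below it

            Console.WriteLine("Result: " + x + " " + y + " " + (x == y));   // "0.3 0.3 False"
            Console.WriteLine("Difference: " + (x - y).ToString("R"));      // ~5.55E-17
            Console.Write("Press any key to continue . . . ");
            Console.ReadKey(true);
        }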

    Read the article

  • ASP.Net MVC Toggling IsAjaxRequest property based on file upload?

    - by Jon
    I have a form all setup to upload a file and that is working fine. However the way my form is submitted is through AJAX. The button that submits is still a type="submit" in case JS is off. When I save my form the controller determines whether the IsAjaxRequest is true and if so returns some JSON otherwise it does a RedirectToAction. When I don't specify a filepath in my input type="file" it considers IsAjaxRequest as true. If there is a filepath set then it thinks that IsAjaxRequest is false. How is it determining that? My other problem is that when it thinks IsAjaxRequest is false and does a RedirectToAction("Index") I don't actually get sent to the Index view. Thanks
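
    As background (a simplified sketch that mirrors, rather than quotes, the framework source): IsAjaxRequest() only looks for the X-Requested-With: XMLHttpRequest marker. Browsers of that era could not send files through XMLHttpRequest, so AJAX form plugins typically fall back to a real form post (often via a hidden iframe) when a file input is filled in, and that post carries no such header - which would explain the toggling. If the post is going through an iframe, a RedirectToAction would also be followed inside that iframe, which would explain why the browser never appears to navigate to Index.

        using System.Web;

        public static class AjaxProbe
        {
            // Roughly what MVC's Request.IsAjaxRequest() checks (simplified sketch):
            public static bool LooksLikeAjax(HttpRequestBase request)
            {
                return request["X-Requested-With"] == "XMLHttpRequest"
                    || (request.Headers != null &&
                        request.Headers["X-Requested-With"] == "XMLHttpRequest");
            }
        }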

    Read the article

  • Success verification of mail() function PHP

    - by Jared
    Hi, is it possible for PHP to get some kind of ping/flag back from an Exchange mail server saying "yes, the email has been sent to the intended recipient"? According to the PHP manual, the boolean returned by mail() comes with this caveat: "It is important to note that just because the mail was accepted for delivery, it does NOT mean the mail will actually reach the intended destination." Does this mean PHP can report success while there is a problem on the mail server that PHP knows nothing about, so that no email is actually sent and the user is none the wiser? TIA, Jared
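
    In short, yes: mail()'s return value only says whether PHP handed the message to the local mailer, not whether it was delivered. A short sketch of what can (and cannot) be checked from PHP, with placeholder addresses:

        <?php
        $to      = 'recipient@example.com';
        $subject = 'Hello';
        $body    = 'Test message';
        $headers = 'From: sender@example.com';

        $accepted = mail($to, $subject, $body, $headers);

        if (!$accepted) {
            // The local MTA / sendmail binary (or SMTP server on Windows) refused the message.
            error_log('mail() could not hand the message off');
        } else {
            // "Accepted for delivery" only. Failures further along surface as bounce
            // messages to the Return-Path address or in the mail server's logs,
            // never as a return value here.
        }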

    Read the article

  • Document Stored in File System Text Searching and Filtering required in ASP .Net Application

    - by Harryboy
    Hello experts, we are building a jobsite application that will store candidates' resumes, and the plan is to keep them on the file system. We need to search inside those files and return results to the user, so we are looking for the best way to implement full-text searching. So far I have come across IFilter (an API/interface) and Lucene.Net (open source), but I'm not sure either is the right solution. In the initial phase we expect around 50,000 resumes, and the approach should scale if that number grows. I'd appreciate a case study, some analysis, or your suggestions on the best way to handle this requirement (technology: ASP.NET). Thanks
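
    As one possible direction (a sketch only, written against the Lucene.Net 2.9-era API; the index path, file path and field names are invented): extract plain text from each resume (via IFilter or a document converter) and feed it into a Lucene.Net index, which copes comfortably with tens of thousands of documents.

        using System.IO;
        using Lucene.Net.Analysis.Standard;
        using Lucene.Net.Documents;
        using Lucene.Net.Index;
        using Lucene.Net.Store;

        var directory = FSDirectory.Open(new DirectoryInfo(@"C:\resume-index"));
        var analyzer  = new StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_29);
        var writer    = new IndexWriter(directory, analyzer, true, IndexWriter.MaxFieldLength.UNLIMITED);

        string extractedText = "...plain text pulled out of the resume...";   // via IFilter or similar

        var doc = new Document();
        doc.Add(new Field("path",    @"\\store\resumes\1234.doc", Field.Store.YES, Field.Index.NOT_ANALYZED));
        doc.Add(new Field("content", extractedText,               Field.Store.NO,  Field.Index.ANALYZED));
        writer.AddDocument(doc);
        writer.Optimize();
        writer.Close();

    Searching is then an IndexSearcher plus QueryParser over the "content" field; the stored "path" points back at the file to show the user.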

    Read the article

  • Pushing back a boost::ptr_vector<...>::iterator into another boost::ptr_vector?

    - by Ethan Nash
    Hi all, I have the following code (just typed in here, so it may have typos): typedef boost::ptr_vector<SomeClass> tvec; tvec v; // ... fill v ... tvec vsnap; for(tvec::iterator it = v.begin(); it != v.end(); ++it) { if((*it).anyCondition) vsnap.push_back( it ); // (*it) or &(*it) doesn't work } My problem is that I can't push_back via the iterator in any way; I just can't get a pointer out of the iterator. Is there an easy way I'm missing, or is boost::ptr_vector the wrong choice for this case? Thanks in advance.
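
    Since a ptr_vector owns its elements, pushing the same raw pointer into a second ptr_vector would end in a double delete. For a non-owning snapshot, a plain std::vector of raw pointers is usually enough - a sketch reusing the types from the question:

        #include <vector>
        #include <boost/ptr_vector.hpp>

        typedef boost::ptr_vector<SomeClass> tvec;   // SomeClass and v as in the question

        std::vector<SomeClass*> vsnap;               // the snapshot does not own anything
        for (tvec::iterator it = v.begin(); it != v.end(); ++it) {
            if (it->anyCondition)
                vsnap.push_back(&*it);               // iterator -> reference -> address
        }

    If the snapshot really must be a ptr_vector, Boost's view_clone_allocator (if memory serves) gives a non-owning variant; otherwise push_back(new SomeClass(*it)) stores an owned copy.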

    Read the article

  • IMAP authentication not working for a Gmail hosted-domain account that uses SAML authentication

    - by mscd000
    I have a Python script that interfaces with Gmail accounts and allows searches, etc. It works with normal addresses (ending in @gmail.com) but not with hosted-domain accounts. In this case authentication is done via SAML, and IMAP is enabled on the Gmail domain account. The instructions from Google on configuring IMAP only seem to apply to @gmail.com accounts. I've tried authenticating to IMAP as user and as user@admin against the host imap.gmail.com, as well as with my domain's email address, and authentication still fails. Is there a specific IMAP host for Gmail hosted-domain accounts, or some other way to get IMAP working on them? Thanks, Rodolfo
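
    For what it's worth, hosted (Google Apps) domains use the same endpoint as consumer Gmail - imap.gmail.com on port 993 - with the full address as the login name; the SAML/SSO web sign-in does not change the IMAP side, though the account still needs a Google-side password (where SSO hides it, one the admin sets). A minimal sketch with Python's imaplib, address and password being placeholders:

        import imaplib

        conn = imaplib.IMAP4_SSL('imap.gmail.com', 993)
        conn.login('someone@yourdomain.com', 'account-password')  # full address, not just "someone"
        conn.select('INBOX')
        typ, data = conn.search(None, '(SUBJECT "invoice")')
        print(data)
        conn.logout()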

    Read the article

  • How to dynamically choose two fields from a Linq query as a result

    - by Dr. Zim
    If you have a simple Linq query like: var result = from record in db.Customer select new { Text = record.Name, Value = record.ID.ToString() }; which is returning an object that can be mapped to a Drop Down List, is it possible to dynamically specify which fields map to Text and Value? Of course, you could do a big case (switch) statement, then code each Linq query separately but this isn't very elegant. What would be nice would be something like: (pseudo code) var myTextField = db.Customer["Name"]; // Could be an enumeration?? var myValueField = db.Customer["ID"]; // Idea: choose the field outside the query var result = from record in db.Customer select new { Text = myTextField, Value = myValueField };
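
    One hedged sketch of the idea, reusing the names from the question: keep each projection in a delegate chosen at runtime. The trade-off is that an ordinary Func cannot be translated to SQL, so the query switches to LINQ-to-Objects via AsEnumerable before the Select; staying server-side would need Expression<Func<...>> composition or something like Dynamic LINQ.

        // Chosen outside the query, e.g. from configuration or a switch on a field name:
        Func<Customer, string> textField  = c => c.Name;
        Func<Customer, string> valueField = c => c.ID.ToString();

        var result = db.Customer
                       .AsEnumerable()   // fetch rows, then project in memory
                       .Select(c => new { Text = textField(c), Value = valueField(c) })
                       .ToList();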

    Read the article

  • How to rotate 4 images, fading between each one?

    - by Darryl Hein
    I have 4 images, which I want to fade between each other in a loop. I have something like the following: <img src="/images/image-1.jpg" id="featureImg1" /> <img src="/images/image-2.jpg" id="featureImg2" style="display:none;" /> <img src="/images/image-3.jpg" id="featureImg3" style="display:none;" /> <img src="/images/image-4.jpg" id="featureImg4" style="display:none;" /> I am up for revisions to the HTML, although I cannot use absolute positioning in this case. I am using jQuery else where on the site, so it's available. I also need to deal with an image not being loaded right away as the images are larger.
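
    A minimal sketch of the rotation without absolute positioning (interval and fade durations are arbitrary): only one image is visible at a time, so the fade-in simply starts in the fade-out's callback, and a readiness check covers the larger, slower-loading images.

        $(function () {
            var current = 1, total = 4;
            setInterval(function () {
                var next = (current % total) + 1;
                // if the next (larger) image hasn't finished downloading yet,
                // skip this tick and try again on the next one
                if (!$('#featureImg' + next)[0].complete) { return; }

                $('#featureImg' + current).fadeOut(600, function () {
                    $('#featureImg' + next).fadeIn(600);
                });
                current = next;
            }, 5000);
        });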

    Read the article

  • How to make Visual Studio Pause after executing a console app in debug mode?

    - by Jason Dagit
    I have a collection of boost unit tests I want to run as a console application. When I'm working on the project and I run the tests I would like to be able to debug the tests and I would like to have the console stay open after the tests run. I see that if I run in release mode the console window stays up after the program exits, but in debug mode this is not the case. I do not want to add 'system("pause");' or any other hacks like reading a character to my program. I just want to make Visual Studio pause after running the tests with debugging like it would if I were running in release mode. I would also like it if the output of tests were captured in one of Visual Studio's output windows but that also seems to be harder than it should be. How can I do this? Thanks!

    Read the article

  • Selenium Grid not always using all of its registered RC's, why?

    - by BenA
    My Selenium Grid setup is as follows (all VMs) VM1 - Windows 7 x64 - Grid Hub + 2 RCs registering the default *firefox environment VM2 - Windows XP x32 - 2 RCs registering the default *firefox environment VM3 - Windows XP x32 - 2 RCs registering the default *firefox environment I'm happily using Mbunit and Gallio to drive the Grid, but my problem is that sometimes the Grid hub will stop passing executions over to 1 or more of the RCs, despite their showing available on the hub console. They seem to be happily maintaining their heartbeat back to the hub, but they're never asked to do any more work. This is after they had been executing tests earlier in the test run. Does anybody have any ideas why this should happen? In every case I've observed this behaviour, the last test an RC executed, before it then seemingly gets ignored by the hub, passed, and the session was successfully closed. Interestingly, whenever it happens to more than 1 of the RCs, its always (so far) been the pair that are running on the same VM. Yet they're managing to maintain their heartbeat, so it isn't a network connectivity problem. Any help would be greatly appreciated!

    Read the article

  • In a digital photo, detecting if a mountain is obscured by clouds.

    - by Gavin Brock
    The problem I have a collection of digital photos of a mountain in Japan. However the mountain is often obscured by clouds or fog. What techniques can I use to detect that the mountain is visible in the image? I am currently using Perl with the Imager module, but open to alternatives. All the images are taken from the exact same position - these are some samples. My naïve solution I started by taking several horizontal pixel samples of the mountain cone and comparing the brightness values to other samples from the sky. This worked well for differentiating good image 1 and bad image 2. However in the autumn it snowed and the mountain became brighter than the sky, like image 3, and my simple brightness test started to fail. Image 4 is an example of an edge case. I would classify this as a good image since some of the mountain is clearly visible.
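
    One way to make the test robust to the snow case might be to compare contrast rather than absolute brightness - that is, measure how different a strip across the cone is from a strip of sky, in either direction. A rough sketch (it assumes Imager's getpixel/rgba interface and uses made-up coordinates and threshold, so treat it as an illustration only):

        use Imager;

        my $img = Imager->new;
        $img->read(file => 'mountain.jpg') or die $img->errstr;

        my ($cone, $sky) = (0, 0);
        for my $x (400 .. 500) {                               # strip across the mountain cone
            my ($r, $g, $b) = $img->getpixel(x => $x, y => 300)->rgba;
            $cone += ($r + $g + $b) / 3;
        }
        for my $x (400 .. 500) {                               # same-length strip of sky above it
            my ($r, $g, $b) = $img->getpixel(x => $x, y => 100)->rgba;
            $sky += ($r + $g + $b) / 3;
        }

        # Compare the difference, not which side is brighter, so a snow-covered
        # cone that outshines the sky still counts as visible.
        my $contrast = abs($cone - $sky) / 101;
        print $contrast > 15 ? "mountain visible\n" : "obscured\n";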

    Read the article

  • postgres - ERROR: syntax error at or near "COST"

    - by cino21122
    EDIT Taking COST 100 out made the command go through, however, I'm still unable to run my query because it yields this error: ERROR: function group_concat(character) does not exist HINT: No function matches the given name and argument types. You may need to add explicit type casts. The query I'm running is this: select tpid, group_concat(z) as z, group_concat(cast(r as char(2))) as r, group_concat(to_char(datecreated,'DD-Mon-YYYY HH12:MI am')) as datecreated, group_concat(to_char(datemodified,'DD-Mon-YYYY HH12:MI am')) as datemodified from tpids group by tpid order by tpid, zip This function seems to work fine locally, but moving it online yields this error... Is there something I'm missing? CREATE OR REPLACE FUNCTION group_concat(text, text) RETURNS text AS $BODY$ SELECT CASE WHEN $2 IS NULL THEN $1 WHEN $1 IS NULL THEN $2 ELSE $1 operator(pg_catalog.||) ',' operator(pg_catalog.||) $2 END $BODY$ LANGUAGE 'sql' IMMUTABLE COST 100; ALTER FUNCTION group_concat(text, text) OWNER TO j76dd3;
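
    Two hedged observations that fit the symptoms: COST on CREATE FUNCTION only exists from PostgreSQL 8.3 onwards, so an older server online would reject it, and the second error usually means the aggregate wrapper around the transition function was never created on that server - the group_concat(text, text) function by itself is not callable as group_concat(column). A sketch of the missing pieces, with casts for non-text columns:

        -- aggregate wrapper around the group_concat(text, text) transition function
        CREATE AGGREGATE group_concat (text) (
            SFUNC = group_concat,
            STYPE = text
        );

        -- call it with text arguments
        SELECT tpid,
               group_concat(z::text)                  AS z,
               group_concat(cast(r AS char(2))::text) AS r
        FROM   tpids
        GROUP  BY tpid
        ORDER  BY tpid;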

    Read the article

  • Vertex buffer acting strange? [on hold]

    - by Ryan Capote
    I'm having a strange problem, and I don't know what could be causing it. My current code is identical to how I've done this before. I'm trying to render a rectangle using VBO and orthographic projection.   My results:     What I expect: 3x3 rectangle in the top left corner   #include <stdio.h> #include <GL\glew.h> #include <GLFW\glfw3.h> #include "lodepng.h"   static const int FALSE = 0; static const int TRUE = 1;   static const char* VERT_SHADER =     "#version 330\n"       "layout(location=0) in vec4 VertexPosition; "     "layout(location=1) in vec2 UV;"     "uniform mat4 uProjectionMatrix;"     /*"out vec2 TexCoords;"*/       "void main(void) {"     "    gl_Position = uProjectionMatrix*VertexPosition;"     /*"    TexCoords = UV;"*/     "}";   static const char* FRAG_SHADER =     "#version 330\n"       /*"uniform sampler2D uDiffuseTexture;"     "uniform vec4 uColor;"     "in vec2 TexCoords;"*/     "out vec4 FragColor;"       "void main(void) {"    /* "    vec4 texel = texture2D(uDiffuseTexture, TexCoords);"     "    if(texel.a <= 0) {"     "         discard;"     "    }"     "    FragColor = texel;"*/     "    FragColor = vec4(1.f);"     "}";   static int g_running; static GLFWwindow *gl_window; static float gl_projectionMatrix[16];   /*     Structures */ typedef struct _Vertex {     float x, y, z, w;     float u, v; } Vertex;   typedef struct _Position {     float x, y; } Position;   typedef struct _Bitmap {     unsigned char *pixels;     unsigned int width, height; } Bitmap;   typedef struct _Texture {     GLuint id;     unsigned int width, height; } Texture;   typedef struct _VertexBuffer {     GLuint bufferObj, vertexArray; } VertexBuffer;   typedef struct _ShaderProgram {     GLuint vertexShader, fragmentShader, program; } ShaderProgram;   /*   http://en.wikipedia.org/wiki/Orthographic_projection */ void createOrthoProjection(float *projection, float width, float height, float far, float near)  {       const float left = 0;     const float right = width;     const float top = 0;     const float bottom = height;          projection[0] = 2.f / (right - left);     projection[1] = 0.f;     projection[2] = 0.f;     projection[3] = -(right+left) / (right-left);     projection[4] = 0.f;     projection[5] = 2.f / (top - bottom);     projection[6] = 0.f;     projection[7] = -(top + bottom) / (top - bottom);     projection[8] = 0.f;     projection[9] = 0.f;     projection[10] = -2.f / (far-near);     projection[11] = (far+near)/(far-near);     projection[12] = 0.f;     projection[13] = 0.f;     projection[14] = 0.f;     projection[15] = 1.f; }   /*     Textures */ void loadBitmap(const char *filename, Bitmap *bitmap, int *success) {     int error = lodepng_decode32_file(&bitmap->pixels, &bitmap->width, &bitmap->height, filename);       if (error != 0) {         printf("Failed to load bitmap. 
");         printf(lodepng_error_text(error));         success = FALSE;         return;     } }   void destroyBitmap(Bitmap *bitmap) {     free(bitmap->pixels); }   void createTexture(Texture *texture, const Bitmap *bitmap) {     texture->id = 0;     glGenTextures(1, &texture->id);     glBindTexture(GL_TEXTURE_2D, texture);       glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);     glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);     glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);     glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);       glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bitmap->width, bitmap->height, 0,              GL_RGBA, GL_UNSIGNED_BYTE, bitmap->pixels);       glBindTexture(GL_TEXTURE_2D, 0); }   void destroyTexture(Texture *texture) {     glDeleteTextures(1, &texture->id);     texture->id = 0; }   /*     Vertex Buffer */ void createVertexBuffer(VertexBuffer *vertexBuffer, Vertex *vertices) {     glGenBuffers(1, &vertexBuffer->bufferObj);     glGenVertexArrays(1, &vertexBuffer->vertexArray);     glBindVertexArray(vertexBuffer->vertexArray);       glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer->bufferObj);     glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex) * 6, (const GLvoid*)vertices, GL_STATIC_DRAW);       const unsigned int uvOffset = sizeof(float) * 4;       glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), 0);     glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid*)uvOffset);       glEnableVertexAttribArray(0);     glEnableVertexAttribArray(1);       glBindBuffer(GL_ARRAY_BUFFER, 0);     glBindVertexArray(0); }   void destroyVertexBuffer(VertexBuffer *vertexBuffer) {     glDeleteBuffers(1, &vertexBuffer->bufferObj);     glDeleteVertexArrays(1, &vertexBuffer->vertexArray); }   void bindVertexBuffer(VertexBuffer *vertexBuffer) {     glBindVertexArray(vertexBuffer->vertexArray);     glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer->bufferObj); }   void drawVertexBufferMode(GLenum mode) {     glDrawArrays(mode, 0, 6); }   void drawVertexBuffer() {     drawVertexBufferMode(GL_TRIANGLES); }   void unbindVertexBuffer() {     glBindVertexArray(0);     glBindBuffer(GL_ARRAY_BUFFER, 0); }   /*     Shaders */ void compileShader(ShaderProgram *shaderProgram, const char *vertexSrc, const char *fragSrc) {     GLenum err;     shaderProgram->vertexShader = glCreateShader(GL_VERTEX_SHADER);     shaderProgram->fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);       if (shaderProgram->vertexShader == 0) {         printf("Failed to create vertex shader.");         return;     }       if (shaderProgram->fragmentShader == 0) {         printf("Failed to create fragment shader.");         return;     }       glShaderSource(shaderProgram->vertexShader, 1, &vertexSrc, NULL);     glCompileShader(shaderProgram->vertexShader);     glGetShaderiv(shaderProgram->vertexShader, GL_COMPILE_STATUS, &err);       if (err != GL_TRUE) {         printf("Failed to compile vertex shader.");         return;     }       glShaderSource(shaderProgram->fragmentShader, 1, &fragSrc, NULL);     glCompileShader(shaderProgram->fragmentShader);     glGetShaderiv(shaderProgram->fragmentShader, GL_COMPILE_STATUS, &err);       if (err != GL_TRUE) {         printf("Failed to compile fragment shader.");         return;     }       shaderProgram->program = glCreateProgram();     glAttachShader(shaderProgram->program, shaderProgram->vertexShader);     glAttachShader(shaderProgram->program, shaderProgram->fragmentShader);     
glLinkProgram(shaderProgram->program);          glGetProgramiv(shaderProgram->program, GL_LINK_STATUS, &err);       if (err != GL_TRUE) {         printf("Failed to link shader.");         return;     } }   void destroyShader(ShaderProgram *shaderProgram) {     glDetachShader(shaderProgram->program, shaderProgram->vertexShader);     glDetachShader(shaderProgram->program, shaderProgram->fragmentShader);       glDeleteShader(shaderProgram->vertexShader);     glDeleteShader(shaderProgram->fragmentShader);       glDeleteProgram(shaderProgram->program); }   GLuint getUniformLocation(const char *name, ShaderProgram *program) {     GLuint result = 0;     result = glGetUniformLocation(program->program, name);       return result; }   void setUniformMatrix(float *matrix, const char *name, ShaderProgram *program) {     GLuint loc = getUniformLocation(name, program);       if (loc == -1) {         printf("Failed to get uniform location in setUniformMatrix.\n");         return;     }       glUniformMatrix4fv(loc, 1, GL_FALSE, matrix); }   /*     General functions */ static int isRunning() {     return g_running && !glfwWindowShouldClose(gl_window); }   static void initializeGLFW(GLFWwindow **window, int width, int height, int *success) {     if (!glfwInit()) {         printf("Failed it inialize GLFW.");         *success = FALSE;        return;     }          glfwWindowHint(GLFW_RESIZABLE, 0);     *window = glfwCreateWindow(width, height, "Alignments", NULL, NULL);          if (!*window) {         printf("Failed to create window.");         glfwTerminate();         *success = FALSE;         return;     }          glfwMakeContextCurrent(*window);       GLenum glewErr = glewInit();     if (glewErr != GLEW_OK) {         printf("Failed to initialize GLEW.");         printf(glewGetErrorString(glewErr));         *success = FALSE;         return;     }       glClearColor(0.f, 0.f, 0.f, 1.f);     glViewport(0, 0, width, height);     *success = TRUE; }   int main(int argc, char **argv) {          int err = FALSE;     initializeGLFW(&gl_window, 480, 320, &err);     glDisable(GL_DEPTH_TEST);     if (err == FALSE) {         return 1;     }          createOrthoProjection(gl_projectionMatrix, 480.f, 320.f, 0.f, 1.f);          g_running = TRUE;          ShaderProgram shader;     compileShader(&shader, VERT_SHADER, FRAG_SHADER);     glUseProgram(shader.program);     setUniformMatrix(&gl_projectionMatrix, "uProjectionMatrix", &shader);       Vertex rectangle[6];     VertexBuffer vbo;     rectangle[0] = (Vertex){0.f, 0.f, 0.f, 1.f, 0.f, 0.f}; // Top left     rectangle[1] = (Vertex){3.f, 0.f, 0.f, 1.f, 1.f, 0.f}; // Top right     rectangle[2] = (Vertex){0.f, 3.f, 0.f, 1.f, 0.f, 1.f}; // Bottom left     rectangle[3] = (Vertex){3.f, 0.f, 0.f, 1.f, 1.f, 0.f}; // Top left     rectangle[4] = (Vertex){0.f, 3.f, 0.f, 1.f, 0.f, 1.f}; // Bottom left     rectangle[5] = (Vertex){3.f, 3.f, 0.f, 1.f, 1.f, 1.f}; // Bottom right       createVertexBuffer(&vbo, &rectangle);            bindVertexBuffer(&vbo);          while (isRunning()) {         glClear(GL_COLOR_BUFFER_BIT);         glfwPollEvents();                    drawVertexBuffer();                    glfwSwapBuffers(gl_window);     }          unbindVertexBuffer(&vbo);       glUseProgram(0);     destroyShader(&shader);     destroyVertexBuffer(&vbo);     glfwTerminate();     return 0; }
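
    Hard to say for certain without running it, but one thing that stands out: createOrthoProjection() writes a row-major matrix (the translation terms sit at indices 3, 7 and 11), while glUniformMatrix4fv is told the data is column-major (transpose = GL_FALSE), so the shader ends up multiplying by a scrambled projection. A minimal change to try, sketched against the setUniformMatrix shown above:

        /* tell GL the supplied matrix is row-major ... */
        glUniformMatrix4fv(loc, 1, GL_TRUE, matrix);

        /* ... or keep GL_FALSE and store the matrix column-major instead,
           i.e. put the translation in elements 12, 13 and 14 rather than 3, 7 and 11 */

    The pointer-to-array arguments (&gl_projectionMatrix, &rectangle) and binding the Texture struct pointer instead of texture->id are also worth a look, but the transpose flag is the most likely reason the rectangle never lands in the top-left corner.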

    Read the article

  • Laissez les bon temps rouler! (Microsoft BI Conference 2010)

    - by smisner
    Laissez les bons temps rouler" is a Cajun phrase that I heard frequently when I lived in New Orleans in the mid-1990s. It means "Let the good times roll!" and encapsulates a feeling of happy expectation. As I met with many of my peers and new acquaintances at the Microsoft BI Conference last week, this phrase kept running through my mind as people spoke about their plans in their respective businesses, the benefits and opportunities that the recent releases in the BI stack are providing, and their expectations about the future of the BI stack.Notwithstanding some jabs here and there to point out the platform is neither perfect now nor will be anytime soon (along with admissions that the competitors are also not perfect), and notwithstanding several missteps by the event organizers (which I don't care to enumerate), the overarching mood at the conference was positive. It was a refreshing change from the doom and gloom hovering over several conferences that I attended in 2009. Although many people expect economic hardships to continue over the coming year or so, everyone I know in the BI field is busier than ever and expects to stay busy for quite a while.Self-Service BISelf-service was definitely a theme of the BI conference. In the keynote, Ted Kummert opened with a look back to a fairy tale vision of self-service BI that he told in 2008. At that time, the fairy tale future was a time when "every end user was able to use BI technologies within their job in order to move forward more effectively" and transitioned to the present time in which SQL Server 2008 R2, Office 2010, and SharePoint 2010 are available to deliver managed self-service BI.This set of technologies is presumably poised to address the needs of the 80% of users that Kummert said do not use BI today. He proceeded to outline a series of activities that users ought to be able to do themselves--from simple changes to a report like formatting or an addtional data visualization to integration of an additional data source. The keynote then continued with a series of demonstrations of both current and future technology in support of self-service BI. Some highlights that interested me:PowerPivot, of course, is the flagship product for self-service BI in the Microsoft BI stack. In the TechEd keynote, which was open to the BI conference attendees, Amir Netz (twitter) impressed the audience by demonstrating interactivity with a workbook containing 100 million rows. He upped the ante at the BI keynote with his demonstration of a future-state PowerPivot workbook containing over 2 billion records. It's important to note that this volume of data is being processed by a server engine, and not in the PowerPivot client engine. (Yes, I think it's impressive, but none of my clients are typically wrangling with 2 billion records at a time. Maybe they're thinking too small. This ability to work quickly with large data sets has greater implications for BI solutions than for self-service BI, in my opinion.)Amir also demonstrated KPIs for the future PowerPivot, which appeared to be easier to implement than in any other Microsoft product that supports KPIs, apart from simple KPIs in SharePoint. (My initial reaction is that we have one more place to build KPIs. Great. It's confusing enough. I haven't seen how well those KPIs integrate with other BI tools, which will be important for adoption.)One more PowerPivot feature that Amir showed was a graphical display of the lineage for calculations. 
(This is hugely practical, especially if you build up calculations incrementally. You can more easily follow the logic from calculation to calculation. Furthermore, if you need to make a change to one calculation, you can assess the impact on other calculations.)Another product demonstration will be available within the next 30 days--Pivot for Reporting Services. If you haven't seen this technology yet, check it out at www.getpivot.com. (It definitely has a wow factor, but I'm skeptical about its practicality. However, I'm looking forward to trying it out with data that I understand.)Michael Tejedor (twitter) demonstrated a feature that I think is really interesting and not emphasized nearly enough--overshadowed by PowerPivot, no doubt. That feature is the Microsoft Business Intelligence Indexing Connector, which enables search of the content of Excel workbooks and Reporting Services reports. (This capability existed in MOSS 2007, but was more cumbersome to implement. The search results in SharePoint 2010 are not only cooler, but more useful by describing whether the content is found in a table or a chart, for example.)This may yet be the dawning of the age of self-service BI - a phrase I've heard repeated from time to time over the last decade - but I think BI professionals are likely to stay busy for a long while, and need not start looking for a new line of work. Kummert repeatedly referenced strategic BI solutions in contrast to self-service BI to emphasize that self-service BI is not a replacement for the services that BI professionals provide. After all, self-service BI does not appear magically on user desktops (or whatever device they want to use). A supporting infrastructure is necessary, and grows in complexity in proportion to the need to simplify BI for users.It's one thing to hear the party line touted by Microsoft employees at the BI keynote, but it's another to hear from the people who are responsible for implementing and supporting it within an organization. Rob Collie (blog | twitter), Kasper de Jonge (blog | twitter), Vidas Matelis (site | twitter), and I were invited to join Andrew Brust (blog | twitter) as he led a Birds of a Feather session at TechEd entitled "PowerPivot: Is It the BI Deal-Changer for Developers and IT Pros?" I would single out the prevailing concern in this session as the issue of control. On one side of this issue were those who were concerned that they would lose control once PowerPivot is implemented. On the other side were those who believed that data should be freely accessible to users in PowerPivot, and even acknowledgment that users would get the data they want even if it meant they would have to manually enter into a workbook to have it ready for analysis. For another viewpoint on how PowerPivot played out at the conference, see Rob Collie's observations.Collaborative BII have been intrigued by the notion of collaborative BI for a very long time. Before I discovered BI, I was a Lotus Notes developer and later a manager of developers, working in a software company that enabled collaboration in the legal industry. Not only did I help create collaborative systems for our clients, I created a complete project management from the ground up to collaboratively manage our custom development work. In that case, collaboration involved my team, my client contacts, and me. I was also able to produce my own BI from that system as well, but didn't know that's what I was doing at the time. 
Only in recent years has SharePoint begun to catch up with the capabilities that I had with Lotus Notes more than a decade ago. Eventually, I had the opportunity at that job to formally investigate BI as another product offering for our software, and the rest - as they say - is history. I built my first data warehouse with Scott Cameron (who has also ventured into the authoring world by writing Analysis Services 2008 Step by Step and was at the BI Conference last week where I got to reminisce with him for a bit) and that began a career that I never imagined at the time.Fast forward to 2010, and I'm still lauding the virtues of collaborative BI, if only the tools will catch up to my vision! Thus, I was anxious to see what Donald Farmer (blog | twitter) and Rita Sallam of Gartner had to say on the subject in their session "Collaborative Decision Making." As I suspected, the tools aren't quite there yet, but the vendors are moving in the right direction. One thing I liked about this session was a non-Microsoft perspective of the state of the industry with regard to collaborative BI. In addition, this session included a better demonstration of SharePoint collaborative BI capabilities than appeared in the BI keynote. Check out the video in the link to the session to see the demonstration. One of the use cases that was demonstrated was linking from information to a person, because, as Donald put it, "People don't trust data, they trust people."The Microsoft BI Stack in GeneralA question I hear all the time from students when I'm teaching is how to know what tools to use when there is overlap between products in the BI stack. I've never taken the time to codify my thoughts on the subject, but saw that my friend Dan Bulos provided good insight on this topic from a variety of perspectives in his session, "So Many BI Tools, So Little Time." I thought one of his best points was that ideally you should be able to design in your tool of choice, and then deploy to your tool of choice. Unfortunately, the ideal is yet to become real across the platform. The closest we come is with the RDL in Reporting Services which can be produced from two different tools (Report Builder or Business Intelligence Development Studio's Report Designer), manually, or by a third-party or custom application. I have touted the idea for years (and publicly said so about 5 years ago) that eventually more products would be RDL producers or consumers, but we aren't there yet. Maybe in another 5 years.Another interesting session that covered the BI stack against a backdrop of competitive products was delivered by Andrew Brust. Andrew did a marvelous job of consolidating a lot of information in a way that clearly communicated how various vendors' offerings compared to the Microsoft BI stack. He also made a particularly compelling argument about how the existence of an ecosystem around the Microsoft BI stack provided innovation and opportunities lacking for other vendors. Check out his presentation, "How Does the Microsoft BI Stack...Stack Up?"Expo HallI had planned to spend more time in the Expo Hall to see who was doing new things with the BI stack, but didn't manage to get very far. Each time I set out on an exploratory mission, I got caught up in some fascinating conversations with one or more of my peers. I find interacting with people that I meet at conferences just as important as attending sessions to learn something new. There were a couple of items that really caught me eye, however, that I'll share here.Pragmatic Works. 
Whether you develop SSIS packages, build SSAS cubes, or author SSRS reports (or all of the above), you really must take a look at BI Documenter. Brian Knight (twitter) walked me through the key features, and I must say I was impressed. Once you've seen what this product can do, you won't want to document your BI projects any other way. You can download a free single-user database edition, or choose from more feature-rich standard or professional editions.Microsoft Press ebooks. I also stopped by the O'Reilly Media booth to meet some folks that one of my acquisitions editors at Microsoft Press recommended. In case you haven't heard, Microsoft Press has partnered with O'Reilly Media for distribution and publishing. Apart from my interest in learning more about O'Reilly Media as an author, an advertisement in their booth caught me eye which I think is a really great move. When you buy Microsoft Press ebooks through the O'Reilly web site, you can receive it in any (or all) of the following formats where possible: PDF, epub, .mobi for Kindle and .apk for Android. You also have lifetime DRM-free access to the ebooks. As someone who is an avid collector of books, I fnd myself running out of room for storage. In addition, I travel a lot, and it's hard to lug my reference library with me. Today's e-reader options make the move to digital books a more viable way to grow my library. Having a variety of formats means I am not limited to a single device, and lifetime access means I don't have to worry about keeping track of where I've stored my files. Because the e-books are DRM-free, I can copy and paste when I'm compiling notes, and I can print pages when necessary. That's a winning combination in my mind!Overall, I was pleased with the BI conference. There were many more sessions that I couldn't attend, either because the room was full when I got there or there were multiple sessions running concurrently that I wanted to see. Fortunately, many of the sessions are accessible for viewing online at http://www.msteched.com/2010/NorthAmerica along with the TechEd sessions. You can spot the BI sessions by the yellow skyline on the title slide of the presentation as shown below. Share this post: email it! | bookmark it! | digg it! | reddit! | kick it! | live it!

    Read the article

  • Rx: Piecing together multiple IObservable web requests

    - by McLovin
    Hello, I'm creating multiple asynchronous web requests using IObservables and the Reactive Extensions. This creates an observable for a "GET" web request: var tweetObservable = from request in WebRequestExtensions.CreateWebRequest(outUrl + querystring, method) from response in request.GetResponseAsync() let responseStream = response.GetResponseStream() let reader = new StreamReader(responseStream) select reader.ReadToEnd(); And I can do tweetObservable.Subscribe(response => dosomethingwithresponse(response)); What is the correct way to execute multiple asynchronous web requests with IObservables and LINQ when each request has to wait for the previous one to finish? For example, first I would like to verify user info (create userInfoObservable); then, if the user info is correct, I want to update the status (updateStatusObservable); then, if the status was updated, I would like to create friendshipObservable, and so on. As a bonus question, there is also a case where I would like to execute several web calls simultaneously and run another observable only once all of those calls have finished. Thank you.
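
    A sketch of the usual pattern (names like VerifyUser, UpdateStatus and GetTimeline are stand-ins for observables built the same way as the request above): dependent calls chain with SelectMany, which the extra from clauses compile down to, so each request only starts once the previous one has produced a value; independent calls can be paired with Zip (or ForkJoin, depending on the Rx build) so the continuation fires when all of them are done.

        var workflow =
            from userInfo   in VerifyUser(credentials)            // 1st request
            where userInfo.IsValid                                 // stop quietly if not valid
            from status     in UpdateStatus(userInfo, newStatus)   // starts after the 1st completes
            from friendship in CreateFriendship(userInfo)          // starts after the 2nd completes
            select friendship;

        workflow.Subscribe(
            f  => Console.WriteLine("all steps finished"),
            ex => Console.WriteLine("a step failed: " + ex.Message));

        // Independent calls side by side; the selector runs once both have produced a result.
        var combined = GetTimeline().Zip(GetMentions(),
                                         (timeline, mentions) => new { timeline, mentions });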

    Read the article

  • How powerful is the <script> tag in ASP.NET ?

    - by MarceloRamires
    I'm new to web development with .NET, and I'm currently studying a page that has both a separate code-behind (in my case, a .cs file associated with the .aspx file) and code placed inside the .aspx file within tags like this: <script runat="server"> //code </script> Q1: Besides organizational matters such as readability, what is the main difference? What can be done one way that cannot be done the other, and what is each approach best suited for? Q2: If I'm going to develop a simple page with a database connection, library imports, access to user controls (.ascx), and images in other folders, which method should I choose?

    Read the article

  • Purpose of dereferencing a pointer as a parameter in C.

    - by Leif Andersen
    I recently came across this line of code: CustomData_em_free_block(&em->vdata, &eve->data); And I thought, isn't a->b just syntactic sugar for (*a).b? With that in mind, this line could be rewritten as: CustomData_em_free_block(&(*em).vdata, &(*eve).data); If that's the case, what is the point of passing &(*a) as a parameter rather than just a? It seems like the pointer equivalent of -(-a) is being passed in; is there any logic to this? Thank you.
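
    A tiny check of the equivalence in question (the struct is a stand-in, not Blender's real type): -> binds before the address-of, so &em->vdata is the address of the vdata member, written either way, and generally not the same thing as em itself.

        #include <assert.h>

        struct EditMesh { int pad; int vdata; };

        int main(void)
        {
            struct EditMesh m = {0, 0};
            struct EditMesh *em = &m;

            /* identical expressions: address of the member, not of *em */
            assert(&em->vdata == &(*em).vdata);

            /* and not the same as plain em here, since vdata is not the first member */
            assert((void *)&em->vdata != (void *)em);
            return 0;
        }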

    Read the article

  • How to find a word within text using XSLT 2.0 and REGEX (which doesn't have \b word boundary)?

    - by Mads Hansen
    I am attempting to scan a string of words and look for the presence of a particular word (case-insensitive) in an XSLT 2.0 stylesheet using regex. I have a list of words that I wish to iterate over and determine whether or not each exists within a given string. I want to match a word anywhere within the given text, but I do not want to match inside another word (i.e. a search for foo should not match "food", and a search for bar should not match "rebar"). XSLT 2.0 regex does not have a word boundary (\b), so I need to replicate it as best I can.
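
    One sketch that usually stands in for \b in XPath 2.0 regexes ($word and $text are assumed variables, and $word is assumed to be already stripped of regex metacharacters): anchor the word between \W (non-word character) or the start/end of the string, and let the 'i' flag handle case.

        <xsl:variable name="pattern" select="concat('(^|\W)', $word, '(\W|$)')"/>
        <xsl:if test="matches($text, $pattern, 'i')">
          <!-- "foo" matches "eat Foo now" but not "food"; "bar" does not match "rebar" -->
          <found word="{$word}"/>
        </xsl:if>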

    Read the article

  • jQuery UI Datepicker on a qTip

    - by Justin Ethier
    I am trying to display a qTip containing a jQuery UI datepicker control (the version bundled with jQuery UI). However the datepicker's calendar opens behind the qTip. I tried manually setting the calendar's z-order from firebug, which does allow the calendar to open in front of the qTip. However, in this case clicking on the calendar has the effect of closing the qTip as (I assume) it is part of the page's content. I am still working through this but wanted to ask - has anyone run into this problem before? Any possible workarounds to get the datepicker to work?
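
    Two small things that tend to help in this situation (both are guesses at the cause rather than a known qTip fix): lift the calendar above the tooltip, and keep clicks inside the calendar from bubbling out to whatever handler is closing the qTip. The jQuery UI calendar lives in the #ui-datepicker-div element once the widget has been initialised.

        $(function () {
            // run after the datepicker has been created; pick any value higher
            // than the tooltip's own z-index
            $('#ui-datepicker-div').css('z-index', 9999);

            // stop clicks on the calendar from reaching the handler that hides the qTip
            $('#ui-datepicker-div').live('mousedown click', function (e) {
                e.stopPropagation();
            });
        });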

    Read the article

  • Need to call COM component using reflection in .NET

    - by Usman
    I need to determine the type of a COM component (unmanaged code) and invoke its exposed interface methods using reflection in C# at runtime. First, which member of Type indicates that a type is a COM component, and can we obtain the CLSID at runtime? Is it Type.IsCOMObject? I need to call the exposed interface methods the same way they are called from unmanaged code via CoCreateInstance with a CLSID and REFIID. I am using InvokeMember, but it returns null or 0 for the out parameter. How do I pass an out parameter in this case, and do I even need to? All of my unmanaged COM methods take their last parameter as an OUT parameter and write the result into it after executing. Note that I've converted all my unmanaged COM code to .NET managed assemblies using tlbimp.exe.
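
    A sketch of the late-bound route (the ProgID, method name and argument layout are invented stand-ins): Type.IsCOMObject answers the first question, and when calling through the raw COM object a ParameterModifier tells InvokeMember to marshal the last argument by reference so the [out] value comes back in the args array.

        using System;
        using System.Reflection;

        class ComCaller
        {
            static void Main()
            {
                // Late-bound creation; "MyServer.MyComponent" and "DoWork" are placeholders.
                Type comType = Type.GetTypeFromProgID("MyServer.MyComponent");   // or Type.GetTypeFromCLSID(clsid)
                object com   = Activator.CreateInstance(comType);

                Console.WriteLine(comType.IsCOMObject);        // true for a COM-backed type

                // Last argument is [out]: mark it by-reference so the value is copied back.
                object[] args = { "some input", 0 };
                ParameterModifier mods = new ParameterModifier(args.Length);
                mods[1] = true;

                comType.InvokeMember("DoWork", BindingFlags.InvokeMethod,
                                     null, com, args,
                                     new[] { mods }, null, null);

                int result = (int)args[1];                     // written back by the COM call
                Console.WriteLine(result);
            }
        }

    With a tlbimp-generated interop assembly the modifiers array can usually be omitted; InvokeMember still writes ref/out values back into the args array, so the result is read from args[1] after the call either way.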

    Read the article

  • How to generate entity classes from nhibernate mapping files during runtime.

    - by Denis Rosca
    Hello, I need some help with C# and NHibernate. I'm working on a project that requires the entity classes to be generated from .hbm mapping files at runtime. I get the mapping files from a service, and then need to generate the classes dynamically and configure NHibernate to use them. The problem is that I'm new to NHibernate and not much of a C# pro, so any code I write to achieve this is likely to be error-prone. I was wondering if you know of any open source software I could use. Worst case (if nothing even remotely resembling what I need exists), do you have any advice on where I should start? Maybe some links? Thanks, Denis.

    Read the article

  • php curly braces groups

    - by David
    Is there a function, regex, or anything else that can group items by { and } delimiters? It should group items by the opening brace { and the matching closing brace }. Be careful: there can also be groups nested inside parent groups, like so: group { text1 group2 { text2 } } Basically, think of it like PHP itself: every opening brace has to be closed by a matching closing brace. I just need to pull each group out (substr()-style) into an associative array somehow, but I can't figure out how to handle the "group inside a parent group" nesting.
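
    A sketch of one way to do it with PCRE's recursion support ((?2) re-enters the brace sub-pattern, so nesting balances itself); the function name is made up, and collecting the free text inside a group is left out for brevity:

        <?php
        function parse_groups($text) {
            // group 1 = name, group 2 = balanced {...} block, group 3 = its contents
            preg_match_all('/(\w+)\s*(\{((?:[^{}]|(?2))*)\})/s', $text, $matches, PREG_SET_ORDER);

            $out = array();
            foreach ($matches as $m) {
                $out[$m[1]] = parse_groups($m[3]);   // recurse into nested groups
            }
            return $out;
        }

        print_r(parse_groups('group { text1 group2 { text2 } }'));
        // prints, roughly: Array ( [group] => Array ( [group2] => Array ( ) ) )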

    Read the article
