Within the class of open methods, the error bounds are optimal in the order of magnitude of the number of sample nodes. Furthermore, we obtain conditions on the coordinate weights under which the error bounds are independent of the dimension s. In the language of Information-Based Complexity, this means that the corresponding QMC rule achieves a strong polynomial tractability error bound. Our findings on the RMS worst-case error of randomized Halton sequences carry over to the RMS L2-discrepancy.
Except for the random p-adic shift, our results are fully constructive, and no search algorithms (such as the component-by-component algorithm) are required.
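To illustrate the constructive nature of the point set, the following is a minimal sketch of a shifted Halton sequence. It assumes the standard radical-inverse construction with one prime base per coordinate and models the random shift, truncated to a fixed number of base-b digits, as integer addition modulo b^digits; the function names, the `digits` truncation parameter, and the use of Python's `random` module are illustrative choices, not part of the source.

```python
import random

def radical_inverse_int(n, b, digits):
    """Integer X such that X / b**digits equals the radical inverse of n
    in base b, truncated to `digits` digits after the radix point."""
    x = 0
    for _ in range(digits):
        n, d = divmod(n, b)
        x = x * b + d  # append the next base-b digit of n
    return x

def shifted_halton(num_points, primes, digits=20, seed=0):
    """Halton points in [0,1)^s with an independent random shift per
    coordinate.  Truncated to `digits` base-b digits, the shift reduces
    to integer addition modulo b**digits (carries included)."""
    rng = random.Random(seed)
    moduli = [b ** digits for b in primes]
    shifts = [rng.randrange(m) for m in moduli]
    points = []
    for n in range(num_points):
        pt = []
        for b, m, s in zip(primes, moduli, shifts):
            x = radical_inverse_int(n, b, digits)
            pt.append(((x + s) % m) / m)  # shifted coordinate in [0,1)
        points.append(pt)
    return points
```

Note that no search over generating parameters is involved: given the prime bases and the random shift, every point is computed directly, which is the sense in which the construction is explicit.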